Think of sight kind of like hearing and sonar, but confined to a cone of detection that is almost 180 degrees wide, and maybe about 45 degrees up and 45 degrees down. This field of vision is incredibly detailed at the center of focus, which is a spot around 6 degrees wide. Detail drops off rather quickly the further you get from your center of vision. The visual information isn’t missing, it’s just not coming in at the same fidelity, like background conversation when you’re listening to one person. You can get more information by shifting your gaze/focus as needed.
Imagine reading braille, but the pad of your finger is just a patch of skin in the middle of a large field of skin, like on your back. In order to read, you’d have to stick the entire page to your back, and then move that center of focus all over the page. You’d feel the braille all over your back and recognize it as such, but you’d only be reading that one spot. That’s roughly what shifting your gaze is like.
Light is like loudness, in that too much of it hurts, and wipes out detail. When there is very little light, detail gets wiped out as well.
Think of the visible light spectrum like the notes of the musical scale, where the note F always sounds like an F, regardless of the octave. The colors do seem related to each other in much the same way as musical notes; orange does look like it fits between red and yellow, the way that C sharp/D flat fits between C and D.
The “like background conversation when you’re listening to one person” part is what sells it to me!
The analogy works to a certain extent, with one lopsided difference between active listening and active viewing. With hearing, you could theoretically pay attention to one voice emanating from any direction, without repositioning yourself. You probably would turn towards the voice to optimize clarity, but it’s not a requirement.
With active viewing, you have to point your eyes directly at the item of interest. That six-degree area of visual focus corresponds to visual receptor cells densely packed in one spot on the retina called the macula. The density of cone receptors falls off the further away you get from the macula.
Think of following that one conversation in a crowd, but with a directional microphone. That would give some sense of the manual activity that goes along with vision, to maintain reliable and current information about the visual environment.
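If it helps to put rough numbers on that falloff, here’s a toy sketch in Python. The inverse-linear scaling and the roughly-2-degree half-resolution constant are only commonly quoted ballparks from the vision literature, not anything exact or from the comments above:

    # Toy model: relative sharpness ~ E2 / (E2 + eccentricity),
    # where E2 is the eccentricity (in degrees) at which sharpness
    # has dropped to half its peak. E2 = 2.0 is an assumed ballpark.
    E2 = 2.0
    for ecc in (0, 2, 6, 10, 20, 45, 90):
        relative_acuity = E2 / (E2 + ecc)
        print(f"{ecc:>2} deg from center of gaze: ~{relative_acuity:.0%} of peak sharpness")

Even a few degrees off-center, sharpness is already a fraction of what it is at the point you’re looking at, which is why that “directional microphone” has to keep moving.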
Yeah, you are absolutely right. Man, you have a way with words! :) I am able to hear and I still can feel your description!
Seeing is to the eyes what hearing is to the ears. Just as you can hear sounds, tones, and voices that tell you about the world around you, seeing allows people to perceive light, shapes, colors, and movements. Imagine being able to ‘feel’ everything around you without touching it, from a distance. It’s like sensing the presence, shape, and texture of objects, but from afar and all at once. Colors, which are a significant aspect of vision, can be likened to different tones or pitches in sounds. Just as a high note feels different from a low note, different colors have their own ‘feel’ visually. Overall, seeing is a way of sensing and understanding the environment from a distance, much like how you can hear someone talking from the other side of a room.
There’s only one octave with the colors, so it kind of seems more like flavors. It’s less like points along a line and more like peanut butter vs. jelly vs. broccoli.
Why would you say there’s only one octave?
Human audible frequencies are in the range of 20 Hz to 20 kHz, and are logarithmic.
Human visible frequencies are in the range of 400 THz to 800 THz, and are linear.
There’s far more available distinction to be made with color than with sound; it just doesn’t interfere the same way.
An octave is a doubling of frequency. 400 to 800 THz is one octave. Color has one octave.
The way you know that is that you don’t experience redness when absorbing ultraviolet light, and you don’t experience blueness when absorbing infrared light.
It doesn’t “loop around” like the A note does at 440 Hz, 880 Hz, etc.
An octave is when doubling a frequency produces a new waveform that stimulates the same set of neurons as the original frequency.
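To put rough numbers on that, here’s a quick back-of-the-envelope sketch in Python, using the 20 Hz to 20 kHz and 400 to 800 THz ranges quoted above:

    import math

    # Number of octaves = log2(highest frequency / lowest frequency)
    audible_octaves = math.log2(20_000 / 20)      # hearing: ~10 octaves
    visible_octaves = math.log2(800e12 / 400e12)  # visible light: exactly 1 octave

    print(f"hearing spans about {audible_octaves:.2f} octaves")
    print(f"vision spans {visible_octaves:.0f} octave")

So hearing covers roughly ten doublings of frequency, while the entire visible band fits inside a single one.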
In the same way that sandwich would taste like garbage, mixing all the colors (as pigment, at least) results in “brown”, which visually is a sort of boring, non-differentiated color. Kind of like the way sounds are drowned out when you’re near lots of loud heavy equipment, brown tends to reduce the features of things, making them less distinguishable and less likely to capture attention.
If there’s nothing between you and an object you can feel it at a distance. Texture is a little dulled, and some textures are easier to feel than others, but there’s also a whole second kind of texture that we call color. As light gets dimmer it gets harder to feel the difference between those textures, and it gets harder to feel the distance to things, until there is nothing left but a single all encompassing flat texture at a single unknowable distance which we call dark.
Also, some objects only partially block your ability to feel what’s behind them, and things like windows are designed to be so easy to feel through that it’s hard to feel them at all. Unless they get dirty. Then you can feel the dirt on them.
I find this somewhat sad but also quite beautiful. Those with sight often feel bad for the blind, as they miss out on much of the world we see, but simultaneously it appears as though the blind experience a world of its own beauty that those with vision could never feel or imagine. I don’t often pay mind to textures or feel objects that are out of reach. If you and I are standing in front of a waterfall, I suppose everything is still there for you except for how it looks - so who am I to determine that what you’re seeing in your mind’s eye is any less spectacular? I can say with certainty that when I’m standing in the middle of a deep forest, the way it looks is an afterthought when compared to what it makes me feel. Maybe both worlds are equally beautiful.
I assume that, over time, you’ve memorized where everything in your living space is. You have some idea of what shape the space around you takes.
Seeing is knowing what shape a space takes without trial and error. The depth of a room or the texture of a couch. Knowing where an item is without having to touch it, or be told where it is.
How it feels… it feels safe. Seeing makes me feel safer. That’s the only word that comes to mind.
Like hearing but with color.
No, seriously, it’s impossible to accurately convey. You can talk about the mechanics, the use cases, what you can do with it, but you cannot convey “how it is”, same as a bat cannot convey “how sonar is”.
That’s exactly the point. Nothing you say will have a meaning over what that person has experienced. You can’t really convey what seeing means, and the “but with color” was meant to show exactly that.
You can explain the technicalities, but that’s trivial enough that there’s no point in explaining it.
Imagine explaining what it feels like to e.g. drive a really fast car to someone who has never been in a car. Yeah, you can say “it’s really fast” or “it’s exciting/fear-inducing” or “acceleration pushes you into the seat”, but nothing you say can actually convey the feeling.
Same with seeing. You can say, “with my eyes I can differentiate objects over long distances”, but I am pretty sure every blind person already knows that. You can say, “Different things have different colors”; they know that too, but it has no meaning to them.
But then try to explain a beautiful sunrise or a moving painting or something else that evokes emotions, and it falls apart, because you cannot convey that.
It really is, which is why I find videos like “Kids explain color to blind people” so dumb. (That’s an actual video that exists, btw; it’s the dumbest thing I’ve ever seen come out of a clickbaity attempt to be smart, and one kid is even like “so you know what an apple looks like, right?” or something.)