
Archive for the ‘brain’ Category

About this video: This is only a “remix” version I found online of the original Ryan animation.



According to Allan Snyder, director of the Centre for the Mind at the University of Sydney:

“All people have [these] latent super abilities, but only some are able to express them through ‘malfunctions’ of overriding brain functions.”

How many “genius ideas” are brought to light by what would normally be considered abnormal? And, like Ryan (in the YouTube video), what effects might drugs have on producing “genius ideas”?

Read Full Post »

Gender Studies: 1 of 2

Anthropologist Helen Fisher’s talk at TED on gender was recently reposted, and Shruts asked me what I thought. Are gender differences as significant as she says? Are women inherently better communicators? Are men naturally better at focusing on one concrete thing as opposed to integrating a web of ideas? Would we be better suited for writing code and hunting elk while women write books and engage in verbal diplomacy?

Read Full Post »

You may have thought Madison Ave. was already making a killing by playing off depression. Products from Viagra to Porsches to Banana Republic seem to position themselves as filling the gaping hole in our lives.

But a recent study in the Journal of Consumer Research finds that not only does low self-esteem increase materialism, but materialism helps create low self-esteem. Researchers found that discouraged children and adolescents grew increasingly materialistic, and that children given even the most modest tokens of encouragement became less materialistic over time.

The researchers have a legitimate beef with marketers who exploit an unfortunate emotional circumstance to hawk expensive clothes, luxury goods, and a fancy new phone I’m resisting. But it would be wrong to dismiss low self-esteem outright. Depression, after all, has survived thousands of years of evolution, so L. Ron Hubbard must have given it to us for a reason.


Read Full Post »

While writing this post, it just so happened to be 11:11, and I made my wish. I visualized it – saw it at the forefront of my mind – and hoped and prayed it would be granted. What if someone with the right technology were able to predict exactly what you were thinking of? Exactly what you want and see in your mind?

Now, now… hold your horses… that’s worlds away… but not universes.

What researchers HAVE accomplished thus far is an MRI scanner that can use neuronal activity in the visual cortex of the brain to predict what is being seen! Wired reports:

“One day it may even be possible to reconstruct the visual content of dreams,” Gallant said. After that, the decoding model could be harnessed for more visionary purposes: early warning systems for neurological diseases or interfaces that allow paralyzed people to engage with the world.
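For the curious, the identification step behind this kind of decoding can be caricatured in a few lines: given an observed brain response, pick the candidate picture whose predicted response matches best. Everything below – the voxel patterns, the picture names – is invented for illustration, not taken from the actual study.

```python
# Toy version of the "which picture is being seen?" identification step:
# compare the observed voxel pattern against a predicted pattern for each
# candidate picture and pick the best match. All numbers are fabricated.

def correlation(a, b):
    """Pearson correlation between two equal-length response vectors."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    norm_a = sum((x - ma) ** 2 for x in a) ** 0.5
    norm_b = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (norm_a * norm_b)

PREDICTED = {  # imaginary predicted voxel responses per candidate picture
    "beach": [1.0, 0.2, 0.4, 0.9],
    "city":  [0.1, 0.9, 0.8, 0.2],
    "face":  [0.5, 0.5, 0.1, 0.7],
}

def identify(observed):
    """Return the picture whose predicted pattern best matches the observation."""
    return max(PREDICTED, key=lambda pic: correlation(observed, PREDICTED[pic]))

print(identify([0.9, 0.3, 0.5, 1.0]))  # prints: beach
```

The real model works in the other direction too – predicting responses to brand-new images – but matching against a candidate set is the core trick.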

Read Full Post »

Brain researchers at the University of Leicester (pronounced “Lester” ’cause they’re British) have uncovered a way to identify almost exactly what a person is looking at or thinking about based on how their neurons fire.

Dr. Quian Quiroga used photos to prompt brain activity in test subjects implanted with intracranial electrodes. “In these experiments we presented a large database of pictures, and discovered that we can predict what picture the subject is seeing far above chance. For example, if the ‘Jennifer Aniston neuron’ increases its firing then we can predict that the subject is seeing Jennifer Aniston.”

“So, in simple words, we can read the human thought from the neuronal activity.”
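A back-of-the-napkin version of that prediction logic might look like this – the “Jennifer Aniston neuron” firing rates and thresholds below are entirely made up:

```python
# Toy decoder in the spirit of the experiment: each "neuron" is selective
# for one concept, and we predict the picture whose selective neuron fires
# most above its baseline rate. All rates are invented for illustration.

BASELINE_HZ = {"aniston": 2.0, "eiffel_tower": 1.5, "halle_berry": 2.5}

def decode(firing_hz):
    """Return the concept whose neuron shows the largest increase over baseline."""
    excess = {concept: firing_hz[concept] - BASELINE_HZ[concept]
              for concept in BASELINE_HZ}
    best = max(excess, key=excess.get)
    # Only claim a prediction if the response clearly exceeds baseline.
    return best if excess[best] > 1.0 else "no confident prediction"

print(decode({"aniston": 9.0, "eiffel_tower": 1.4, "halle_berry": 2.6}))
# prints: aniston
```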

Dr. Quiroga and his team of altruistic researchers envision a world where this discovery could be used to help paralyzed patients – for instance, thinking about a cup of tea could prompt a bionic limb to reach for it. Literal Dr. X-style brain power.

Alternately, it could help marketers know every time you are thinking about Jennifer Aniston.

Read Full Post »

Things have been busy for the last couple of weeks, but I had to come out of my self-imposed silence for this one.

According to the BBC, neuroscientists have developed software which, when combined with neural electrodes, can literally let your brain do the talking.

After a car crash left Eric Ramsey in a conscious but paralyzed state eight years ago, scientists began studying his brain. They looked at the areas involved in speech and attempted to translate the impulses into actual words. They think they’ve found the formula, and over the next few weeks they’re going to try to coax his brain into having conversations – which would be a huge leap forward in our understanding of how the brain functions.
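To make the “impulses into words” idea concrete, here’s a toy sketch: classify each neural feature vector to its nearest phoneme prototype, then string the phonemes together. The prototypes and features below are invented; the real pipeline is vastly more sophisticated.

```python
# Toy "impulses to words" step: nearest-prototype phoneme classification.
# The 2-D feature vectors and prototypes are made up for illustration.

PROTOTYPES = {  # imaginary feature-space prototype per phoneme
    "AH": (0.9, 0.1),
    "OO": (0.1, 0.9),
    "EE": (0.5, 0.5),
}

def nearest_phoneme(feature):
    """Return the phoneme whose prototype is closest (squared distance)."""
    return min(PROTOTYPES,
               key=lambda p: sum((a - b) ** 2
                                 for a, b in zip(feature, PROTOTYPES[p])))

def decode_utterance(features):
    """Map a sequence of neural feature vectors to a phoneme sequence."""
    return [nearest_phoneme(f) for f in features]

print(decode_utterance([(0.85, 0.2), (0.2, 0.8)]))  # prints: ['AH', 'OO']
```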

This is still a ways from literally reading minds. It’s similar to Christopher Reeve’s using his muscles to create speech – only in this case, the brain is the muscle. That means Eric will be in control of which words he shares with the community and which he keeps to himself (though he’d still better not let his internal monologue get too vivid).

It is also unidirectional. That is, the words can come out, but he still has to use his ears and eyes to understand other people’s words. Keep checking back though – eventually they have to come up with the technology to replace this human shell entirely.

Read Full Post »


Are you ready for this? CNET recently previewed an MIT report claiming to have found the most universal algorithm to date for translating brain activity into nonhuman physical action – i.e., Brain Powered Super Robots!! While technology already exists that reads brain patterns via EEG or optical imaging, each technology has its own language for interpreting brain activity into action. MIT’s recent calculations take something like a Linux approach, uniting all of these technologies under one mathematical language.

If we are reading this right, this could mean a path towards standardization of a rudimentary brain/computer language. Protect your children!
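If I’m reading the “Linux approach” analogy right, it amounts to an adapter layer: each recording technology translates its own dialect into one shared command vocabulary, so downstream devices only ever speak that one language. A hypothetical sketch (the device names, thresholds, and commands are all invented):

```python
# Hypothetical adapter-layer sketch of "one shared brain/computer language":
# each signal source maps its raw readings into a common command vocabulary.
from abc import ABC, abstractmethod

class BrainSignalAdapter(ABC):
    @abstractmethod
    def to_command(self, raw_sample) -> str:
        """Translate one device-specific sample into a shared command."""

class EEGAdapter(BrainSignalAdapter):
    def to_command(self, raw_sample):
        # Pretend rule: strong alpha-band power means "rest", otherwise "move".
        return "rest" if raw_sample["alpha_power"] > 0.5 else "move"

class OpticalAdapter(BrainSignalAdapter):
    def to_command(self, raw_sample):
        # Pretend rule: a rise in blood oxygenation means "move".
        return "move" if raw_sample["oxygenation_delta"] > 0.1 else "rest"

def drive_robot(adapter, samples):
    """The robot only understands the shared vocabulary, never raw signals."""
    return [adapter.to_command(s) for s in samples]

print(drive_robot(EEGAdapter(), [{"alpha_power": 0.8}, {"alpha_power": 0.2}]))
# prints: ['rest', 'move']
```

Swap in a new recording technology and only the adapter changes – the robot-side code never does.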

Read Full Post »

In the spirit of “Eternal Sunshine,” Christopher deCharms runs Omneuron, a California startup that uses brain scanning to help patients deal with chronic pain, as well as psychological conditions like addiction and depression. Much like the fictional Lacuna, Inc., it uses brain imaging to identify where activity occurs while pain is being experienced; the patient is then guided to channel his or her mental energy to dissipate the activity in that region. It’s no surprise that other startups are entering the scene using brain imaging for other purposes – for example, to identify when someone may be lying, or how a consumer responds to a marketing message. What excites the owners of these companies (besides the profit potential) is how the technology will empower people to have a better relationship with their own brains and thoughts. But if we learn to truly lasso the lightning, will we actually want the control?

Read Full Post »

As all five of you may have noticed, I’ve been blogging for a little while on the concept of “convergence” – the growing integration of technology and human systems. (I don’t know whether this is the correct term for the phenomenon, but I’m presumptively stealing it as a layman. Ray Kurzweil probably has a better one.) I started the thread with just a few posts on things like artificial intelligence and computers merging with the body and mind, but I’ve decided to expand to talk a little about technological interfaces in the community. So please bear with me as I begin to write more regularly about these trends rather than writing about… well, nothing, I guess.
So I was procrastinating on the CNET gadget blog and read a post by Don Reisinger hating on the mouse. He started out just complaining about his Apple Mighty Mouse (I can’t tell what the problem is with the Mighty Mouse – the sensor right-click and 2-D scrolling trackball were pretty nice advances for their time, if you ask me). But then he challenged the mouse-computer interface altogether. His only suggested future solution to the mouse problem was the touch screen, which Apple is already hinting at in some of its recent patent submissions. But it got me thinking: what human-computer interfaces can we expect in the future?

Touch screens have been trashed by some of the lay-ergonomically-concerned, who feel that tactile feedback (the springing back of a mouse or keyboard button) is crucial to ease of use. However, having the screen, mouse, and keyboard all in one eliminates the whole challenge of hand-eye coordination. (All those years of typing class, gone to waste.)

Then we’ve got a comment from logan1337, who predicts a future of eye-tracking cameras – cameras that sense where the pupil is looking and direct actions based on its movement. The camera-sensor idea isn’t new; that’s how Microsoft’s multitouch coffee-table screen works. And eye tracking may be adopted by ATM companies seeking an end to shoulder surfing (sorry, drunken frat boys; you’ll have to concentrate to enter that PIN properrrrrrly). But questions arise: how do you “click” with your eye? Stanford researchers suggest either “dwelling” on the chosen key or clicking a space bar with your hand, both of which seem more complicated than the single-interface, immediate click of the traditional hand methods. The idea is to make things easier.
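The “dwelling” idea is simple enough to sketch: fire a click whenever the gaze holds still near one point for long enough. The thresholds below are invented for illustration, not taken from the Stanford work.

```python
# Toy dwell-click detector: a "click" fires if the gaze stays within a
# small radius of one point for long enough. Thresholds are invented.
DWELL_SECONDS = 0.8
RADIUS_PX = 30

def detect_click(gaze_samples):
    """gaze_samples: list of (t_seconds, x, y) tuples, in time order.
    Return the (x, y) where a dwell click occurred, or None."""
    start = 0  # index where the current dwell began
    for i in range(1, len(gaze_samples)):
        t0, x0, y0 = gaze_samples[start]
        t, x, y = gaze_samples[i]
        if (x - x0) ** 2 + (y - y0) ** 2 > RADIUS_PX ** 2:
            start = i                 # gaze moved away; restart the dwell timer
        elif t - t0 >= DWELL_SECONDS:
            return (x0, y0)           # held still long enough: click here
    return None
```

The tuning problem is obvious even in the toy: a short dwell time clicks everything you glance at, a long one makes typing glacial – which is exactly why it feels harder than a physical button.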

Okay, so let’s move a little further: we’ve got the Philip K. Dick (by way of Tom Cruise) vision of a multitouch glove with a massive transparent LED display, suggested by Felipe M. G. Ceotto by way of Minority Report. This is good – certainly visually stunning if you’ve seen the movie, the way Tom Cruise picks up full windows and tosses them back and forth. It also uses four whole buttons, two on each glove; if you’ve got all your fingers, you could feasibly have ten different functions or alts literally in your hands – forget about all those Apple hotkeys! This vision may be close to perfection. But what’s missing? You still can’t bring the window to you. Look at Tom Cruise: he’s standing. Can you imagine standing all day to do an Excel-based budget? Sure, you could make a pretty big spreadsheet, but you’d be on your feet all day! That may be okay for your average Heaven’s Gate Member Scientologist, but us lesser beings are a little more sensitive.

Enter another concept: the holographic touch screen. This technology already exists, even though it probably has more bugs than the first iPhone betas. But one could certainly imagine a future in which you set up your holographic projector in one corner of the room and “grab” and “drag” the ethereal screen anywhere. Tired of sitting at your desk? Take it to the couch. Don’t like the right-in-front-of-your-eyes interaction? Keystone the screen so it appears above your lap. Screen too small? Stretch it until it’s as big as that Tom Cruise one. Okay, so this still doesn’t solve the problem of tactile feedback. But it may be cool enough that our bodies will be willing to sacrifice the interactivity. And you’re not limited to your fingers, either.

The ultimate? If you’ve read any of my previous posts on convergence, you can guess that I would predict a day when the computer and the mind are seamless. The neuron and the computer chip will be able to share data directly. All your work can be done within the brain, manipulating data, images, and sounds within a mental interface (hopefully less feeble than it is currently). And when you want to share work with others, just – I don’t know – mindbeam it to them, I guess, if your brain has been enabled with WiFi 802.11n capabilities. Or is on the EDGE network. Hopefully our robot masters at VerizoSprinAT&T will have figured it out by then.

The possibilities are endless: having a mental iPod; downloading textbooks worth of information into your brain; sharing ideas without speaking; backing up your entire brain to a 500 petabyte hard drive. I mean sure, it will require a huge social change. Forget wire tapping. Heck, forget the Matrix. In a wireless mental computer world privacy will mean nothing – what kind of firewalls can you put around your memories? Everything in your mind would have to be just as accessible as everybody else’s brains are to you. Can you imagine the chaos? Remember Mel Gibson in “What Women Want”? Picture that going both ways. The upside: we might be permitted to understand what goes on in George W. Bush’s head (my guess: a constant double feature of Dennis Madalone’s “America, We Stand As One” video back to back with “Red Dawn“).

But seriously, the above suggestion of a mentally interfacing, biopowered, ethereally networked world might not be completely off the wall. Some scientists suggest that many animals have a kind of ESP that negates the need for physical communication. In early evolution, prior to the creation of language, humans might have had this capability. Should technology step up to the plate, we’d essentially be creating a sophisticated version of what Mother Nature might have originally given us.

And we wouldn’t need that stupid trackball.

Did I hit all the potential interface methods out there? Tell me.

Read Full Post »

Just want to give props to Radiolab, which recently did a great show on memory and the brain mechanisms behind making and recalling memories. Among the secrets they uncover: it turns out that the more you recall something – an event, a face, a story – the less real it becomes! Like tellers of tall tales, the brain constantly recreates events as they are remembered, adding and subtracting details. Could you imagine if computers were that faulty?
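Since we’re imagining computers that faulty: here’s a playful simulation of recall-as-rewriting, where each retelling randomly drops a detail or invents a new one. All the details are, fittingly, made up.

```python
import random

# Playful simulation of "recall rewrites the memory": each recall copies
# the story but may randomly drop a detail or invent a new one.
random.seed(0)  # fixed seed so the drift is reproducible

DETAILS_POOL = ["red car", "blue car", "rainy", "sunny", "two dogs", "one dog"]

def recall(memory):
    """Return a fresh copy of the memory, slightly rewritten by the retelling."""
    memory = list(memory)
    if memory and random.random() < 0.5:
        memory.pop(random.randrange(len(memory)))     # a detail fades away
    if random.random() < 0.5:
        memory.append(random.choice(DETAILS_POOL))    # a detail is invented
    return memory

story = ["red car", "rainy", "two dogs"]
for _ in range(5):
    story = recall(story)
print(story)  # drifts further from the original with every retelling
```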

Anyway, listen to the story here.

Read Full Post »

