Monday, December 7, 2009

HOME MODIFICATIONS / EQUIPMENT REQUIRED

Last time I posted, I was still able to drive my car. My arms and hands have now grown so weak that this is no longer possible. Typing has also become very difficult, and after typing I usually have a substantial surplus of letters on my screen.

Standing, carefully, with support, is still possible. I now have a new electric wheelchair with power-adjustable leg-rests, back and seat. I sit in this all day, and because it is adjustable, it is comfortable. We have widened two doorways to 34" to allow good wheelchair access. In the garage we have a ramp so that I can go outside, or get into our wheelchair-accessible modified van.

I also have an electric hospital bed that can be adjusted as needed. To transfer we need a rolling lift, and the lift will not roll on carpet while it is lifting me. We solved this problem by removing the carpet and installing laminate flooring in our bedroom. We have also installed a roll-in shower in our bathroom.

I am using breathing support 24 hours a day, and I have two Synchrony BiPAP ventilators and two batteries. One is mounted, with a battery, on my electric wheelchair; the other is used beside my bed, or at any other stationary position. The BiPAP machines run on both 12 V DC and 110 V AC, and can be plugged into a car cigarette-lighter outlet for travel. With two machines there is always a backup, should one fail.
It is important to charge all the batteries, and the wheelchair, overnight, so that you can ride out a power failure.

Since ALS leads to total paralysis, some foresight and planning are required. In that condition, how will you communicate anything? How will you convey even basic daily needs? How will you respond to anything, or anyone?

New technology allows you continued communication. Even with total paralysis, the eyes will continue to function. The material below is taken from the MyTobii website; see the link below.

The power of a computer at your… eyes

One could say that an eye-controlled AAC device (Alternative and Augmentative Communication) is a computer – but instead of using a mouse and keyboard to control it, you use your eyes. Most eye-controlled devices are non-intrusive, meaning that you don’t have to wear or hold anything - you simply position yourself in front of the screen and look at it.

The device keeps track of where you are looking by sending out near-infrared light and measuring its reflection in the eyes. To click, you either stare at a point, blink, or use a switch.
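Selection by staring is usually implemented as a "dwell click": the click fires once the gaze has stayed inside a small region for long enough. Here is a minimal sketch of that logic; all names, units and thresholds are illustrative, not taken from any actual Tobii API.

```python
def detect_dwell_click(gaze_points, radius=30.0, dwell_samples=30):
    """Return the sample index at which a dwell 'click' fires, or None.

    gaze_points: list of (x, y) screen coordinates, one per sample.
    A click fires once the gaze stays within `radius` pixels of the
    spot where the dwell started for `dwell_samples` consecutive samples.
    """
    anchor = None
    count = 0
    for i, (x, y) in enumerate(gaze_points):
        if anchor is None:
            anchor, count = (x, y), 1
            continue
        ax, ay = anchor
        if (x - ax) ** 2 + (y - ay) ** 2 <= radius ** 2:
            count += 1
            if count >= dwell_samples:
                return i  # gaze held still long enough: click
        else:
            anchor, count = (x, y), 1  # gaze moved; restart the dwell timer
    return None
```

At a 30 Hz gaze sample rate, `dwell_samples=30` would correspond to roughly a one-second stare; real systems make both the radius and the dwell time user-adjustable.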


More than meets the eye

You can communicate your thoughts, ideas and wishes by typing text or using symbols that can be turned into speech, sms messages, e-mails etc. Since you are looking at a computer screen, the text or symbols that you use can be changed dynamically to suit different situations. For example, you could have dynamic displays with one layout or communication page for use at home, another one at the store or in school and yet another in hospital, making communication much easier.

Eye-controlled devices are about more than typing a message. Most devices are built on a Windows platform which makes it possible to play games, watch videos, control the TV and other devices, and access the internet etc. - just what you’d expect from a regular computer.

You control your device… not the other way around

Controlling a device with your eyes is nothing new. The technology has been around for a while, but at Tobii we have refined it to fit the various needs of people with communication and physical disabilities. Superior quality, ease of use and reliability are hallmarks of MyTobii devices.

One thing that sets our products apart from others is the easy-to-use calibration and reliable eye tracking. Most users require less than ten seconds to calibrate the system, and once calibrated the system is extremely stable. The MyTobii products operate accurately, regardless of large head movements, glasses, eye color or lighting conditions.





MyTobii P10

MyTobii P10 is a portable eye-controlled communication device. Everything, including a 15” screen, eye control device and computer, is integrated into one unit. Just connect to a power source, such as a wall socket, power wheelchair or separate battery. The device can be mounted for use at a desk, on a wheelchair, in bed or anywhere suitable for the user.

Who is helped by MyTobii?

Link to MyTobii

Users with disabilities such as:

* Cerebral Palsy
* ALS (Amyotrophic Lateral Sclerosis)
* High level spinal injuries
* Multiple Sclerosis

With powerful, state-of-the-art features, MyTobii brings eye control to a whole new audience of users. The interface can be configured for many different skill levels, making it flexible enough to serve many different user groups.
Dependable and easy operation

* Easy to set up
* Fully automatic
* Quick, easy one-time calibration
* Mounts to desk, bed or wheelchair
* Easy to adjust to the needs of different user groups


The eye tracker doesn't need the user to “do” or “wear” anything. Simply sit in front of it, follow a dot during a 10-second calibration and you are on your way! With flexible mounting, easy customization and different hardware options, every situation is catered to.
Reliable tracking

* No undue restraint on head movements
* Tracks nearly everyone, even those with glasses or contacts
* Works in most lighting conditions
* Compensates for large head movements without errors
* Very high accuracy

MyTobii can be relied upon to keep working. MyTobii will operate accurately regardless of glasses, contacts, and eye color or light conditions. With MyTobii, those with uncontrolled head movements can use eye control for the first time. Tobii's unique ability to deal with large head movements opens up eye control to user groups who have been unable to use other systems.
Robust design

* Eye tracking components fully integrated
* Robust casings will survive real-world use
* New portable unit with computer processing built in

Tobii Technology's world-leading eye tracking techniques and an integrated, rugged design make the MyTobii eye tracking system extremely accurate, portable and tough enough to handle real life. With no fragile external cameras or lighting units, your system can go everywhere with you. High accuracy and powerful interaction techniques mean MyTobii will also keep up with you.


Tobii Eye Tracking technology

Tobii’s eye tracking technology utilizes advanced image processing of a person’s face, eyes and reflections in the eyes of near-infrared reference lights to accurately estimate:

* the 3D position in space of each eye
* the precise target at which each eye's gaze is directed

Key advantages

Tobii has taken eye tracking technology a significant step forward through a number of key innovations. Key advantages of Tobii’s eye tracking technology are:

* Fully automatic eye tracking
* High tracking accuracy
* Ability to track nearly all people
* Completely non-intrusive
* Good tolerance of head-motion

Wednesday, October 14, 2009

Keyboard help for ALS patients

Since I can no longer type, I am now using this (slow) system.
RECOMMENDED!




Click-N-Type is an on-screen virtual keyboard designed for anyone with a disability that prevents him or her from typing on a physical computer keyboard.

As long as the physically challenged person can control a mouse, trackball, touch screen or other pointing device, this software keyboard can send keystrokes to virtually any Windows or DOS application that can run within a window.

The Click-N-Type Virtual Keyboard is a 32 bit application that requires Windows 95/98/ME/NT/2000/XP/Vista or later. There are other onscreen virtual keyboards around but you'll find Click-N-Type the easiest to use for getting text into those uncooperative places like browser URL "Address:" fields, Email "To:" addresses, Email "Subject:" fields, dialog boxes like "Open" and "Save As...", and many other problematic applications. Try them all. You'll see they all work fine while typing into Notepad or WordPad, but when you attempt to do some real work, with all but the expensive ones, you'll get really annoyed really fast.

The Click-N-Type Soft Keyboard was designed with ease of use foremost in mind. Oh yes, it's FREE. If you need it, you can have it. I've seen too many people trying to make money off disabled people. Of course, if you'd like to drop Bridget a line at Click-N-Type@Lakefolks.com with any comments, I guess I couldn't stop you. If you do, we can keep you informed of any fixes and/or future enhancements.

To download: http://cnt.lakefolks.com/

Thursday, October 1, 2009

Panasonic Shows Robotic Bed That Becomes Wheelchair

Panasonic has developed a robotic bed that transforms into a wheelchair at the command of the user. It's designed for people who have limited mobility and is intended to provide an extra level of independence.

When lying in the bed, the user can summon the robot's computer system by simply calling out "robotic bed." In a demonstration at the International Home Care and Rehabilitation Exhibition in Tokyo this week, that elicited the answer, "Yes, what may I do for you?"

Users can demand that the robotic mechanism lower or raise the head or foot of the bed or make the complete transformation into a wheelchair.

When the user asks for the wheelchair, the sides of the mattress, which is divided into several pieces, move away so the resulting chair is narrower than the entire bed. The user's back and head are raised as are the feet and the central part of the bed gently slides out to the side. The feet then fall and the back and head rise some more so that the user ends up in a comfortable sitting position.

The resulting wheelchair resembles the sort of seat you might see in business class on an aircraft and is fully robotic itself. A joystick on the right armrest can be used to control the wheelchair's movement.

"It took us a year from the start of development to showing it here today," said Yukio Honda, a visiting professor from Osaka Electro-Communication University who is working at Panasonic's Robot Development Center.

Before following each voice command the wheelchair checks to make sure the action that it understood is indeed the one desired. The user must affirm with a "yes" before the bed carries out the command.

In the wheelchair mode it can also detect people and obstacles in its way to safely guide the user around them.

In building the bed Panasonic sought to fit as much IT into it as possible.

An LCD touch-panel is located on a canopy that sits above the bed and means the user can watch TV, connect to the Internet, check home security cameras or make video calls from the bed.

"Even if you are lying in bed you can keep in touch with your family in another room," said Honda.

The wheelchair is capable of automatically docking with the bed. It just needs to be brought close and a button pushed for the procedure to begin.

Panasonic is planning to test the bed in several care homes both in Japan and overseas but there remains a legal obstacle before it can become a product. Safety standards and laws concerning home-help robots are yet to be considered in many countries so until they are codified, and manufacturer liability limited, the bed will likely remain off the market.


http://www.pcworld.com/article/172943/panasonic_shows_robotic_bed_that_becomes_wheelchair.html?tk=rss_news

Wednesday, July 22, 2009

Cutting edge new developments for ALS patients

The Audeo is being developed to create a human-computer interface for communication. When a person intends to speak their brain sends muscle instructions in the form of electrical signals through the nervous system. These electrical signals stimulate the muscles to, under normal circumstances, produce the desired speech. In many cases however, disease or disability can prevent the muscles from responding to this stimulation. The Audeo gets around this by directly utilizing the electrical activity itself, which even in severe cases can still be present.

The "AUDEO" will give me
the thing I need more than
anything else; the ability
to talk to my children.

Dave (age 41, diagnosed with ALS in 2004)

The Audeo sensor is a highly integrated, extremely sensitive sensor of neurological activity. It noninvasively detects electrical activity at the surface of the skin. The Audeo actively processes this activity to make it a robust control signal. This control signal can then be connected to a speech engine in various ways to give back to a person, the ability to communicate.
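The paragraph above describes the general pipeline: detect weak electrical activity at the skin's surface, then process it into a robust control signal. The Audeo's actual processing is proprietary, so the following is only a generic sketch of that idea (rectify, smooth with a moving average, then threshold), with invented parameter values.

```python
def emg_to_control(samples, window=5, threshold=0.5):
    """Turn a raw surface-electrode signal into on/off commands.

    samples: raw voltage readings (may swing negative).
    Returns a list of 0/1 control values, one per input sample.
    """
    rectified = [abs(s) for s in samples]   # polarity carries no intent
    controls = []
    for i in range(len(rectified)):
        lo = max(0, i - window + 1)
        # moving average over the last `window` samples smooths out spikes
        avg = sum(rectified[lo:i + 1]) / (i + 1 - lo)
        controls.append(1 if avg >= threshold else 0)
    return controls
```

A binary signal like this is enough to drive a scanning speech interface; richer control (proportional cursor movement, multiple channels) needs correspondingly richer feature extraction.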

Current Applications of The Audeo:

The AudeoBasic for Communication– We have developed a method for individuals who have lost the ability to speak to be able to communicate again by using the neurological signal from the brain to control a speech generation software.

Future Possible Applications of The Audeo:

Wheelchair Control – By incorporating the Audeo with additional hardware, we have successfully controlled a wheelchair without the need for physical movement. To see the wheelchair in action, watch the wheelchair demonstration.

see also: http://www.theaudeo.com/tech.html

ANOTHER, RELATED ARTICLE, IN DER SPIEGEL, GERMANY, ON AUGUST 27, 2009:

08/26/2009

Playing With Your Head
The Dawning Age of Mind-Reading Machines

By Hilmar Schmundt

Imagine controlling machines, typing text or juggling balls using nothing but the power of thought. What sounds like far-fetched science fiction is gradually becoming possible, providing hope for disabled patients -- and new gimmicks for the computer gaming industry.

DIZ SENTENS IS WRUTEN WID TAUGHTS. No keyboard, no hands, no blinking even. I think, therefore I write.

My original plan was to write this article with nothing but the power of thought, but the technology of transforming ideas into characters is still crude and prone to error. The first word alone took a few minutes, and even after that the result was still "diz" instead of "this."

Still, that little sentence is like a little miracle. The old dream of mind-reading is slowly becoming reality -- though this time around it is the product of machines rather than the minds of fiction writers.

"The advances are tremendous," says Christoph Guger, the developer of a brain-reading system. "In the past, you would have had to train for days. Today, entering text takes only a few minutes."

Guger is an engineer and a businessman. But with his hair falling past his jacket's collar, he looks the part of a start-up entrepreneur. Still, he is certainly not new to the business. His company, Guger Technologies, which is based in the Austrian city of Graz, has been a supplier to countless brain-research laboratories for years. In addition to scalpels and medications, though, Guger also sells thought-transport technology.

Guger recently presented his latest thought-reading system at a workshop entitled "Brain-Computer Interfaces" held at Berlin's Charité, one of Europe's largest university hospitals. The new electronic interfaces between brain and computer are referred to as BCI.

Hardware and Wetware

The goal of BCI is to enable the user to use thoughts -- instead of a keyboard, mouse or touch screen -- to control a computer's actions. But what sounds like telepathy is, in fact, quite banal. First the user puts on a device that looks like a bathing cap. Then an electrically conductive gel is squeezed through small holes in the cap onto the scalp. Finally, eight electrodes are plugged into the cap, and a colorful array of dreadlock-resembling cables are attached to the electrodes. The cables are then connected to the computer through a signal amplifying interface. So much for the hardware.

Then the "wetware" -- the term IT researchers sometimes use to refer to the brain -- takes over. An alphabet flickers across a screen in front of the subject, and the letters light up one at a time. The user waits until the letter he or she wants to use appears. When it lights up, the brain has an involuntary reaction that produces a small "electric potential," a tiny increase in voltage of about 15 microvolts, which is 100,000 times weaker than the voltage generated by a flashlight battery.

The principle is based on the tried-and-tested EEG, or electroencephalogram. The brain's small gray cells fire off electrical signals, which can be measured on the surface of the scalp. Guger's special method is called P300, a reference to the sudden fluctuation in voltage he is looking for, which appears in the visual cortex of the brain 300 milliseconds after each expected letter flashes on the screen.

This method is very rudimentary in the sense that it doesn't really read a thought but, rather, merely the average activity of millions of neurons. Likewise, those neurons may not be reacting only to letters, but also to someone else's sneeze or to tightness in the subject's left shoe. Guger's current task is to filter all of these interfering signals out of the chaotic flow of thoughts in the brain.
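The averaging that makes P300 spelling work can be sketched in a few lines: epochs time-locked to each letter's flashes are averaged so that uncorrelated noise (the sneeze, the tight shoe) cancels out, and the letter whose average shows the largest deflection at the P300 latency wins. This is a toy version under that assumption; real spellers add band-pass filtering, many repetitions and trained classifiers.

```python
def pick_letter(epochs, sample_at_300ms):
    """Choose the letter whose averaged epoch has the largest
    voltage at the P300 latency.

    epochs: dict mapping letter -> list of EEG epochs, where each epoch
    is a list of voltage samples time-locked to that letter's flash.
    sample_at_300ms: index of the sample ~300 ms after the flash.
    """
    best_letter, best_amp = None, float("-inf")
    for letter, trials in epochs.items():
        n = len(trials)
        # average across repetitions to suppress uncorrelated noise
        avg = [sum(t[i] for t in trials) / n for i in range(len(trials[0]))]
        amp = avg[sample_at_300ms]
        if amp > best_amp:
            best_letter, best_amp = letter, amp
    return best_letter
```

The more flash repetitions per letter, the cleaner the average and the lower the error rate, which is exactly the speed/accuracy trade-off that makes these spellers slow.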

Brain Caps for Pinball

As challenging as this task might be, there was still a lot of excitement at the Berlin workshop. The project involves a rapidly growing group of researchers hoping to capture thoughts with what can best be described as an electronic camera. "Ten years ago, there were perhaps a dozen research groups in this field," says Klaus-Robert Müller, the director of the Machine Learning/Intelligent Data Analysis Group at the Technical University of Berlin. "Now there are more than 200."

Müller has also developed a BCI that works in a similar way to Guger's mental typewriter. His system isn't based on tediously poking around in a jumble of letters but, rather, on lightning-fast reactions. To develop the system, Müller has his subjects use their brain caps to play pinball.

When they think "right," the lever on the right side pops up. As if moved by the hand of a ghost, it flings the ball back into the game without anyone having touched the controls.

Müller is working closely with John-Dylan Haynes, one of the stars of the elite thought-reading community. Haynes attracted attention last year when he reported that, using a magnetic resonance imaging (MRI) machine, he had correctly guessed the decisions of his subjects before they were able to act in more than half of the cases. A full seven seconds before they moved a finger, Haynes could see that they were planning to press a specific button.

After Müller's presentation, Niels Birbaumer stood up to speak at the Berlin conference. Birbaumer is the director of the Institute of Medical Psychology and Behavioral Neurobiology at the University of Tübingen in southwestern Germany and is considered a pioneer in the field. For years, he has been trying to teach people with physical handicaps to control their wheelchairs or prosthetic limbs using only the power of thought. "Our successes are still modest," he says, "but I'm already totally crazy with hope." He expects that within two years he will, for the first time, be able to use his system with locked-in patients -- that is, paralyzed individuals who are fully aware of their surroundings but can move nothing other than their eyes.

Communication with these types of patients is something of a Holy Grail in the profession because it promises to make it possible to help locked-in people like the lead character in the 2007 film "The Diving Bell and the Butterfly." In the film, an almost totally paralyzed patient dictates his memories to his therapist using the only means of communication he has left: blinking.


Part 2: Eavesdropping on Dreams

Completely locked-in patients, on the other hand, can't even blink. All they can still control is their neurons.

The prospect of getting a glimpse of those thoughts is by no means hopeless. Birbaumer has already demonstrated that the brain waves of locked-in patients respond to various pieces of music, familiar faces and grammatical errors.

The BCI experts now believe that their technology is on the verge of a major breakthrough. Their successes are indeed astonishing, but they still have a long way to go before realizing their ambitions. Last fall, Japanese researchers reported in the journal Neuron that -- with the help of a technology known as functional magnetic resonance tomography -- they had observed the brain processing certain images. This promptly led them to speculate that one day it might even be possible to eavesdrop on the "illusions and dreams" of the brain.

In an atmosphere in which everything seems possible, though, there is a great temptation to hope that the power of thought can somehow make the pitfalls of technology magically disappear. But we still haven't reached the point where we can control things with thoughts alone.

For example, for many years now, the American firm Emotiv has advertised a system that allows paralyzed people to control their wheelchairs. But neither emotions nor thoughts are involved. Instead, Emotiv's technology is based primarily on signals produced by facial muscles. It has everything to do with smiling and blinking -- and nothing to do with controlling with your thoughts.

But what happens if the day comes when we actually are able to drive cars -- or even fly fighter jets -- using our thoughts alone? The US Department of Defense finds this vision so promising that it has already invested $4 million (€2.8 million) to develop a certain kind of telepathy. The goal of the project -- dubbed "Silent Talk" -- is to enable soldiers to communicate with each other "on the battlefield without the use of vocalized speech through analysis of neural signals."

Opinions vary on how much of this research is science and how much is science fiction. A report by the MITRE Corporation, a consulting firm headquartered in Virginia, derisively describes brain control as a crude technology. According to the report, the problem is "in part due to the early stage of development of the associated technologies, and in part due to limited understanding of the central nervous system." The report soberly concludes that "the possibility for using such brain control in a military scenario is not readily apparent."

Even the prophets of the new era cannot deny that many systems are highly prone to error. In most cases, the detection rate is around 70 percent. In addition to the cable cap, the conductive gel and a computer, users also need a great deal of patience. Another problem is that about 30 percent of subjects have proven to be "EEG-illiterate." In other words, their brains remain impervious to the machines.

All too often, exorbitant promises are associated with this visionary technology. For example, in the game "Mindball," two players wearing EEG headbands compete by becoming as relaxed as possible. The player who is better able to relax his brain -- and thereby occasion uniform vibrations -- can drive a ball the farthest onto his opponent's side of the game table.

According to the companies that sell the game, it can provide significant benefits to players. They cite a study conducted by London's prestigious Imperial College that, they say, demonstrated that EEG feedback can improve academic performance and creativity.

Electrifying Audiences

The distinctions between science, art and slapstick are often vague, especially when people do things like artist Adi Hoesle, who produces "EEG sculptures" or sells colorful swirls on a canvas while claiming that they are images of his thoughts. One of his works is titled: "I'm so surprised by the red in my head."

Meanwhile, an orchestra in which some of the instruments are brain-controlled is performing in small theaters around Europe. Its musicians wear gimmicky, brightly colored EEG caps on stage.

Such theatrical effects are part of a long tradition, a sort of colorful flipside of science. For example, when so-called "natural philosophers" were studying electricity around 1750, they developed parlor games, such as producing sparks when a couple kissed. They also hoped to use therapeutic bursts of electricity to help lame people walk again. Their audiences were -- literally -- electrified.

Even a company like Mattel, known mainly for its Barbie Doll products, has now discovered the allure of brain control. The new system Mattel is introducing at computer trade shows is called "Mindflex." According to the company's fact sheet: "A true mental marathon, Mindflex exercises the brain in an entirely new way as players learn to continuously control their brain activity."

So, you ask, how does it work? To train the brain, the user puts on a headband with sensors at the temples and a cable connected to something that looks like a miniature golf course. Then the user tries to master the first task: balancing a small ball above an air current, causing it to levitate and making it pass through a plastic ring.

A cluster of curious onlookers has formed at the trade show. The players are doing their best not to get nervous, collect their thoughts and concentrate on the ball. Sure enough, the more they are able to descend into mental nothingness, the higher the ball hovers in the air.

Mattel refuses to divulge how the device works. Experts assume that the headband -- like the sensors used in Mindball -- measures alpha waves that pulse through the cerebral cortex at about 10 times a second when a person relaxes.
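If the experts' guess is right, the headband's job reduces to estimating how much of the signal's power falls in the 8-12 Hz alpha band, with a relaxed brain pushing that fraction up. Here is a sketch of that computation using a plain discrete Fourier transform; since Mattel has published nothing, every detail here is an assumption.

```python
import math

def alpha_power(samples, sample_rate=128):
    """Fraction of signal power falling in the 8-12 Hz alpha band.

    samples: one channel of EEG voltage readings.
    Uses a naive DFT over the positive-frequency bins (fine for a sketch;
    a real device would use an FFT and windowing).
    """
    n = len(samples)
    total, alpha = 0.0, 0.0
    for k in range(1, n // 2):              # skip DC, positive freqs only
        freq = k * sample_rate / n
        re = sum(s * math.cos(2 * math.pi * k * i / n)
                 for i, s in enumerate(samples))
        im = sum(-s * math.sin(2 * math.pi * k * i / n)
                 for i, s in enumerate(samples))
        power = re * re + im * im
        total += power
        if 8.0 <= freq <= 12.0:
            alpha += power
    return alpha / total if total else 0.0
```

A game like Mindball or Mindflex would then map this ratio onto the fan speed or ball position: the calmer the player, the stronger the alpha rhythm, the higher the score.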

In any event, playing these thought-controlled games produces an indescribable sensation. It's as if you were using a new muscle that you had only heard about but never experienced -- the organ underneath the top of the skull. Astonished audiences must have felt a similar sensation in the 18th century, when a kiss produced a spark.

It took another 200 years before it was discovered that what was once a cheap party trick could have many other uses and that the science on which it is based keeps the Internet, the stock markets, the global economy -- and thoughts -- running today.

Translated from the German by Christopher

source: http://www.spiegel.de/international/zeitgeist/0,1518,644296,00.html