Friday, June 28, 2013

Soon you'll be able to control your iPhone using your HEAD

Apple has added a feature to its new iOS 7 software
that lets users swipe between menus, select apps and
control the device's screen using just head movements.
When the Switch Control feature is enabled,
it will scroll through and highlight different onscreen sections
- including individual menus and apps.
When the required section is highlighted, users can move their
head to the left to select it.
Alternatively users can move their head to the right to go back
to the home screen or begin scrolling again.
These head movements are known as Switches, and they can be turned on
or off in the Switch Control menu.
Switches can be customised so that the different left and right
head movements control different menus and settings.
According to Apple fan site 9to5Mac, users can also make the
left or right head movements start voice-activated assistant
Siri, open the Notification Center and change the volume of
music, for example.
A head movement can also be used as an alternative to a
finger tap.
The full range of options that can be controlled using head
movements is in the Actions menu.
Users will also be able to change how quickly the software
scrolls between sections and whether there is a pause between
menu selections.
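The scanning behaviour described above - the software cycles through
onscreen sections, one head movement selects the highlighted item and
the other returns to the start - can be sketched as a simple state
machine. This is a hypothetical illustration of the general scanning
technique, not Apple's actual implementation; the class and method
names are invented for the example.

```python
class SwitchScanner:
    """Hypothetical sketch of switch-scanning: the highlight cycles
    through items, one switch selects, the other goes back."""

    def __init__(self, items):
        self.items = items  # onscreen sections (menus, apps, etc.)
        self.index = 0      # currently highlighted item

    def advance(self):
        # Auto-scan: move the highlight to the next section,
        # wrapping around at the end (speed would be configurable).
        self.index = (self.index + 1) % len(self.items)

    def head_left(self):
        # Left head movement: select the highlighted section.
        return self.items[self.index]

    def head_right(self):
        # Right head movement: go back and begin scanning again.
        self.index = 0


scanner = SwitchScanner(["Mail", "Safari", "Music"])
scanner.advance()                  # highlight moves to "Safari"
print(scanner.head_left())         # selects "Safari"
scanner.head_right()               # back to the start
print(scanner.head_left())         # selects "Mail"
```

In the real feature, the two switches can be remapped - e.g. to launch
Siri or open the Notification Center - which in this sketch would just
mean binding a different action to `head_left` or `head_right`.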
The tool has been launched in the second beta version of the
software, which was officially announced during Apple's
developer conference at the start of June.
Apple has made the beta releases available to developers to help
them develop apps for the platform. The software will then be
rolled out to users in October.
Other Apple hands-free technologies include its voice-activated
assistant Siri as well as iOS in the Car.
Apple plans to integrate phones with iOS 7 into the
dashboards of cars.
If a car is equipped with this technology, a user will be able to
connect their handset to the system and use their car's built-in
display and controls.
The Siri Eyes-Free tool also lets users control this system by
voice so they can keep their eyes on the road.
Samsung added eye tracking to its Galaxy S3 handset in May
2012 with the Smart Stay feature that can tell when a user is
looking at the display.
Its Samsung Galaxy S4, released in April this year, continued
this trend by adding Smart Pause, which pauses video when
the user's eyes wander from the screen.
The Samsung Galaxy S4 also has Air Gestures, which can detect
when a user's finger is hovering over the screen, letting users
swipe through menus without actually touching the display.
