Sensors in Smartphones: Beyond Landscape and Portrait Screen Orientation

April 9, 2012
Well over 80 percent of smartphones today include one or more inertial sensors. However, the only sensor-enabled feature ubiquitously adopted so far is managing screen orientation: landscape or portrait.

Analysts project that the smartphone and tablet industry will soon consume over $2 billion of sensors annually. Yet, for all that, the top mobile apps rarely involve sensors. App developers say using sensors is hard. And it is hard, because sensors present metrological measurements of the physical environment, and without proper perspective and interpretation those measurements are often meaningless.

Early industry efforts built on the rudimentary sample code available from some sensor manufacturers have been unsatisfactory. For example, users have complained that the heading reported by electronic compass apps on many smartphones can be off by over sixty degrees, making the phones marginally useful at best as navigation aids.

Today, sensor manufacturers have realized that algorithms and software are essential elements of their product offerings. Independent middleware developers have created sensor libraries that not only provide accurate headings by keeping the sensors properly calibrated, but also mitigate the distorting effects of external magnetic interference.
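
To illustrate the kind of computation such a library performs, here is a minimal sketch of a tilt-compensated compass heading, assuming aerospace axis conventions (x forward, y right, z down) and an already-known hard-iron offset; a production library would estimate the calibration continuously and also correct soft-iron distortion and transient interference.

    import math

    def tilt_compensated_heading(ax, ay, az, mx, my, mz,
                                 hard_iron=(0.0, 0.0, 0.0)):
        """Magnetic heading in degrees, clockwise from magnetic north.

        Axes follow the aerospace convention (x forward, y right, z down).
        A minimal sketch: a real library would estimate the hard-iron
        offsets continuously and correct soft-iron distortion as well.
        """
        # Remove the constant bias from magnetized parts near the sensor.
        mx, my, mz = mx - hard_iron[0], my - hard_iron[1], mz - hard_iron[2]

        # Estimate tilt from gravity; assumes the device is quasi-static.
        roll = math.atan2(ay, az)
        pitch = math.atan2(-ax, math.hypot(ay, az))

        # De-rotate the magnetic vector into the horizontal plane.
        bfx = (mx * math.cos(pitch)
               + my * math.sin(roll) * math.sin(pitch)
               + mz * math.cos(roll) * math.sin(pitch))
        bfy = my * math.cos(roll) - mz * math.sin(roll)

        # The horizontal field angle is the heading, normalized to 0..360.
        return (math.degrees(math.atan2(-bfy, bfx)) + 360.0) % 360.0

Without the tilt compensation in the middle of this routine, simply tipping the phone away from horizontal would swing the reported heading by tens of degrees, which is exactly the complaint leveled at early compass apps.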

By including these advanced sensor libraries, smartphone and tablet OEMs allow app developers to track the movements of the mobile devices and their users, so that, by observing motion, applications can let users interact with their devices through new, convenient gestures. For example, when a user brings the phone to her ear, an app can automatically ready itself to accept voice commands.
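
As a toy sketch of how such a gesture might be recognized, the function below checks a short accelerometer trace for a lift followed by the phone settling roughly vertical against a covered proximity sensor. The thresholds, sample format, and two-phase logic are hypothetical; shipping recognizers are tuned on recorded gestures or use trained classifiers.

    import math

    # Hypothetical thresholds; real products tune these on recorded gestures.
    LIFT_G = 1.3           # peak |a| in g that signals the arm is moving
    UPRIGHT_TILT_DEG = 40  # phone roughly vertical, as when held to the ear
    HOLD_SAMPLES = 25      # ~0.5 s of settled samples at a 50 Hz rate

    def detect_raise_to_ear(samples, proximity_near):
        """Return True if the trace looks like a raise-to-ear gesture.

        `samples` is a list of (ax, ay, az) tuples in g; `proximity_near`
        is the latest proximity-sensor reading (True when covered).
        """
        # Phase 1: a transient above 1 g suggests the phone was lifted.
        lifted = any(math.sqrt(ax*ax + ay*ay + az*az) > LIFT_G
                     for ax, ay, az in samples)

        # Phase 2: at the ear, gravity acts mostly along the device's
        # long (y) axis, so the angle between gravity and y stays small.
        def tilt_deg(ax, ay, az):
            mag = math.sqrt(ax*ax + ay*ay + az*az)
            return math.degrees(math.acos(min(1.0, abs(ay) / max(mag, 1e-6))))

        settled = all(tilt_deg(*s) < UPRIGHT_TILT_DEG
                      for s in samples[-HOLD_SAMPLES:])
        return lifted and settled and proximity_near

When the gesture fires, the app can open the microphone and begin listening for a command, with no button press required.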

However, even when we include motion tracking, we are still under-utilizing the capabilities of the sensors in mobile devices today. And, with advances in sensor design and manufacturing, increasingly powerful sensors are now available in smartphones at low cost:

  • accelerometers in smartphones today can pick up the vibrations of key strikes on an alphanumeric keyboard with such precision that a computer program can determine which keys are being struck;
  • magnetometers can detect the 50/60 Hz magnetic field emanating from a power cord;
  • barometers can notice the atmospheric change between floors of a building (made concrete in the sketch after this list);
  • and microphones can record ultrasound.
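
To make the barometer item concrete, the sketch below converts a pressure change into an altitude change with the standard hypsometric formula; near sea level, one storey of about 3 m corresponds to roughly 0.4 hPa, well within the resolution of modern MEMS barometers. The sample readings are hypothetical.

    import math

    def altitude_change_m(p_start_hpa, p_end_hpa, temp_c=15.0):
        """Altitude gained between two pressure readings (hypsometric formula).

        Assumes a uniform air column at temp_c, which is fine for the few
        metres that separate the floors of a building.
        """
        R = 287.05   # specific gas constant of dry air, J/(kg*K)
        g = 9.80665  # standard gravity, m/s^2
        return (R * (temp_c + 273.15) / g) * math.log(p_start_hpa / p_end_hpa)

    # Hypothetical readings at the bottom and top of one flight of stairs:
    print(altitude_change_m(1013.25, 1012.85))  # ~3.3 m, about one storey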

By comparison, human motions are relatively slow (below 20 Hz) and not very subtle. To take full advantage of a smartphone’s expanding portfolio of sensing capabilities, we must think beyond using sensors simply for motion and gesture processing, and begin using them to better understand a user’s contexts and intents. This expands the role of mobile devices beyond just sensing the environment, to ‘knowing’ the user well enough to offer considered advice.

Instead of asking users to learn a set of special gestures to interact with their devices, the devices will thus begin to learn more about their users. So when a user keeps her phone in her purse, the phone would keep its backlight off to avoid draining the battery. When a user is running to catch his train, the phone would send incoming calls to voicemail rather than disturb the runner. A smart mobile companion could even learn its user’s daily activities so well, including a habit of daily jogging, that it could suggest a nearby running trail when the user is in a foreign city.
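
Here is a toy sketch of how such context inference could begin, using nothing more than accelerometer variance and the ambient-light sensor. The thresholds and labels are hypothetical, and a real context engine would fuse many sensors through trained classifiers rather than fixed rules.

    import statistics

    # Hypothetical thresholds; a shipping product would learn these per user.
    DARK_LUX = 5.0     # little light reaches a phone stowed in a purse
    RUNNING_VAR = 4.0  # variance of |a| (m/s^2) over a ~2 s window

    def infer_context(accel_magnitudes, ambient_lux):
        """Map a short window of sensor data to a coarse user context.

        `accel_magnitudes` is a list of |a| samples in m/s^2 covering a
        couple of seconds of recent history.
        """
        var = statistics.variance(accel_magnitudes)
        if ambient_lux < DARK_LUX and var < 0.5:
            return "stowed"   # keep the backlight off
        if var > RUNNING_VAR:
            return "running"  # divert incoming calls to voicemail
        return "idle"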

At the dawn of this age of smart mobile devices, it’s worth remembering the vision that begat General Magic and Apple’s Newton 20 years ago: “a dream of improving the lives of many millions of people by means of small, intimate, life support systems that people carry with them everywhere.”1

While the visions of the pioneers outstripped the technology available at the time, new inventions continue to deliver on that dream. To become intimate, a mobile device must know what its user is doing, and in what environment. To provide life support, the device must understand life, not as a continuous sequence of motion, but as snapshots of various activities.

By understanding users’ activities, mobile applications can become ever-present and intimate companions to their users, monitoring their situations, discerning their habits, and better inferring their contexts and intents. This in turn will shape even newer applications of sensors, and the vision will move closer to realization.

1 General Magic company mission statement, May 1990 – Mountain View, CA.

About the Author

Ian Chen | Executive Director, Industrial & Healthcare Business Unit, Maxim Integrated

Ian Chen works at Maxim Integrated as an executive director for Industrial and IoT Sensors. He has more than 25 years of semiconductor experience, with the last 15 years focused on sensors and sensing applications. He has held senior business, marketing, and engineering leadership positions at both startups and multi-national corporations, and developed a track record of identifying emerging opportunities early and executing to capture them. He was identified by EE Times as one of the “Top 40 technologists to watch” and included in the “Top 50 Who’s Who in Sensor Technology” by Fierce Electronics.

Ian has a Bachelor’s and Master’s in electrical engineering, as well as an MBA, all from the University of Illinois at Urbana-Champaign. He holds more than 20 patents.
