How We Speak to Computers and Why that Matters

Apple’s 3D Touch technology, introduced in the new iPhone 6S and 6S Plus, launched in late September, and according to initial feedback, customers are impressed with it.

3D Touch technology refers to a next-generation “touch” interface in which the screen distinguishes three levels of finger pressure: a tap, a press, and a deeper press. These subtle pressure differences let the user reach information on the device much faster.
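The idea of three pressure levels can be pictured as simple force thresholding. This is a hypothetical sketch, not Apple’s implementation; the function name and threshold values are invented for illustration:

```python
def classify_press(force: float) -> str:
    """Classify a normalized touch force (0.0 to 1.0) into one of
    3D Touch's three levels. The thresholds here are invented for
    illustration; Apple's real force scale and APIs differ."""
    if force < 0.3:
        return "tap"          # light, quick touch
    elif force < 0.7:
        return "press"        # firmer touch
    else:
        return "deeper press" # hardest touch
```

In practice, a tap might open an item, a press preview it, and a deeper press act on it.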

Okay, but have you given much thought to touchscreens lately, or to how many times you use one in your day-to-day life?

Ever go to an ATM?

Sign your name when you use credit at the grocery store?

Buy gas and pay for it using “credit outside”?

Use a touchscreen laptop at work?

Use the kiosk at the airport to check in for your flight?

Use an iPhone or any smartphone?

Yep, that’s touchscreen technology.

It’s fascinating how much we use touchscreen technology to live, work, and manage our lives.

Touchscreen technology is a result of HCI.

In the 1980s, touchscreen technology was in its infancy. But where did it originate?

Have you heard of Human-Computer Interaction (HCI)? Back in the early 1980s, HCI was a brand-new field of study, represented by researchers, engineers, technology professionals, cognitive scientists, and devoted hobbyists. At the time, HCI was deeply interested in the theory of usability: how to make technology easier for humans to use, and how to make that use intuitive. HCI researchers worked on tools and applications that let people interact easily with computers, and through them, with each other. Email is a direct result of HCI.

HCI has sure come a long way, and it shows in how you and I spend our days and evenings: tapping social media updates into our phones, dialing numbers, pressing buttons and icons to pull up more information. Press, press, press!

Let’s go back… further than the 1980s… further than Steve Jobs and Apple.

Have you heard of Herman Hollerith and Joseph Jacquard?

In 1801, the Frenchman Jacquard became famous for industrializing the loom by incorporating wooden punch cards, letting it weave intricate patterns automatically, without human intervention. Of course! How do you think great-great-great-grandma’s rugs were made? Punch cards are considered an early precursor of modern computer programming languages.
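Conceptually, each punch card tells the loom which threads to raise, and the pattern emerges from the sequence of cards. Here is a toy sketch of that idea; the card format and function names are invented for illustration, not a model of a real Jacquard mechanism:

```python
# Each "card" is a row of holes: 1 = hole (raise the thread), 0 = no hole.
# Feeding the cards through in order produces the pattern with no human
# decisions along the way -- the cards ARE the program.
cards = [
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 1, 0, 0],
]

def weave(cards):
    """Render each card as one row of woven fabric ('#' = raised thread)."""
    return ["".join("#" if hole else "." for hole in row) for row in cards]

for row in weave(cards):
    print(row)
```

Swapping in a different deck of cards changes the pattern without changing the machine, which is exactly the separation of program from mechanism that later computing inherited.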


Jacquard’s loom, functioning via wooden punch cards.

Image is from: http://www.computersciencelab.com/ComputerHistory/HistoryPt2.htm

Herman Hollerith, an American statistician and inventor, took the novel innovation behind Jacquard’s loom and adapted it into a “successful information processing system.” That system became a counting machine Hollerith used to tabulate figures for the U.S. census. A far cry from a loom, eh?

Herman Hollerith’s 1890 tabulating machine that was used for the 1900 U.S. Census.


Image from: http://www.columbia.edu/cu/computinghistory/census-tabulator.html

After 1890, Hollerith broadened the capabilities of his counting machine, adapting it for accounting, warehousing, and shipping applications. He founded the Tabulating Machine Company, which through later mergers became IBM. Today, Hollerith is known as the father of modern machine data processing. Pretty cool, eh? But very “manual.” How far we have come…

Now, let’s return to more current times.

The first graphical user interface (GUI) was developed at Xerox PARC in the 1970s and commercialized in the early 1980s (How the Human/Computer Interface Works, Livescience.com). There is an industry legend that Apple co-founder Steve Jobs saw this technology on a visit to PARC and brought the ideas back to Apple, where they eventually shaped the Apple Lisa and the Macintosh (Did Steve Jobs Steal Everything from Xerox PARC?, MacHistory.net).


The 1984 Apple Macintosh desktop (Mac OS), aka its “graphical user interface.”

Image from https://en.wikipedia.org/wiki/Macintosh

All in all, the field of HCI is experiencing rapid growth. Case in point: ever heard of the Xbox 360 and the Nintendo Wii? Of course you have. Consumers have spent over $60 billion on video games, and the hottest titles are the ones that let you use your body to control the game. This approach is known as a Natural User Interface (NUI) because the technology effectively becomes “invisible”: the player tells the game what to do through spatial gestures and body movements rather than a joystick or game controller.

Kinect, developed by Microsoft, is the NUI technology that helped make the Xbox 360 such a popular video game system, allowing users to control games with their bodies and with voice commands.

HCI has come a long way since the wooden punch cards of two centuries ago. Where do we go from here? The final frontier may be the ability to control external devices with thought commands alone.

Sources:
https://www.interaction-design.org/literature/book/the-encyclopedia-of-human-computer-interaction-2nd-ed/human-computer-interaction-brief-intro
http://www.livescience.com/37944-how-the-human-computer-interface-works-infographics.html
http://www.mac-history.net/computer-history/2012-03-22/apple-and-xerox-parc
http://www.technologyreview.com/news/534206/a-brain-computer-interface-that-works-wirelessly/
http://www.computersciencelab.com/ComputerHistory/HistoryPt2.htm
http://www.columbia.edu/cu/computinghistory/census-tabulator.html