How Human-Computer Interaction (HCI) Helps to Guide Better UI Design

By Carey Wodehouse

Before laptops, smartphones and mobile applications; before graphical user interfaces (GUIs), browsers and search engines; before user interface (UI) and user experience (UX) design, there was human-computer interaction (HCI)—the study of better, more intuitive ways for humans to interact with technology. Like Design Thinking, HCI is centered on the user: how they behave, how they interact with technology, and what their needs and goals are.

This broad discipline predates UI design and UX design as the very first way programmers sought to make the early desktop computers more user-friendly. Here’s a look at HCI’s beginnings, its evolution, and how it can continue to inform modern UI/UX design.

The Evolution of HCI and the User Interface

In the 1970s, computers with text-only commands and clunky interfaces were popping up everywhere, and they were understandably confounding to the average user. This “software crisis” bloomed as computers became more widespread but remained virtually unnavigable by non-engineers. For programmers, this presented a specific problem: how to make computers easier to interact with.

The need arose for HCI—a new discipline merging cognitive science (how the mind works) and engineering (how computers work). As one of the earliest examples of cognitive engineering, HCI’s many models, theories, and frameworks created a new vision for technology: to empower users by understanding how they think and what they need.

By the 1980s, HCI was a narrow but revolutionary discipline that was paving the way. The focus shifted to productivity and making programs on personal computers easier to use. Enter computer graphics and the visual “desktop,” whose icons gave users a visual way to organize and find files and folders. The first user-driven interfaces—accessed via the keyboard, the mouse, and even the lowly cursor—were designed to be simple, useful, and intuitive, thanks to HCI. Users could now interact with computers through clicks, not typed commands.

The visual desktop had its limits, though, and was difficult to use at scale. With the rise of the internet in the 1990s, search took over as an easier way for people to find things on their computers. Activity moved from the desktop to the browser, and computers evolved into conduits for different tasks—email, chat, web browsing, photo sharing, and so on. In light of this, the goal became to design interfaces that were virtually invisible to the user—so easy to use that they became second nature, leaving the task itself as the focus.

Fast-forward to today—when hardware, software, and the way we interact with computers are constantly evolving—and HCI remains relevant for UI designers and engineers alike who want to investigate the “Why” behind the “How” of the interfaces they design.

The answer to that “Why?” will almost always be: To make interfaces “easy to learn, and easy to use,” an early mantra of HCI.



Source: Business 2 Community
