HCI Overview Week 48 - HCI History
HCI Lecture Notes: A History of HCI
These notes compile and organize the professor's lecture on the history of Human-Computer Interaction (HCI). They include all major examples and key points (with details of demonstrations, technologies, and historical context). The professor's main goal is to provide a broad framework to understand why modern computing looks the way it does today, and how it evolved from early mechanical devices to the systems we have now.
Part I: From Mechanical to Digital Computing
1. The Differential Analyzer (circa 1930)
- Built at MIT around 1929–1930.
- A mechanical computer occupying an entire room.
- Programmed (set up) by manually connecting rods, gears, and wheels to solve differential equations faster than by hand.
- HCI Elements (even then):
- Real-time interaction: Operators could observe and tweak the machine's mechanical operations in real time.
- Physical controls and direct manipulation: Adjusting knobs, levers, etc.
- Immediate feedback loop between operator and machine, though in a very mechanical sense.
2. World War II and Early Electronic Computers
- WW II drove massive investment in computing for:
- Codebreaking (Enigma, Colossus in the UK, etc.).
- Artillery firing tables (ENIAC in the US).
- Manhattan Project (nuclear bomb design relied on large-scale calculation).
- ENIAC:
- One of the first fully electronic, Turing-complete computers.
- Size of a room; used vacuum tubes ("bugs" were sometimes literal insects causing malfunctions).
- Programmed by rewiring and flipping switches: a very physical form of "interface."
- Often required multiple human operators (many of them women who had worked as human "computers").
3. The Big Divide: AI vs. IA
- AI (Artificial Intelligence): Aims to emulate/replace human intelligence (John McCarthy as one of the founders).
- IA (Intelligence Augmentation): Computers as tools to extend human capabilities (J.C.R. Licklider's "Man-Computer Symbiosis").
- AI had "AI summers" (bursts of funding/interest) and "AI winters" (loss of interest/funding).
- IA (user-centric or human-centered computing) continuously influenced interactive computing development.
4. Vannevar Bush and the Memex (1945)
- Article: "As We May Think"
- Post-WWII observation: Information overload due to massive war research efforts.
- Proposed the Memex:
- A theoretical personal information machine (never built).
- Emphasized associative linking (precursor to hypertext).
- Envisioned multiple screens, a stylus for annotation, microfilm storage, and hyperlinks.
- Concept of "trails" of linked information, anticipating the World Wide Web's structure decades later.
5. The Transistor Revolution (1947 onward)
- Bell Labs invents the solid-state transistor (1947).
- More reliable, smaller, faster than vacuum tubes.
- Enabled computers to shrink from room-sized mainframes to eventually desktop devices.
- Moore's Law: transistor counts (and with them computing power) doubling roughly every two years, though now hitting physical limits; a quick back-of-the-envelope check appears below.
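As a rough illustration of what a two-year doubling implies (a sketch added for these notes, not from the lecture), the snippet below projects transistor counts forward from the Intel 4004's roughly 2,300 transistors in 1971; both the baseline and the perfectly regular doubling are simplifying assumptions.

```python
# Idealized Moore's Law projection: counts double every two years.
# The baseline (Intel 4004, ~2,300 transistors, 1971) is used purely
# for illustration; real growth has been uneven and is now slowing.

def transistors(year, base_year=1971, base_count=2_300, period=2):
    """Projected transistor count under an idealized doubling law."""
    return base_count * 2 ** ((year - base_year) / period)

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(f"{year}: ~{transistors(year):,.0f}")
# 2021 -> ~77 billion, the right order of magnitude for today's
# largest chips, which is why the "law" held for five decades.
```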
Part II: The Shift to Interactive Computing (1950s–1960s)
1. Early Interactive Machines
- MIT Lincoln Labs TX-2 (1958):
- Among the first computers designed for interactive use (rather than batch processing).
- Large display, light pen input, real-time feedback.
- Enabled multiple users via time-sharing.
- PDP-1 (circa 1960) from Digital Equipment Corporation:
- Commercial interactive computer, ~$120,000 (cheaper than million-dollar mainframes).
- Size of a couple refrigerators.
- Hacker culture developed here (people could directly experiment in real time).
- Spacewar! (1962):
- One of the first real-time digital games, vector-graphics-based.
- Multi-player: two ships dueling around a central star whose simulated gravity affects their motion.
- Demonstrated that computers weren't just for "serious" tasks; graphics and play were legitimate uses.
2. Ivan Sutherland's Sketchpad (1962)
- Created on the TX-2.
- Major innovations:
- Graphical user interface with a light pen.
- Direct manipulation: Drawing lines, circles, using a "rubber band" effect.
- Constraints: "Gravity" snapping endpoints or lines precisely (see the sketch after this list).
- The demonstration video shows how the user can:
- Draw, move, and constrain shapes in real time.
- Zoom in/out, use pen-based interactions.
- Precursor to modern CAD (Computer-Aided Design) and drawing programs.
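To make the "gravity" constraint concrete, here is a minimal sketch of the snapping idea (my own illustration, not Sutherland's actual constraint solver); the 10-unit snap radius is an arbitrary assumption.

```python
import math

def snap(point, endpoints, radius=10.0):
    """Return the nearest existing endpoint within `radius`, else the
    original point: the essence of Sketchpad-style gravity snapping."""
    px, py = point
    best, best_dist = point, radius
    for ex, ey in endpoints:
        d = math.hypot(ex - px, ey - py)
        if d < best_dist:
            best, best_dist = (ex, ey), d
    return best

endpoints = [(0, 0), (100, 0), (100, 100)]
print(snap((97, 3), endpoints))   # -> (100, 0): close enough, snaps
print(snap((50, 50), endpoints))  # -> (50, 50): nothing nearby
```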
3. "The Mother of All Demos" (Doug Engelbart, 1968)
- Famous 90-minute demo showcased years of research from SRI (Stanford Research Institute).
- Revolutionary features in a single demonstration:
- Computer Mouse (first public use).
- Hypertext for collaborative document editing.
- Windows on-screen, mixed text/graphics.
- Real-time video conferencing: Remote collaborators could see each other and point at on-screen content.
- Bimanual interaction: One hand on the mouse, the other on a chorded keyboard.
- Essentially predicted modern GUIs, online collaboration (like Google Docs), and even Zoom-like video calls.
4. First VR Prototype (1968)
- "Sword of Damocles" by Ivan Sutherland.
- Massive head-mounted display suspended from the ceiling.
- Real-time head-tracking, wireframe 3D graphics, mechanical and ultrasonic sensors.
- Marked the birth of Virtual Reality (VR) and Augmented Reality (AR) concepts.
Part III: Personal Computing (1970s–1980s)
1. Alan Kay and the Dynabook Vision
- Alan Kay at Xerox PARC envisioned a "Dynabook" (early 1970s):
- Personal, tablet-like form factor with a keyboard.
- Intended for children: computing as a creative medium accessible to everyone.
- Wireless networking predicted (though not feasible at the time).
- Led to Smalltalk (object-oriented programming language).
2. Xerox PARC and the Alto (1973)
- Xerox Alto (research prototype):
- First modern GUI: overlapping windows, icons, a mouse, and networking (Ethernet).
- Object-oriented software with Smalltalk.
- Extremely expensive ($10k–$70k in today's dollars), mainly for internal research.
- Influence: Set the stage for the WIMP (Windows, Icons, Menus, Pointer) paradigm.
3. 1977: The "Trinity" of Home Computers
- Commodore PET, Apple II, TRS-80:
- Priced from roughly $600 to $1,300; truly mass market.
- Built-in keyboards, BASIC programming, cassette tape storage.
- Millions sold, introduced personal computing into homes and schools.
4. The Killer App: VisiCalc (1979)
- First electronic spreadsheet (on Apple II).
- Automatic recalculation, cell grids, copyable formulas (a toy sketch follows this list).
- Immediate success: Sold 100k copies in the first month, 1–2 million soon after.
- Showed that productivity software could drive hardware sales.
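The core mechanic, automatic recalculation, is easy to illustrate. The toy sketch below is my own (the cell names and lambda-based formulas are invented for illustration, not VisiCalc's actual syntax): reading a cell recursively resolves whatever it depends on, so dependents update as soon as an input changes.

```python
# Cells hold either plain values or formulas (functions of `get`).
cells = {
    "A1": 10,
    "A2": 32,
    "A3": lambda get: get("A1") + get("A2"),   # =A1+A2
    "B1": lambda get: get("A3") * 2,           # =A3*2
}

def get(name):
    """Evaluate a cell, recursively resolving formula dependencies."""
    value = cells[name]
    return value(get) if callable(value) else value

print(get("B1"))   # 84
cells["A1"] = 100  # change an input...
print(get("B1"))   # 264: dependents recalculate automatically
```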
5. Xerox Star (1981) and the Desktop Metaphor
- Xerox Star: commercial successor to Alto.
- First fully realized desktop metaphor with icons, menus, consistent GUI.
- Very expensive (~$16k per machine, up to $50k for a complete system).
- Commercial failure but huge influence on later GUIs (Apple, Microsoft).
6. Apple Macintosh (1984)
- Affordable GUI (~$2,500), mainstreaming the desktop metaphor.
- Mouse-driven, consistent UI, WYSIWYG design (particularly for desktop publishing).
- No built-in networking yet (used "sneakernet" with floppy disks).
- Became famous for ease of use, design aesthetics.
7. Microsoft Windows (1985)
- Initially ran on top of MS-DOS.
- Bill Gates recognized the GUI trend.
- Lower hardware requirements and open hardware ecosystem (IBM PC compatibles) allowed broad adoption.
- By mid-1990s (with Windows 3.1, later Windows 95), it dominated the enterprise and consumer market.
8. NASA and VR (Mid-1980s)
- NASA Ames "VIEW" or "VIVED" systems (~1985).
- Advanced VR prototypes using headsets, gloves, etc.
- Continued the legacy of Sutherlandâs Sword of Damocles but still not consumer-ready.
9. Early Collaborative Editing (1989)
- Xerox PARC research on multi-user doc editors:
- Real-time editing by multiple collaborators on the same document.
- Invented operational transforms to handle conflicting edits, show multiple cursors, etc. (a minimal sketch follows this list).
- Precursor to modern Google Docs, Office 365 co-authoring, etc.
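To give a flavor of the idea (a deliberately minimal sketch of my own; production OT systems also handle deletes, tie-breaking, and server coordination), consider two users inserting concurrently: each replica shifts the remote operation's position to account for the edit it has already applied, and both copies converge.

```python
def transform_insert(pos, other_pos):
    """Shift an insert at `pos` past a concurrent insert already
    applied at `other_pos`. (Real OT breaks ties at equal positions
    with site IDs; this toy version does not.)"""
    return pos + 1 if other_pos <= pos else pos

def insert(text, pos, ch):
    return text[:pos] + ch + text[pos:]

doc = "HCI"
# User A inserts "!" at 3; user B concurrently inserts "*" at 0.
replica_a = insert(insert(doc, 3, "!"), transform_insert(0, 3), "*")
replica_b = insert(insert(doc, 0, "*"), transform_insert(3, 0), "!")
print(replica_a, replica_b)  # both "*HCI!": the replicas converge
```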
Part IV: The Web and Ubiquitous Computing (1990s–2000s)
1. World Wide Web (early 1990s)
- Tim Berners-Lee at CERN:
- Invented the HTML, HTTP, and URL concepts (a protocol sketch follows this list).
- A global hypertext system on top of the existing internet.
- Netscape Navigator spurred the first widespread web adoption.
- JavaScript (1995) brought dynamic content to web pages.
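To show how thin the original protocol layer is, here is a minimal sketch (mine, not from the lecture) of a raw HTTP GET sent over a plain socket; example.com is a placeholder host, and most modern sites would require HTTPS on port 443 instead.

```python
import socket

HOST = "example.com"  # placeholder host for illustration
request = (
    "GET / HTTP/1.1\r\n"
    f"Host: {HOST}\r\n"
    "Connection: close\r\n"
    "\r\n"
)

with socket.create_connection((HOST, 80)) as sock:
    sock.sendall(request.encode("ascii"))
    response = b""
    while chunk := sock.recv(4096):
        response += chunk

# Status line and headers first, then the HTML document itself.
print(response.decode("utf-8", errors="replace")[:300])
```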
2. 3D Graphics and Gaming (mid-1990s)
- 3D accelerator cards (e.g., 3dfx Voodoo) for PCs.
- Enabled higher-resolution, full-color 3D rendering:
- Quake vs. GLQuake comparison showed huge leaps in visual fidelity.
- Drove gaming industry growth and shaped modern GPU-based computing.
3. Mark Weiser and Ubiquitous Computing (late 1980s–1990s)
- Xerox PARC again at the forefront:
- Mark Weiser coined "Ubiquitous Computing."
- Vision: Many computers per person instead of one person per computer.
- "Tabs, pads, and boards": small wearable devices, tablet-sized devices, and wall-sized displays.
- Idea: Computing should blend seamlessly into daily life and the environment.
- Modern reality:
- Smartphones, smart devices, and the IoT partly fulfill Weiser's vision (though not exactly in the "grab any device" fashion he imagined).
4. Tangible and Interactive Surfaces
- Researchers (e.g., Sony Labs demo shown in the lecture) experimented with:
- Projected surfaces, interactive tables, pen-based interactions.
- Combining physical objects with digital displays.
5. Web 2.0 (2000s)
- Move from "read-only" pages to user-generated content and social platforms:
- Wikipedia, MySpace, YouTube, Facebook, Twitter, etc.
- Cloud-based collaboration:
- Google Docs introduced real-time collaborative editing in a standard web browser.
- Dropbox simplified cross-device file sync.
- Shifted how we view documents, version control, and the "location" of data.
6. Smartphones: The iPhone (2007)
- Apple's iPhone: Full-scale computing in your pocket with multi-touch.
- Direct manipulation with fingers, no stylus required.
- Sensors (accelerometers, GPS) for new interaction paradigms (location-aware apps).
- Led to mobile-first design, responsive web, and an explosion in app ecosystems.
7. Minority Report (Movie, 2002)
- Tom Cruise's iconic gesture-based "air" interface.
- Inspired real research into full-body/gesture-based interaction (e.g., Wii, Xbox Kinect).
8. The iPad (2010)
- Returned to Alan Kay's decades-old Dynabook concept:
- A touchscreen tablet with instant-on computing, day-long battery life.
- âLaptop + smartphoneâ hybrid for content consumption and creation.
Part V: 2010s to Present – Voice, AR/VR, and Beyond
1. Voice Assistants
- Siri (Apple, 2011), Google Assistant, Amazon Alexa (2014), Microsoft Cortana:
- Natural language UIs still challenging for discoverability and feedback.
- Rely heavily on cloud-based speech recognition and AI.
2. Virtual Reality Goes Mainstream
- Oculus Rift Kickstarter campaign (2012); Oculus was acquired by Facebook in 2014.
- Rapid improvements in VR headsets:
- High-resolution displays, inside-out tracking, wireless capability.
- Still a focus on gaming, with potential for training, design, telepresence, etc.
3. Augmented Reality
- From Google Glass (2013) and Microsoft HoloLens (2016) to phone-based AR (e.g., Pokémon Go, 2016).
- Apple Vision Pro (announced 2023) signals next steps in consumer "mixed reality."
- Ongoing challenges:
- Social acceptance, hardware comfort, visual fidelity, latency.
4. Human-Centered AI
- Rise of deep learning and large language models.
- Growing focus on how AI systems interact with humans (e.g., "human-in-the-loop").
- Emphasis on explanation, trust, transparency, and designing interfaces around AI capabilities.
Jonathan Grudin's Five Stages of HCI Evolution
In his paper (early 1990s), Jonathan Grudin described the evolution of computing in terms of the user:
- 1st Stage: Mainframes (1950s)
- Users = Engineers, specialized operators.
- Research methods: Minimal, purely technical/electrical engineering perspective.
- 2nd Stage: Interactive Terminals (1960s)
- Users = Programmers (still technical, but not necessarily hardware engineers).
- Human factors and basic experiments start playing a role.
- 3rd Stage: Individual End Users (1970s–1980s, rise of personal computers)
- Users = Office workers, managers, home hobbyists.
- Cognitive psychology becomes important for interface design.
- 4th Stage: Broader End-User Tools (Late 1980s–1990s)
- Users = Wider public, many roles.
- Social psychology and field studies come into play (groupware, collaborative systems).
- 5th Stage: Group/Collaborative Computing (Late 1990s–2000s)
- Users = Distributed teams, entire organizations or social communities.
- Need anthropology, CSCW (Computer-Supported Cooperative Work), advanced research methods, etc.
Homework/Exercise:
- Grudin's table ends at these 5 stages. Consider adding a 6th or even 7th stage to reflect:
- Modern smartphone era.
- IoT/ubiquitous computing.
- AI-driven interfaces.
- AR/VR for social and remote collaboration.
Q&A and Concluding Points
- Question: Will AR/VR replace smartphones and PCs?
- Professor's View: AR/VR will likely not replace them but add to our ecosystem. Historically, no medium fully kills a previous one (e.g., TV didn't kill radio).
- We see a convergence of devices (spatial computing, wearable AR) that might become significant in 1–2 decades, but PCs and phones will remain.
- Takeaway:
- HCI has deep historical roots.
- Studying old systems can inspire new ideas: many groundbreaking concepts get "lost" and can be rediscovered.
- Future developments will blend human-centered design with the power of AI and more immersive, ubiquitous hardware.
This post is licensed under CC BY 4.0 by the author.