Designing Across Senses: A Multimodal Approach to Product Design


Today we have the ability to connect speech, touch, haptic, and gestural interfaces into products that engage several human senses at once. This practical book explores examples from current designers and devices to describe how these products blend multiple interface modes together into a cohesive user experience.

Authors Christine Park and John Alderman explain the basic principles behind multimodal interaction and introduce the tools you need to root your design in the ways our senses shape experience. This book also includes guides on process, design, and deliverables to help your team get started.

The book covers several topics within multimodal design, including:

  • New Human Factors: learn how human sensory abilities allow us to interact with technology and the physical world
  • New Technologies: explore some of the technologies that enable multimodal interactions, products, and capabilities
  • Multimodal Products: examine different categories of products and learn how they deliver sensory-rich experiences
  • Multimodal Design: learn processes and methodologies for multimodal product design, development, and release

Author(s): Christine W. Park, John Alderman
Edition: 1
Publisher: O'Reilly Media
Year: 2018

Language: English
Commentary: Revision history for the first edition: 2018-03-07, first release. The Library of Congress record uses two DDC numbers, 658.5752 and 005.4/37, in that order (LCCN 2018276453). UDC permits combined class numbers, so a combination was used there.
Pages: 296
City: Sebastopol, CA
Tags: Industrial Design; Computer-Aided Design; Senses; Sensation; Multimodal User Interfaces; Computer Systems; Product Design

[Preface]
Chapter 1: Returning to Our Senses
If a Tree Falls in the Forest…
The Sound of Violence
Experience Is Physical
People Have Modalities
Devices Have Modes
Human Modalities + Device Modes = Interfaces
Physical Information: The New Data
Sensing, Understanding, Deciding, and Acting: The New Human Factors
Focus: The New Engagement
Multimodality Makes a Wider Range of Human and Product Behaviors Possible
How Multimodality Affects Design
Multimodal Design Is Cross-Disciplinary
Summary
Chapter 2: The Structure of Multimodal Experiences
The Human Slice of Reality: Umwelt and Sensibility
Assembling Multimodal Experiences: Schemas and Models
The Building Blocks of Multimodal Experience
Summary
Chapter 3: Sensing
The Three Main Categories of Stimuli
Defining the Senses: Dimension, Resolution, and Range
Sensory Focus: Selecting, Filtering, and Prioritizing Information
Reflexes
Our Senses and Their Unique Properties
Vision
Hearing
Auditory Interfaces
Touch (Somatosensory or Tactile Abilities)
Smell (Olfactory Ability)
Taste (Gustatory Ability)
Sixth Senses and More
Summary
Chapter 4: Understanding and Deciding
The Foundations of Understanding: Recognition, Knowledge, Skills, and Narratives
Aware and Non-Aware: Fast and Slow Thinking
Agency: Balancing Self-Control and Problem Solving
Motivation, Delight, Learning, and Reward: Creating Happiness
Summary
Chapter 5: Acting
About Anthropometrics
Task Performance
Nonverbal Communication
Precision Versus Strength
Inferring Versus Designating Intent
Summary
Chapter 6: Modalities and Multimodalities
Modalities: How We Use Our Senses
We Shape Our Modalities, and They Shape Us
Attributes and Abilities of Modalities
Applying Modalities to Design
Multimodalities
Trusted Version and Performance Optimization
Attributes and Abilities of Multimodalities
Common Categories of Multimodalities
Applying Multimodality to Design
Summary
Chapter 7: The Opportunity: Transforming Existing Products and Developing New Ones
Key Applications of IoT: Monitor, Analyze and Decide, Control and Respond
“Disruptive” Technologies
Beginning Inquiry
Summary
Chapter 8: The Elements of Multimodal Design
Using Physical Information
Constructing Knowledge, Interactions, and Narratives
Summary
Chapter 9: Modeling Modalities and Mapping User Experiences
Behaviors Shared Between Users and Devices
Demanding Contexts and Complex Interactions Call for Alignment
Experience Maps and Models for Multimodality
Key Considerations of Multimodal Design
Example Maps and Models
Summary
Chapter 10: Form Factors and Configurations
Creating Multimodal Properties
Configuring Interface Modes
Mapping Modal Behaviors to Modal Technologies
Summary
Chapter 11: Ecosystems
Device Ecosystems
Information Ecosystems
Physical Ecosystems
Social Ecosystems
Specialized Ecosystems
Cloud Architectures: Distributing Resources Through Connectivity
Ecosystem and Architecture: Applying Ecosystem Resources to Multimodal Design
Sensing Experiences: Answering the Door—A Doorbell, Ring, and the August Lock
Understanding and Deciding Experiences: Determining Distance—A Pedometer, Apple Watch, and Lyft App
Acting Experiences: Writing and Drawing—A Pencil, a Tablet, and the Apple Pencil
Summary
Chapter 12: Specifying Modalities: States, Flows, Systems, and Prototypes
Introduction: A Prototype Is a Custom Measuring Tool
Practice Makes Perfect
The Media of Multimodal Products: Information and Interactions, Understandings and Behaviors
The Product Development Process for Multimodal Products
Defining Design Requirements
Specifying Multimodalities
Summary
Chapter 13: Releasing Multimodal Products: Validation and Learning
Release Is About Process
Alpha Release
Beta Release
Public Release
The Out-of-the-Box Experience
Summary
Further Reading
Glossary
[Index]