Case Study
Unlike what the movies portray, defense-system displays are often dated in presentation and technology, running on older computers with low-resolution screens. These displays are viewed by teams who track everything from satellites to missiles, watching the screens for long stretches at a time (leading to visual fatigue). The displays were designed not only before modern visual design standards, but also without attention to critical concerns such as contrast, accessibility, and information hierarchy.
This project also posed obstacles to current industry practices for design and usability testing: user internet access was restricted (the work sites are not wired), technology was constrained (users could not do video interviews or online tests), and reviews were limited to high-level stakeholders who had email access during phone conference calls.
The Process
I was hired to support the design and visual upgrades of military display screens. This involved working closely with the development team to identify and create design patterns, produce low- to high-fidelity wireframes, and engage with stakeholders to establish testing scenarios and create surveys for the military teams using this technology.
Although I'm used to working across wildly varying industries, clients, and projects, I had never worked on an engagement focused on defense systems technologies. To get up to speed, I spent my first weeks immersed in industry research on space technology. I learned everything I could about satellite tracking systems, LEO / MEO / HEO (Low, Medium, and High Earth Orbit), technology contact points, contrast ratios, and the military teams that track and follow satellite defense systems.
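Contrast ratios, one of the concerns above, have a precise definition in the WCAG 2.x guidelines. As a rough illustration (a hypothetical helper, not code from this project), the ratio between two sRGB colors can be computed like this:

```python
# Sketch of the WCAG 2.x contrast-ratio formula. Function names are
# illustrative; this is not taken from the project's codebase.

def _linearize(channel: int) -> float:
    """Convert an 8-bit sRGB channel to linear light."""
    c = channel / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple) -> float:
    """Relative luminance per WCAG 2.x (0.0 = black, 1.0 = white)."""
    r, g, b = (_linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(color_a: tuple, color_b: tuple) -> float:
    """Contrast ratio between two colors; ranges from 1:1 to 21:1."""
    lighter = max(relative_luminance(color_a), relative_luminance(color_b))
    darker = min(relative_luminance(color_a), relative_luminance(color_b))
    return (lighter + 0.05) / (darker + 0.05)

# White text on a black background gives the maximum ratio of 21:1;
# WCAG AA requires at least 4.5:1 for normal body text.
print(round(contrast_ratio((255, 255, 255), (0, 0, 0)), 1))  # 21.0
```

Checks like this matter for displays watched over long shifts, where low contrast compounds visual fatigue.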
I researched low-tech methods of conducting usability testing with teams, and explored options for gathering research and data when the user groups didn't have access to wireless technology (and security requirements prohibited me from being onsite).
Finally, I worked closely and collaboratively with the development team to identify the patterns users needed while tracking linear technology systems, including performing remote systems checks and alerting teams to potential anomalies. Through this process, I contributed to a robust Design System Library that was ultimately handed over to the client teams.
Highlights Include
- Identifying requirements
- Comprehensive content audit
- Stakeholder interviews
- IHA interviews
- Competitive analysis audit
- Workshops for business stakeholders
- Collaboration with the development team
- Wireframes, documentation, and user flows
- Presentations to product owners
- InVision click-through prototypes
Methodologies
- Content Audit
- Data Gathering
- Distributed Team
- Industry Research
- Remote User Surveys
- Stakeholder Meetings
- Visual Designs
- Wireframes
Tools
- Abstract
- Adobe Creative Suite
- InVision
- Sketch
- Typeform
Results
Exploring multiple survey modalities allowed the team to keep receiving user feedback while iterating on the mission-critical design path.