Thursday, September 12, 2013
6:30 PM
The EyeVerify biometrics app leverages a phone's native camera to image and pattern-match the veins in the whites of your eyes, authorizing secure, convenient mobile transactions. EyeVerify seeks our help in understanding how best to encourage users to center their eyes on the screen, hold the device about 8 inches away, and glance left and right. During this process the app captures video and pulls frames from it. With a 2+ megapixel camera, EyeVerify can detect whether no, one, or both eyes are in view, how far away the device is, and when and where you glance.
While EyeVerify works with a front-facing camera (poorer quality) on some newer phones, for most phones, EyeVerify needs to use the back-facing camera (higher quality). EyeVerify would like to explore how to use sound, haptic (e.g., vibration), and/or visual prompts (when the user can see the screen) to guide users. For a comparable example, Everyday – http://everyday-app.com – is a self-portrait app that aligns a user's face via visual cues of vertical (for nose) and horizontal (for eyes and mouth) lines (see screenshots below).
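The guidance problem above can be thought of as mapping per-frame detection state to a cue the user can perceive even when the screen is hidden. The sketch below is purely illustrative: the function name, cue labels, and thresholds are hypothetical assumptions, not EyeVerify's actual API or logic.

```python
# Hypothetical sketch of cue selection for eye-alignment guidance.
# All names and thresholds here are illustrative assumptions, not
# EyeVerify's implementation.

def guidance_cue(eyes_visible: int, distance_in: float, screen_visible: bool) -> str:
    """Map a frame's detection state to a sound/haptic/visual prompt."""
    TARGET_IN, TOLERANCE_IN = 8.0, 1.5   # "about 8 inches away"
    if eyes_visible == 0:
        return "vibrate-long"            # haptic: no eyes in view at all
    if eyes_visible == 1:
        # Visual cue (e.g., Everyday-style alignment lines) only works
        # when the user can actually see the screen; otherwise fall
        # back to sound.
        return "show-center-lines" if screen_visible else "beep-twice"
    if abs(distance_in - TARGET_IN) > TOLERANCE_IN:
        return "move-closer" if distance_in > TARGET_IN else "move-away"
    return "glance-left"                 # aligned: prompt the glance gesture
```

A design choice worth debating in the brainstorm: when the back-facing camera is used, `screen_visible` is typically false, so the sound and haptic branches carry most of the guidance load.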
To get a feel for the EyeVerify app, please review these links:
• EyeVerify Demo – http://vimeo.com/53565077
• How It All Works – http://eyeverify.com/how-it-works.php
• Eye Verification – http://eyeverify.com/how-it-works.php?DId=1
• Eyeprints Eye-Scanning Security Tech Exposed in Video – http://cnettv.cnet.com/eyeprints-eye-scanning-security-tech-exposed-video/9742-1_53-50141847.html
• EyeVerify’s Vein-Popping Password Tech: Interview with CEO Toby Rush – http://www.huffingtonpost.com/james-grundvig/eye-veins-password-technology_b_2564615.html
• MIT: Instead of a Password, Security Software Checks Your Eyes – http://www.technologyreview.com/news/507901/instead-of-a-password-security-software-just-checks-your-eyes

EyeVerify understands that a huge key to product success is getting the UX right for both first-time and ongoing use. Ultimately, users can be helped by expanding the app interface – ideally by leveraging clear cues. Most UX professionals EyeVerify has talked to understand screen layout and flow – few have had to encourage users to hold a phone a certain way and perform an action. This is where we can help (and gain valuable biometrics experience).
Please review the links – after a hands-on app demo, we'll brainstorm in small groups and present ideas to EyeVerify.
• 10 minutes: welcome & overview
• 30 minutes: interactive UX/UI group problem-solving
• 35 minutes: group presentations (about 5 minutes for each group to present its ideas)
• 30 minutes: usability presentation of EyeVerify app
• 15 minutes: biometrics Q&A
September 12th: Kansas City IxDA Meeting at H&R Block