Thursday, September 12, 2013
7:00 PM
EyeVerify seeks our help in understanding how best to encourage users to hold a smartphone about 8 inches from their face and then glance left and right. During this process, EyeVerify captures a video stream and pulls frames from it. On some newer phones EyeVerify can work with the front-facing camera (poorer quality), but on most phones it needs the back-facing camera (higher quality). EyeVerify would like to explore how sound, haptic (e.g., vibration), and/or visual prompts (when the user can see the screen) can guide users. For example, Everyday – http://everyday-app.com – is a self-portrait app that aligns a user's face via visual cues: a vertical line (for the nose) and horizontal lines (for the eyes and mouth) – see the screenshots below.
To get a feel for the EyeVerify app, please review the following videos and articles:
* EyeVerify Demo – http://vimeo.com/53565077
* How It All Works – http://eyeverify.com/how-it-works.php
* Eye Verification – http://eyeverify.com/how-it-works.php?DId=1
* CNET Video – “Eyeprints Eye-Scanning Security Tech Exposed in Video”
http://cnettv.cnet.com/eyeprints-eye-scanning-security-tech-exposed-video/9742-1_53-50141847.html
* Huffington Post – “EyeVerify’s Vein Popping Password Technology: Interview With CEO Toby Rush”
http://www.huffingtonpost.com/james-grundvig/eye-veins-password-technology_b_2564615.html
* MIT Technology Review – “Instead of a Password, Security Software Just Checks Your Eyes”
http://www.technologyreview.com/news/507901/instead-of-a-password-security-software-just-checks-your-eyes
EyeVerify understands that a key to its success is getting the user experience right, both for the initial training period and for ongoing use. Most UX folks EyeVerify has talked to understand screen layout and flow – few have had to encourage users to hold the phone a certain way and perform an action. This is where IxDA KC can help.
Assuming a 2-megapixel camera (or better), EyeVerify's app can detect:
* When no, one, or both eyes are in view
* When the phone is too far away
* When you glance, and which way you glance
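To make the brainstorming concrete, here is a minimal sketch of how these detection states could drive multimodal prompts. All names (`CaptureState`, `choose_prompt`, the cue labels) are hypothetical assumptions for discussion, not EyeVerify's actual API; sound and haptic cues matter most when the back camera is in use and the user cannot see the screen.

```python
# Hypothetical sketch: map EyeVerify-style detection states to user prompts.
# Every name here is illustrative, not the real EyeVerify API.

from dataclasses import dataclass


@dataclass
class CaptureState:
    eyes_in_view: int  # 0, 1, or 2 eyes detected in the frame
    too_far: bool      # phone held farther than roughly 8 inches
    glance: str        # "none", "left", or "right"


def choose_prompt(state: CaptureState) -> dict:
    """Return a multimodal cue: on-screen text for front-camera use,
    plus sound/haptic hints for back-camera use (screen not visible)."""
    if state.eyes_in_view == 0:
        return {"text": "Center your eyes in the frame",
                "sound": "low_tone", "haptic": "double_buzz"}
    if state.too_far:
        return {"text": "Move the phone closer (about 8 inches)",
                "sound": "rising_tone", "haptic": "long_buzz"}
    if state.eyes_in_view == 1:
        return {"text": "Both eyes should be visible",
                "sound": "low_tone", "haptic": "single_buzz"}
    if state.glance == "none":
        return {"text": "Glance left, then right",
                "sound": "chime", "haptic": "none"}
    return {"text": "Hold steady", "sound": "success_tone", "haptic": "none"}


# Example: one eye visible at the right distance, no glance yet.
print(choose_prompt(CaptureState(eyes_in_view=1, too_far=False, glance="none")))
```

The ordering of the checks encodes one possible priority: fix framing first, then distance, then prompt the glance – a sequencing question the groups may want to debate.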
Ultimately, we need to help users by expanding the EyeVerify interface to provide an optimal user experience – ideally by leveraging cues that work with the back and/or front camera(s).
Please review the links above before we meet – after a brief hands-on app demo, we'll break into groups of 3-5 to brainstorm and then reconvene to present ideas to EyeVerify's CEO.
* 10 min – intros
* 30 min – overview
* 30 min – problem-solving session
* 15 min – flex time?
* 5 min × 3 – presentations (assuming 3 groups)
* 10 min – final Q&A
September 12th: IxDAkc Meeting about EyeVerify in Kansas City