Tim Cornelissen
Experience Prototyper @ Apple
Experimental Psychology PhD
Creative Technologist
I'm an experience prototyper (sometimes called a creative technologist). I build experiences to test whether they are worth fully developing. My projects are inspired by new technologies and real-world problems. Along the way I (roughly) specify what a technology needs to be capable of in order to create a desirable user experience. "New technology" covers a broad spectrum for me: it can be hardware, software (like AI), or both. Since early 2018 I have worked on the future of all things display and optics at Apple in Cupertino.
I'm not a traditional engineer. My PhD is in Experimental Psychology. That background taught me to think in a systematic, scientific way, gave me useful data analysis skills, and makes it easy to talk to user research teams.
My technical skills are partly self-taught, but mostly learned from the talented people I've worked with over the past 10+ years. You could say I'm an expert-generalist. I usually know where to start and I learn the things required to build what's needed.
Apple Inc.
Cupertino, CA, USA
Scene Grammar Lab, Goethe University
Frankfurt, Germany
Humanities Laboratory, Lund University
Lund, Sweden
Utrecht University
Utrecht, The Netherlands
Cornelissen, T., Sassenhagen, J., & Võ, M. L.-H. (2019). Improving free-viewing fixation-related EEG potentials with continuous-time regression. Journal of Neuroscience Methods.
Niehorster, D.*, Cornelissen, T.*, Holmqvist, K., & Hooge, I. (2018). Searching with and against each other: spatiotemporal coordination of visual search behavior in collaborative and competitive settings. Attention, Perception, & Psychophysics.
Cornelissen, T., & Võ, M. L.-H. (2016). Stuck on semantics: Processing of irrelevant object-scene inconsistencies modulates ongoing gaze behavior. Attention, Perception, & Psychophysics.
* Authors contributed equally to the paper.
Lauer, T., Cornelissen, T.H.W., Draschkow, D., Willenbockel, V., & Võ, M. L.-H. (2018). The role of scene summary statistics in object recognition. Scientific Reports.
Hessels, R.S., Benjamins, J.S., Cornelissen, T.H.W., & Hooge, I.T.C. (2018). A validation of automatically-generated Areas-of-Interest in videos of a face for eye-tracking research. Frontiers in Psychology.
Hessels, R.S., Holleman, G.A., Cornelissen, T.H.W., Hooge, I.T.C., & Kemner, C. (2018). Eye contact takes two: autistic and social anxiety traits predict gaze behavior in dyadic interaction. Journal of Experimental Psychopathology.
Niehorster, D., Cornelissen, T., Holmqvist, K., Hooge, I., & Hessels, R. (2017). What to expect from your remote eye-tracker when participants are unrestrained. Behavior Research Methods.
Hessels, R.S., Cornelissen, T.H.W., Hooge, I.T.C., & Kemner, C. (2017). Gaze behavior to faces during dyadic interaction. Canadian Journal of Experimental Psychology.
Nyström, M., Niehorster, D., Cornelissen, T., & Garde, H. (2016). Real-time sharing of gaze data between multiple eye trackers: evaluation, tools, and advice. Behavior Research Methods.
Hooge, I., Nyström, M., Cornelissen, T., & Holmqvist, K. (2015). The art of braking: post saccadic oscillations in the eye tracker signal decrease with increasing saccade size. Vision Research.
Hessels, R.S., Cornelissen, T.H.W., Kemner, C., & Hooge, I.T.C. (2014). Qualitative tests of remote eyetracker recovery and performance during head rotation. Behavior Research Methods.
Dalmaijer, E.S., Van der Stigchel, S., Nijboer, T.C.W., Cornelissen, T.H.W., & Husain, M. (2014). CancellationTools: All-in-one software for administration and analysis of cancellation tasks. Behavior Research Methods.
Below are some projects I worked on that are already public, along with some patents I can share. I'd really love to tell you more, but most of what happens inside Apple Park stays inside for a few more years.
Screen Distance can help children engage in healthy viewing habits that can lower their risk of myopia, and can give people of all ages the opportunity to reduce digital eyestrain.
Viewing something like a device or book too closely for an extended period of time can increase eye strain and the risk of myopia. The Screen Distance feature detects when you hold your iPhone closer than 12 inches for an extended period, and encourages you to move it farther away.
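The core logic described above (a distance threshold held for an extended period) can be sketched as a small state machine. This is purely illustrative, not Apple's implementation: the sampling model and the 30-second "extended period" are my assumptions; only the 12-inch threshold comes from the public description.

```python
# Hypothetical sketch of a "too close for too long" detector.
# The 12-inch threshold matches the public description of Screen Distance;
# the trigger duration and sampling model are invented for illustration.

THRESHOLD_INCHES = 12.0
TRIGGER_SECONDS = 30.0  # assumed "extended period"

class ScreenDistanceMonitor:
    def __init__(self, threshold=THRESHOLD_INCHES, trigger=TRIGGER_SECONDS):
        self.threshold = threshold
        self.trigger = trigger
        self.too_close_since = None  # timestamp when the device first got too close

    def update(self, distance_inches, timestamp):
        """Feed one distance sample; return True when the alert should fire."""
        if distance_inches < self.threshold:
            if self.too_close_since is None:
                self.too_close_since = timestamp
            return timestamp - self.too_close_since >= self.trigger
        self.too_close_since = None  # reset as soon as the user backs off
        return False
```

Resetting the timer the moment the user moves the device away keeps the feature forgiving: only sustained close viewing triggers the nudge.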
I was deeply involved in guiding this project from its earliest stages. My tasks included:
protoyping initial experiences, showing technical feasibility, setting specifications and requirements,
and advocating for the feature. Of course nothing like this happens without very talented
clinicians, scientists, and developers around you.
"You navigate by simply using your eyes, hands, and voice"
Developed an eye movement classification algorithm that contributes to all Vision Pro interactions.
Provided eye movement expertise for model architecture and data labeling instructions. Created metrics to evaluate ML models against ground truth.
Collaborated with vision scientists, machine learning experts, and developers to make this a reality.
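For readers unfamiliar with eye movement classification: the textbook baseline in the eye-tracking literature is the velocity-threshold (I-VT) classifier, which labels fast gaze motion as saccades and slow motion as fixations. The sketch below shows that classic algorithm only; it says nothing about the Vision Pro algorithm itself, and the 30 deg/s threshold and sample format are common illustrative choices, not Apple's.

```python
# Classic I-VT (velocity-threshold) eye movement classification.
# Illustrative only; unrelated to any Apple implementation.

def classify_ivt(positions, timestamps, velocity_threshold=30.0):
    """Label each inter-sample interval as 'fixation' or 'saccade'.

    positions: 1-D gaze angles in degrees; timestamps in seconds.
    velocity_threshold: deg/s; intervals moving faster count as saccades.
    """
    labels = []
    for i in range(1, len(positions)):
        dt = timestamps[i] - timestamps[i - 1]
        velocity = abs(positions[i] - positions[i - 1]) / dt
        labels.append("saccade" if velocity > velocity_threshold else "fixation")
    return labels
```

Real gaze data is noisy, so production classifiers are far more sophisticated (filtering, 2-D gaze, adaptive thresholds, or learned models), but I-VT captures the basic idea of separating fixations from saccades by speed.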
A macOS app that takes an audiogram or the user's age and outputs optimized parametric EQ filters to compensate for their hearing loss, ready to copy into software like Roon or a hardware equalizer.
Audiophiles spend endless amounts of time and money optimizing their gear: better equipment, room compensation software, acoustic treatment, etc. Rarely do they consider the last bit of the signal chain: their ears.
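The basic idea of mapping an audiogram to EQ gains can be sketched with the simple "half-gain rule" from classical hearing-aid fitting (boost each band by half the measured loss, capped to stay sane). This is a toy illustration, not the app's actual fitting method, and the example audiogram values are invented.

```python
# Illustrative audiogram-to-EQ mapping using the classical half-gain rule.
# The real app's fitting method and filter design may differ entirely.

def audiogram_to_eq(audiogram_db_hl, max_boost_db=12.0):
    """Map hearing-loss thresholds (dB HL per frequency) to EQ boosts.

    Returns {frequency_hz: gain_db}, capped so the EQ stays listenable.
    """
    return {
        freq: min(loss / 2.0, max_boost_db)
        for freq, loss in audiogram_db_hl.items()
    }

# Example: mild high-frequency loss (values made up for illustration)
eq = audiogram_to_eq({250: 5, 1000: 10, 4000: 30, 8000: 40})
```

Each resulting frequency/gain pair would then become one band of a parametric EQ; capping the boost avoids clipping and unnaturally loud output at severely affected frequencies.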
An always-on display for Roon's audio streaming platform. Perfect for a Raspberry Pi with a touchscreen or a spare iPad. The UI is designed to expose fine controls on touch, but turn into a pretty, easy-to-read information display when left alone.
Roon is a great piece of music software for the home. Still, I don't love getting out my phone (with all its distractions) to skip songs or to check what song is playing.
This app has two parts: a Java backend that looks for your Roon core on the network and interacts with it, and a front-end, which is basically a webpage you can point a browser or a kiosk app to. The two talk over WebSockets.
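The backend/front-end link implies some JSON command protocol over the WebSocket. The real backend is Java and its message schema isn't shown here, so this Python sketch just illustrates the dispatch pattern; the command names and state fields are invented.

```python
import json

# Sketch of a JSON command dispatcher like the one a Roon-display
# backend would need. Command names and state shape are hypothetical.

def handle_message(raw, player_state):
    """Apply one front-end command (a JSON string) to the player state."""
    msg = json.loads(raw)
    command = msg.get("command")
    if command == "play_pause":
        player_state["playing"] = not player_state["playing"]
    elif command == "next_track":
        player_state["track_index"] += 1
    # Echo the updated state back so the display can re-render.
    return json.dumps(player_state)
```

In the real app, the backend would translate such commands into calls against the Roon core's transport API and push state updates to the display whenever playback changes.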