toshikurauchi/README.md

Hi! I'm toshikurauchi or, according to my ID, Andrew Toshiaki Nakayama Kurauchi.

LinkedIn

I am an Assistant Professor in Computer Science and Engineering at Insper, with a primary focus on teaching. I am passionate about building interactive applications and tools, especially in assistive technology and teaching contexts. I particularly enjoy working with eye trackers and gaze-based interaction.

What I've built for research

These are some of my research projects:

CameraMouseSuite [cross-platform version]

Qt implementation of Camera Mouse Suite, a mouse-replacement interface that allows users to control the mouse pointer using body movements (e.g. head) captured by a webcam. As the user moves their head (or other body part being tracked by the camera), the mouse pointer replicates their movement. Clicks are performed with dwell time (keeping the mouse pointer still for a certain amount of time).
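The dwell-click idea above can be sketched as a small detector that fires when the pointer stays inside a small radius long enough. This is a minimal illustration, not the actual Camera Mouse Suite implementation; the radius and dwell-time values are illustrative parameters.

```python
import math

class DwellClickDetector:
    """Fires a click when the pointer stays within a small radius
    for a minimum dwell time (values here are illustrative)."""

    def __init__(self, radius_px=15.0, dwell_time_s=1.0):
        self.radius_px = radius_px
        self.dwell_time_s = dwell_time_s
        self.anchor = None       # (x, y) where the current dwell started
        self.anchor_time = None  # timestamp of the dwell start

    def update(self, x, y, t):
        """Feed pointer samples; returns True when a dwell click fires."""
        if self.anchor is None or math.dist(self.anchor, (x, y)) > self.radius_px:
            # Pointer moved too far: restart the dwell at the new position.
            self.anchor = (x, y)
            self.anchor_time = t
            return False
        if t - self.anchor_time >= self.dwell_time_s:
            self.anchor = None  # reset so the click does not repeat
            return True
        return False
```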


Haytham Linux

Cross-platform mobile gaze tracking software based on Haytham by Diako Mardanbegi. The mobile eye tracker must be built with at least two cameras: one to capture the scene and the other to capture the eye image. An infrared light source must be attached to the eye camera, and an infrared filter (a piece of exposed film will do) must be added as well.


EyeSwipe

Gaze-based text entry method that uses gaze gestures to type words instead of typing letter by letter with dwell-time. The initial and final letters of the word are indicated by performing an eye gesture called "reverse crossing", in which the user looks at a button displayed above the key and then looks back at the key to finish the selection.
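The "reverse crossing" selection described above can be modeled as a tiny state machine driven by fixated regions: key, then the button above it, then back to the key. This is a hedged sketch of the gesture logic only; the region names and the per-fixation interface are assumptions, not the EyeSwipe code.

```python
class ReverseCrossingDetector:
    """Minimal state machine for the 'reverse crossing' gesture:
    fixate a key, cross to the button shown above it, then return
    to the key to confirm. Call on_fixation once per region change."""

    def __init__(self):
        self.state = "idle"
        self.pending_key = None

    def on_fixation(self, region):
        """region is 'key:<letter>', 'button:<letter>', or 'elsewhere'."""
        kind, _, name = region.partition(":")
        if self.state == "idle" and kind == "key":
            self.state, self.pending_key = "on_key", name
        elif self.state == "on_key":
            if kind == "button" and name == self.pending_key:
                self.state = "crossed"       # crossed to the button
            elif kind == "key":
                self.pending_key = name      # moved to a different key
            else:
                self.state, self.pending_key = "idle", None
        elif self.state == "crossed":
            # Returning to the same key confirms; anything else aborts.
            selected = name if kind == "key" and name == self.pending_key else None
            self.state, self.pending_key = "idle", None
            return selected
        return None
```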


Swipe&Switch

An evolution of EyeSwipe that uses context switching between regions as the selection method. There are three regions: the Text, Action, and Gesture regions. To type a word, the user looks at the Gesture region, glances over the letters that form the desired word, and then moves their gaze to either the Text or the Action region.
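A toy model of that region-switching flow: letters glanced in the Gesture region are buffered, and moving the gaze into the Text region commits them. This sketches only the region logic under assumed region names; the real system decodes the gaze path against a dictionary rather than taking the glanced letters verbatim.

```python
class SwipeSwitchTyping:
    """Buffers letters glanced in the Gesture region; switching the
    gaze to the Text region commits the buffer as a word candidate."""

    def __init__(self):
        self.path = []   # letters glanced during the current gesture
        self.text = []   # committed words

    def on_gaze(self, region, letter=None):
        if region == "gesture" and letter:
            if not self.path or self.path[-1] != letter:
                self.path.append(letter)     # record each new letter once
        elif region == "text" and self.path:
            self.text.append("".join(self.path))  # commit the gesture
            self.path = []
        elif region == "action":
            self.path = []                   # cancel the current gesture
        return list(self.text)
```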


HMAGIC: Head Movement and Gaze Input Cascaded Pointing

Head Movement And Gaze Input Cascaded (HMAGIC) pointing is a technique that combines head movement and gaze-based inputs in a fast and accurate mouse-replacement interface. The interface initially places the pointer at the estimated gaze position and then the user makes fine adjustments with their head movements.
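The cascade above (coarse jump to the gaze estimate, fine adjustment by head motion) can be sketched as follows. The jump threshold and the warp-only-on-large-moves rule are illustrative assumptions, not the published HMAGIC parameters.

```python
import math

class HmagicPointer:
    """Pointer warps to the gaze estimate for large target changes,
    then relative head movement makes the fine adjustments."""

    def __init__(self, jump_threshold_px=150.0):
        self.jump_threshold_px = jump_threshold_px
        self.x = self.y = 0.0

    def update(self, gaze, head_delta):
        gx, gy = gaze
        dx, dy = head_delta
        # Warp to the gaze estimate only when the target is far away;
        # otherwise let head movement do the precise positioning.
        if math.dist((self.x, self.y), (gx, gy)) > self.jump_threshold_px:
            self.x, self.y = gx, gy
        self.x += dx
        self.y += dy
        return self.x, self.y
```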


Heatmap Explorer

An interactive gaze data visualization tool for the evaluation of computer interfaces. Heatmap Explorer allows the experimenter to control the visualization by selecting temporal intervals and adjusting filter parameters of the eye movement classification algorithm.
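The temporal-interval selection can be illustrated by binning only the gaze samples inside a chosen time window into a coarse grid. This is a minimal sketch; the actual tool renders smoothed heatmaps and also filters by the eye movement classification, which is omitted here.

```python
def gaze_heatmap(samples, t_start, t_end, grid=(8, 8), size=(800, 600)):
    """Bin gaze samples (t, x, y) inside [t_start, t_end] into a
    rows x cols count grid covering a size-(w, h) screen."""
    cols, rows = grid
    w, h = size
    heat = [[0] * cols for _ in range(rows)]
    for t, x, y in samples:
        # Keep only samples in the selected interval and on screen.
        if t_start <= t <= t_end and 0 <= x < w and 0 <= y < h:
            heat[int(y * rows / h)][int(x * cols / w)] += 1
    return heat
```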

What I've built for teaching

Here's some stuff I've built for the courses I teach:

Popular repositories

  1. MaskedEditText (Java, public archive): Android EditText with customizable input mask. 151 stars, 54 forks.

  2. tobii_stream_py (C): Cython wrapper for the Tobii Stream API. 5 stars.

  3. CameraMouseSuite-cross-platform (C++): CameraMouseSuite cross-platform version, implemented with Qt and OpenCV. 4 stars, 2 forks.

  4. servidor-de-desafios-frontend (JavaScript). 4 stars, 1 fork.

  5. AnimationTest (Java): Animations test on Android. 3 stars, 3 forks.

  6. SplineBasedFiberTracking (C). 2 stars.