Recently, I have been exploring potential collaborations with my colleagues Dr Chao Wang and Dr Farzin Deravi at Kent in the area of bio-photonics. At the same time, a friend of mine took up a job as manager of the Centre for Nano- and Biophotonics at Ghent University. He pointed out to me that there is a travel grant scheme to encourage Kent-Ghent collaboration. One thing led to another, and we have now been awarded a small travel grant to explore how machine learning techniques can be used to analyse cell images.
The proposed project aims to integrate machine-learning-based feature extraction with high-throughput cell imaging technologies for high-precision, label-free cell screening. The project will bring together complementary expertise and allow both the Kent and Ghent teams to explore new research opportunities in the interdisciplinary fields of biophotonics, big data photonics and computational photonics. It is anticipated that this collaborative project will lead to a long-term collaboration between the groups, underpinned by Kent's positioning as the UK's European university.
I realise this is not the usual kind of research I do, but I am pretty excited to learn something new and potentially fun!
A psychology colleague, Dr Mario Weick, and I have just been awarded a small grant (~£5,000) from the Faculty of Social Sciences to carry out some pilot work using virtual reality (VR) to elicit human emotions. Specifically, we aim to develop a repository of VR stimuli that researchers can use to elicit different affective experiences (feelings and emotions) in people. Currently, the main source of such stimuli is the International Affective Picture System (IAPS). We believe that 360-degree VR videos could be a more effective way to elicit emotions.
This cross-School project will develop 360-degree videos designed and validated to elicit affective experiences, covering all four quadrants shown in the following image.
Currently, I have a visiting researcher from Malaysia (UiTM), Dr Emma Nuraihan Mior Ibrahim, who is interested in using 360-degree VR video to elicit inter-group empathic feelings.
More information to follow as the studies in VR progress.
Recently, there has been a lot of buzz in the tech world about a decades-old technology: virtual reality (VR). Big players like Facebook, Google and Samsung are trying to convince us that VR is finally ready for the masses. But is it? I have a few PhD/MSc students wanting to do their research on VR and healthcare, so I have been exploring a few popular options, including the Oculus Rift (developer versions 1 & 2), Google Cardboard and the Samsung Gear VR. So how good are they?
Let’s start with Oculus. Oculus Dev ver 1 was good for what it was – one of the first low-cost consumer VR headsets. The resolution was not great, and one could literally count the pixels on screen. It also had high latency, meaning there was a noticeable delay between controller input (e.g. rotating your head) and the screen update. This results in dizziness, especially in VR experiences with fast motion.
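To make the latency point concrete, here is a tiny sketch of how one might quantify "motion-to-photon" latency from logged timestamps. This is purely illustrative: the function name, the pairing of input/display timestamps, and every number below are invented for the example, not measurements of any actual headset.

```python
# Hypothetical illustration: estimating motion-to-photon latency from
# paired timestamps - when a head movement was registered vs. when the
# frame reflecting it appeared on screen. All numbers are made up.

def motion_to_photon_latency_ms(input_times, display_times):
    """Average delay (in milliseconds) between each tracked head
    movement and the frame that reflects it."""
    delays = [(d - i) * 1000.0 for i, d in zip(input_times, display_times)]
    return sum(delays) / len(delays)

# Three head movements (seconds) and when their frames were displayed.
inputs = [0.000, 0.016, 0.033]
displays = [0.060, 0.075, 0.095]
print(motion_to_photon_latency_ms(inputs, displays))  # roughly 60 ms
```

Anything much above ~20 ms of this kind of delay is commonly blamed for the dizziness described above.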
Oculus Dev ver 2 is a significant improvement. The resolution is much better, and they seem to have got the latency down to an acceptable level. Even when playing a relatively fast-paced game, I didn’t feel dizzy at all. However, the setup is a pain, with lots of cables and an external head tracker. Like ver 1, the headset needs to be attached to a desktop/laptop.
Google Cardboard is probably the easiest one to get into. You only need a smartphone (iOS or Android) and the cheap-looking cardboard headset (costing around £4-£15). It works quite well, but it is not as immersive as the Oculus or Samsung Gear VR. It is a good option if you want to experience what the VR hype is about without shelling out big money.
I just received a Samsung Gear VR (see figure) yesterday, and have been experimenting with it. I am quite impressed by the ergonomics and ease of use. Samsung really got almost everything right for a mobile VR headset. It works pretty much like Cardboard: you slot in your Samsung phone (only certain high-end Samsung phones are currently supported) and it just works right away – no cables! The resolution is as good as Oculus ver 2! The user interface is beautifully designed, and I am blown away by some of the 360-degree stereoscopic videos one can get for free from the VR store. There is, however, a huge problem with latency. Whilst the resolution is as good as Oculus ver 2’s, the latency is as bad as ver 1’s! I got very sick after 1-2 minutes of playing an action game in which one runs around inside human cells. It is a real shame. Perhaps with newer phones (like the S7) this problem will go away? We will see. For now, the Gear VR is still good for 360-degree videos and other VR experiences with slow or no motion.
Pruet Putjorn, my PhD student, presented his research video at the School Research Conference 2016 and won 1st prize in the research video competition. His PhD is concerned with the use of low-cost smart sensing devices in primary science education in rural Thailand. The core idea is to train pupils to be more like scientists: carrying out experiments, recording data with smart sensing devices and analysing the data through child-friendly visualisations on a tablet. Well done Pruet!
Happy New Year 2016! Just before the Christmas break, my students submitted their game projects for EL639 Video Game Design. I would like to share some of their work with you. All games were built in Unity3D with C#.
Neon Snake reminded me of the old Nokia Snake game. It is simple, but engaging and even meditative.
I love pixel art! Although the gameplay of Excraft is based on a conventional 2D space shooter, I like how professional everything looks.
The good old point-and-click adventure games are making a return!
Temple of Boom
This is really the Indiana Jones version of Super Monkey Ball.
I will be attending an information day event in Brussels next week on wearable technologies. I am hoping to pitch a research idea on “Internet of Skin.” This is based on a project I have been working on over the past year, developing rehabilitation technology for people with swallowing disorders. The idea is to use epidermal (EMG) sensors to track swallowing, and to use the sensor data to drive a rehabilitation game.
“Internet of Skin” for Personal Informatics
The Internet of Things is set to transform healthcare, providing wearable technologies that help people to monitor and reflect on various aspects of their lives. We aim to develop and investigate a next generation of wearable devices and innovative software applications for health and well-being using ultra-thin, stretchable epidermal electronics – that is, electronics thin enough and flexible enough to be mounted on the skin and to sense physiological signals non-invasively. We call this the “Internet of Skin.”
Existing devices use traditional rigid electronics and, even if well designed, are relatively bulky. Whilst this is acceptable for many applications, others require more subtle sensing devices. For example, dysphagia is a medical term describing a range of swallowing difficulties. Current dysphagia rehabilitation requires patients to wear a large, rigid electronic box on their throat that allows a therapist to collect data on swallowing habits. Given the bulky nature of these devices, they can only be worn for short periods of time in a hospital setting. We have developed a skin-based EMG sensor for tracking swallowing, and a biofeedback game driven by the sensor for dysphagia rehabilitation.
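To give a flavour of the signal processing involved, here is a deliberately simplified sketch (not our actual pipeline): a candidate swallow is flagged when a rectified, smoothed EMG envelope crosses a threshold, and the detected onsets could then trigger events in a biofeedback game. The sampling assumptions, window size, threshold and toy trace below are all invented for illustration.

```python
# Illustrative sketch only: flag candidate swallow events in an EMG
# trace by thresholding a smoothed, rectified signal. The window and
# threshold values are invented, not taken from any real device.

def moving_average(xs, window):
    """Simple trailing moving average (shorter windows at the start)."""
    out = []
    for i in range(len(xs)):
        lo = max(0, i - window + 1)
        out.append(sum(xs[lo:i + 1]) / (i + 1 - lo))
    return out

def detect_swallows(emg, threshold=0.5, window=5):
    """Return sample indices where the smoothed, rectified EMG
    envelope first rises above the threshold (event onsets)."""
    envelope = moving_average([abs(x) for x in emg], window)
    onsets, above = [], False
    for i, v in enumerate(envelope):
        if v > threshold and not above:
            onsets.append(i)
            above = True
        elif v <= threshold:
            above = False
    return onsets

# A toy trace: quiet baseline, one burst of muscle activity, quiet again.
trace = [0.05] * 10 + [0.9, -1.1, 1.0, -0.8, 0.95] * 2 + [0.05] * 10
print(detect_swallows(trace))
```

In a game, each detected onset would simply fire an in-game event (e.g. the player character performs an action), turning repetitive swallowing exercises into play.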
Therefore, we propose novel research and development work in the use of “Internet of Skin” in everyday situations (not controlled lab settings), for health and well-being applications.
This is a collaborative project with the Yeo Research Group, VCU.
I will be visiting Universitat Pompeu Fabra, Barcelona from 2-4 Sep 2015 to give a seminar on “Engineering for humanity” (similar to the one I gave at TU Delft earlier this year), and to examine a PhD thesis on technology use by older people. I have never been to Spain before, and I was told Barcelona is the best place to visit in Spain. Pretty excited!
Our paper entitled “Emotional Correlates of Monorhinal Odor Identification” has been accepted in Laterality: Asymmetries of Body, Brain and Cognition. It is a collaborative project with Psychology at Kent and East Kent Hospitals University NHS Foundation Trust.
The abstract of the paper:
It is self-evident that smell profoundly shapes emotion, but less clear is how these capacities are linked. Here we sought to determine whether the ability to identify odors co-varies with self-reported feelings of empathy and emotional expression recognition, as predicted if the two capacities draw on a common resource. Thirty-eight normal volunteers were administered the Alberta Smell Test along with the Interpersonal Reactivity Index, which provides a measure of emotional and cognitive empathy, and an emotional expression recognition task in which faces had to be categorised as happy, sad, angry, fearful or neutral. Of secondary interest, we also tested whether odor identification correlates with more general aspects of intellectual function, as measured by the verbal and non-verbal scales of the 2nd Kaufman Brief Intelligence Test. Statistical analyses indicated that feelings of emotional empathy positively correlated with odor discrimination in the right nostril, while the recognition of happy facial expressions positively correlated with odor discrimination in the left nostril. Higher identification scores in the left nostril were also associated with higher verbal IQ scores. These results uncover new links between the discriminatory ability of the olfactory system and emotion and verbal intelligence which, given the ipsilateral configuration of the olfactory projections, may reflect intra- rather than inter-hemispheric interactions. Given that reduced empathic concern and a difficulty distinguishing facial emotions can mark the onset of certain neurological diseases, the results give reason to further explore the diagnostic sensitivity of smell tests.
The 2014/2015 academic year has come to an end, and summer is finally upon us (despite the 15°C maximum temperature today). It is time for me to prepare for my lecture/workshop at the “Research Methods in HCI” summer school! The summer school is jointly organised by the Cyprus Interaction Lab and the Institute of Informatics at Tallinn University, and will run from 6-10 July 2015. This event is sponsored by the Association for Computing Machinery (ACM). Please have a look at the summer school website.
Lecturers of the summer school course include:
– Dr. Chee Siang Ang, University of Kent (that would be me!)
– Dr. Duncan Brumby, UCLIC, University College London, UK
– Dr. Jettie Hoonhout, Philips Research, Netherlands
– Dr. Yiannis Laouris, Future Worlds Center, Cyprus
– Dr. Fernando Loizides, Cyprus Interaction Lab, Cyprus
– Rajiv (Raj) Arjan, Google, UK
The course will cover topics like:
1. Data collection approaches: experiments, interviews, questionnaires, observations, structured dialogue design.
2. Data analysis: quantitative data analysis (including statistical techniques), qualitative data analysis (including text, audio, video).
Looking forward to Cyprus! 🙂
Our paper titled “Defining the content of an online sexual health intervention: the MenSS website” has been accepted for publication in JMIR Research Protocols. This is a collaborative project with the eHealth Unit at UCL looking into mobile-based interventions for sexual health behaviour change.
Earlier this year, our protocol paper for the pilot randomised controlled trial was accepted by the British Medical Journal. The trial is now underway, and hopefully we will have some results to share soon!