NEWS

I'm currently looking for a tenure-track assistant professor position.

Contact info: shijiapan@cmu.edu

[Sept, 2018] Our workshop Data: Acquisition To Analysis (DATA), co-located with SenSys 2018, will be held in November in Shenzhen, China.
[Sept 6th, 2018] Our paper "Area Occupancy Counting through Sparse Ambient Structural Vibration Sensing" is accepted by IEEE Pervasive Computing, Special Issue - IoT Communication (Jan/Mar 2019).
[Aug, 2018] Our workshop CPD'18: Combining Physical and Data-Driven Knowledge for Ubiquitous Computing, co-located with Ubicomp 2018, will be held on Oct 12th in Singapore.
[July 1st, 2018] I started my postdoc position with ECE and CEE at CMU.
[May 25th, 2018] Our paper "Smart Home Occupant Identification via Sensor Fusion Across On-Object Devices" is accepted by TOSN 2018.
[May 15th, 2018] Our paper "IDrone: Robust Drone Identification through Motion Actuation Feedback" is accepted by Ubicomp 2018.
[May 4th, 2018] I defended my PhD thesis "Indoor Human Information Acquisition from Physical Vibration" (slides).
[April 15th, 2018] Our paper "Occupant Localization using Footstep-Induced Structural Vibration" is accepted by MSSP 2018.
[Dec 15th, 2017] Our paper "UniverSense: IoT Device Pairing Using Heterogeneous Sensing Signals" is accepted by HotMobile 2018.
[Dec 15th, 2017] Our paper "VVRRM: Vehicular Vibration-based Heart RR-Interval Monitoring System" is accepted by HotMobile 2018.
[Nov 9th, 2017] Our paper "SenseTribute: Smart Home Occupant Identification via Fusion Across On-Object Sensing Devices" got the Audience Choice Award at BuildSys 2017.
[Nov 8th, 2017] My presentation "Structure as Sensors: Learning Indoor Human Information from Physical Vibrations" got the Best Presentation Award at the Doctoral Colloquium of SenSys 2017.

Research Areas

Objects as Sensors

Humans interact with their physical surroundings all the time. By monitoring the vibration response of physical objects, we can infer information about these interactions.

Vision-based Mobile Sensing

Most mobile phones are equipped with cameras. Vision-based sensing can extend the sensing capabilities of these phones.

Indoor Localization

Indoor localization provides the location of people and devices. This information enables various smart building applications, e.g., navigation and personalization.

Objects as Sensors

Infer human-object interaction through structural vibration.


Occupant Identification and Tracking using Footstep-Induced Floor Vibration

Many smart building applications require non-intrusive monitoring of occupants. We present a structural-vibration-based sensing system for this purpose, focusing on obtaining pedestrians' spatio-temporal information, including identity (paper), location (paper), traffic (paper), and health condition (paper).


Surface Vibration Sensing: Turn Everyday Objects into Touchscreens

People touch objects in their physical surroundings all the time. Touch interactions induce vibrations on an object's surface. By detecting these vibrations at multiple locations, the system can localize and track interaction points, including taps and swipes, hence turning the monitored surface into a touchscreen (paper).
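As a rough illustration of the multi-location sensing idea, here is a minimal 1-D time-difference-of-arrival (TDoA) sketch. The sensor layout, wave speed, and the `localize_tap` helper are hypothetical assumptions for illustration, not the system's actual implementation:

```python
# Hypothetical 1-D TDoA sketch: estimate a tap position on a surface
# from the arrival-time difference at two vibration sensors.
# All parameter values below are illustrative assumptions.

def localize_tap(x1, x2, wave_speed, dt):
    """Estimate a tap position (m) between two vibration sensors.

    x1, x2     -- sensor positions along the surface (m)
    wave_speed -- assumed surface-wave propagation speed (m/s)
    dt         -- arrival-time difference t1 - t2 (s)
    """
    # For a tap between the sensors: d1 - d2 = wave_speed * dt,
    # with d1 = x - x1 and d2 = x2 - x, so x = (v*dt + x1 + x2) / 2.
    return (wave_speed * dt + x1 + x2) / 2.0

# Example: sensors at 0 m and 1 m, assumed wave speed 500 m/s; a tap
# at 0.3 m arrives 0.8 ms earlier at the nearer sensor (dt = -0.0008 s).
print(localize_tap(0.0, 1.0, 500.0, -0.0008))  # → 0.3
```

Real deployments would need more sensors for 2-D localization and must handle dispersion (frequency-dependent wave speed) in the surface material.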


Vision-based Mobile Sensing

Infer user/device information using cameras on mobile phones.


Polaris/Headio

Obtaining mobile device orientation using ceiling patterns captured by the front-facing camera.


Securitas

Ubiquitous mobile identification through hand geometry using a NIR-RGB camera.


iCEnergy

A vision-based mobile information interface that provides power monitoring using augmented reality.

Indoor Localization

Obtain the location of people and devices in indoor environments.


SensorFly

RF-based mapping and navigation for small-scale quadcopter swarms.


PANDAA

Acoustic-based simultaneous localization of devices and sound sources.


vLoc

Indoor tracking using Wi-Fi and inertial sensors on mobile phones.


Nataero

Large-scale, long-term indoor asset tracking using BLE.