Portfolio

My portfolio showcases a range of HCI and Human Factors research, alongside my programming expertise.

Joshua Duvnjak

Research

My research focuses on human behaviour, typically with novel computer systems.

Conference Paper Link

Personalisation Design Cards

This study involved generating a novel set of ideation cards and running workshops in which participants used them to co-design their own personalisation systems.

The cards were used in workshops and the qualitative data were analysed using thematic analysis.

Co-design Workshops Thematic Analysis

Journal Paper Link

Stakeholder Questionnaire

This study consisted of an experimental questionnaire that sought to understand how manufacturing stakeholders felt about personalisation.

The questionnaire had multiple conditions in a mixed design to test different examples of robotic personalisation systems.

Questionnaire Statistics Experimental Design

GitHub Repo Link

Augmented Reality Smartphone App

An Augmented Reality smartphone technology probe was selected as the method.

The technology probe itself was designed using user persona and scenario techniques, with an iterative design approach.

It was implemented using Unity with the Vuforia AR package as a foundation.

C# Development Mobile App Development User Evaluation User Personas

Programming

I have experience programming a wide range of systems, from augmented reality applications to user interfaces.

GitHub Repo

Sensor Based Hat

The project is an inactivity monitor and alarm system. It tracks user movement and provides a visual alarm when they have been inactive for too long.

It does this by combining data from different sensors to determine whether the user is walking and whether they have changed room; together, these signals indicate whether the user is outside.

The system was tested and developed in multiple ways by collecting real-world data sets. A person interacts with the system mainly through its sensors.
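The core logic described above can be sketched as a small Python class: a minimal, illustrative inactivity monitor, assuming the walking and room-change signals are already derived from the hardware sensors (the class names, limit value, and injected clock are my own placeholders, not the project's actual code).

```python
import time

INACTIVITY_LIMIT = 5.0  # seconds before the alarm fires (illustrative value)

class InactivityMonitor:
    """Tracks the last moment of activity and raises an alarm flag."""

    def __init__(self, limit=INACTIVITY_LIMIT, clock=time.monotonic):
        self.limit = limit
        self.clock = clock          # injectable for testing
        self.last_active = clock()

    def update(self, is_walking, changed_room):
        # Either sensor-derived signal counts as activity.
        if is_walking or changed_room:
            self.last_active = self.clock()

    def alarm_active(self):
        # True once the user has been inactive longer than the limit;
        # a real build would drive the visual alarm (e.g. an LED) here.
        return self.clock() - self.last_active > self.limit
```

On hardware, `update()` would be called each polling cycle with the classified sensor readings, and `alarm_active()` would switch the visual alarm on or off.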

Python Programming Wearable Development Raspberry Pi Development

GitHub Repo Link

Robotic AI Assistant

A PiCar-X AI platform was used to create an embodied AI assistant. The robot moves around and uses an ultrasonic sensor to detect objects.

It then searches for users using a computer vision library. If a person is found, the robot switches to a voice interface and uses Google's Gemini to answer questions.
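The search-then-converse behaviour can be sketched as a simple two-state loop. This is an illustrative sketch only: `detect_person`, `listen`, `ask_gemini`, and `speak` are hypothetical placeholders standing in for the real computer-vision, speech, and Gemini API calls, which are injected so the control flow can be tested without hardware.

```python
def assistant_loop(detect_person, listen, ask_gemini, speak, max_steps=10):
    """Search for a person, then switch to a voice Q&A interface."""
    state = "searching"
    answered = []
    for _ in range(max_steps):
        if state == "searching":
            # e.g. a face found by the computer vision library
            if detect_person():
                state = "conversing"
        else:
            question = listen()      # e.g. a speech-to-text result
            if question is None:     # silence or the person left
                state = "searching"
            else:
                reply = ask_gemini(question)  # placeholder for the Gemini call
                speak(reply)
                answered.append((question, reply))
    return answered
```

Keeping the external services behind plain callables like this means the state machine can be exercised with stubs, while the real robot wires in the ultrasonic, vision, and voice components.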

AI API Python Programming Voice Interface

Website Developed by Joshua Duvnjak
Using the Bootstrap Library