Welcome to my home page. My name is Bart and I'm an electrical engineer by education who enjoys working on projects related to augmented reality, artificial intelligence, computer vision, and games. From time to time, I'll be adding some of my work here. As you can see, I'm not much of an artist, so if you'd like to give me some pointers on how I can improve this web page, I'm all ears :)

Software


RoBart

RoBart is a robot powered by a multimodal LLM (Claude or GPT-4) that uses an iPhone 12 Pro Max for its sensors, user interface, and compute! ARKit handles SLAM as well as LiDAR-based mapping of the environment. A salvaged hoverboard provides the motors, battery, and base chassis. An Adafruit Feather board communicates with the iPhone over Bluetooth and drives the motors.

The cool thing about RoBart is that there is no task-specific code. It listens for human speech and then the LLM agent figures out how to perform the task using the very limited selection of control commands exposed to it.
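The actual command set lives in RoBart's source; as a flavor of the idea, here is a minimal Python sketch of this kind of agent loop, with invented command names and a stubbed-out LLM:

```python
# Hypothetical sketch of an LLM-driven robot agent loop. The command names
# ("turn", "forward") are invented for illustration; RoBart's real primitives
# are defined in its own source code.

def execute(command: str, robot: dict) -> None:
    """Dispatch a single text command such as 'turn 90' or 'forward 0.5'."""
    op, _, arg = command.partition(" ")
    if op == "turn":
        robot["heading"] = (robot["heading"] + float(arg)) % 360
    elif op == "forward":
        robot["distance"] += float(arg)
    else:
        raise ValueError(f"unknown command: {op}")

def run_agent(plan_fn, task: str, robot: dict) -> None:
    """The LLM (plan_fn) maps a spoken task to a sequence of primitives."""
    for command in plan_fn(task):
        execute(command, robot)

# Stand-in for the real LLM call: a fixed plan, for demonstration only.
def fake_llm(task: str) -> list[str]:
    return ["turn 90", "forward 0.5"]

robot = {"heading": 0.0, "distance": 0.0}
run_agent(fake_llm, "go to the kitchen", robot)
```

The key point is that the robot-side code is only a dumb dispatcher; all task-level reasoning happens inside the model.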

In addition to RoBart, I've also been learning how to use imitation learning (e.g., ACT) to control robot arms. Stay tuned for more. My initial experiment is open source and available here. If you're building in the space and looking for a collaborator, please reach out!

ChatARKit

ChatARKit is an open source experiment I built to test ChatGPT's ability to generate code using custom user-defined APIs. It lets you create, manipulate, and animate 3D objects with natural language commands, as shown in the video below.

ChatARKit works by exposing a simple augmented reality API via a JavaScript virtual machine. ChatGPT is used to generate JavaScript code from user commands. When objects are instantiated, ChatARKit attempts to find a suitable 3D model on Sketchfab. Speech-to-text is performed on-device using Georgi Gerganov's fast C++ implementation of OpenAI's Whisper model.
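The whole loop can be sketched in a few lines of Python. Keep in mind this is a hedged illustration: the real project runs inside a JavaScript VM, and the API names below are invented.

```python
# Sketch of the ChatARKit pattern: describe a tiny scripting API to the model,
# then execute whatever code it returns with only that API visible.
# The real project uses a JavaScript VM; these API names are invented.

API_SPEC = """
createEntity(model) -- place a named 3D model in the scene and return it
scale(entity, s)    -- uniformly scale an entity
"""

def build_prompt(user_command: str) -> str:
    return (f"You control an AR scene with this API:\n{API_SPEC}\n"
            f"Write code for: {user_command}")

def run_generated(code: str, scene: list) -> None:
    def create_entity(model):
        entity = {"model": model, "scale": 1.0}
        scene.append(entity)
        return entity
    api = {"createEntity": create_entity,
           "scale": lambda entity, s: entity.update(scale=s)}
    exec(code, {"__builtins__": {}}, api)  # generated code sees only the API

scene = []
generated = 'e = createEntity("chair")\nscale(e, 2.0)'  # stand-in for model output
run_generated(generated, scene)
```

Restricting what the generated code can see is the important design point: the model only ever manipulates the scene through the API you handed it.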

Rek.tv

Rek was an attempt to build the future of sports. Realizing that co-located mixed reality play with Meta Quest headsets was possible, we thought the time was right to create an augmented reality sport. We wanted to transform headsets from geeky gaming consoles into sporting equipment that one day would even work outdoors, encouraging people to connect in person and have fun in a new and exciting way. We trialed a prototype of the experience, including a live spectator mode, at two venues in Los Angeles: AR House and Two Bit Circus. Despite a very positive reception, we realized it was still too early to scale the idea.

We also prototyped a very compelling mixed reality fitness experience centered around fighting life-sized characters invading your space from another dimension. A more detailed write-up with footage is coming soon. An APK of the prototype for use with Quest 2 and Quest Pro headsets can be downloaded here and side-loaded onto devices.

Mixed Reality Laser Tag

Mixed reality experiences that build a virtual world conforming to the exact geometry of a physical space are mind-blowingly awesome. However, this has traditionally required the use of expensive motion tracking cameras costing thousands of dollars each. The capital costs for a single venue like The Void can be as high as $500K USD or more. Can something comparable be achieved using inexpensive off-the-shelf depth cameras like the $200 OAK-D?

This project proved we can get most of the way there at a fraction of the cost, and it won me the 2021 OpenCV AI Competition!

Snapchat

Snapchat's Lens Studio is an incredibly powerful augmented reality development environment. It's great for rapid prototyping, deployment to the Snap app is fast and seamless, and it has a very robust feature set. After I left Apple in 2021, I decided to kick the tires on Lens Studio by building a complete game called STRIKE FORCE! Check it out by scanning the Snap code.

HoloLens

I've spent a lot of time playing with Microsoft's amazing HoloLens. I've had mine since June 28, 2016 and you can get an idea of what I've done with it by checking out my GDC talk on the subject.

HoloLens games I've released. H.E.A.T. (left) and Doko Desu Ka (right).

Games

Presentations

Supermodel

Supermodel emulates Sega's Model 3 arcade platform, allowing a number of groundbreaking 3D arcade classics to be played on Windows, Linux, and Mac OS. Model 3 made its debut in 1996 with Virtua Fighter 3 and Scud Race and boasted the most powerful 3D gaming hardware of the era. Developed by Real3D, with a heritage reaching back to some of the earliest 3D flight simulators, Model 3 featured capabilities such as hardware transform and lighting that would not appear in consumer GPUs for several years.

Popular games running in Supermodel. From left to right: Virtua Fighter 3, Scud Race, and Star Wars Trilogy.

This project began its life way back in 2003 as a collaboration between Ville Linde, Stefano Teso, and myself. We reverse engineered the system from scratch using little more than ROM images and produced a functional but incomplete and largely unplayable emulator. The project eventually lost momentum, I went off to grad school, and Ville ported it to MAME. In late 2010, I decided to rewrite it from scratch and get it to a playable state. The first version of Supermodel was released on April 1, 2011.

Supermodel is written in C++ and uses OpenGL and SDL. On Windows, it supports force feedback and Xbox 360 controllers.

libmodel3

Recently, I successfully booted up what is probably the first new code written for Model 3 in 17 years -- and certainly the first ever written by someone outside of Sega and Real3D -- on an actual board. Originally purchased to help reverse engineer Model 3, my Virtua Fighter 3 board ended up gathering dust for years until December 2015, when I found myself home for the holidays and resolved to finally put it to use. You can read a more complete discussion here.

libmodel3 is written in C and PowerPC assembly language. Running code on an undocumented bare-metal board without the aid of any official firmware turned out to be trickier than expected, and I only got as far as displaying text using the 2D tile generator chip. To compile the code, you will need a powerpc-603e-eabi gcc cross-compiler and newlib. I built my own toolchain on Windows and would be happy to upload it. To program the ancient 27C4002 EPROMs, I used the MiniPro TL866 device programmer and a surplus UV eraser I found on eBay.

My Model 3 workspace and Virtua Fighter 3 board running the libmodel3 test program.

Really Old Stuff

I've been programming since elementary school. In high school, I learned assembly language and C, and explored a broad spectrum of topics ranging from firmware-level code to 3D graphics. Although I was fairly prolific, I produced few complete programs worth releasing (apart from a Sega Genesis emulator for DOS) because I always felt there was something new to learn about. Below are just a few of the more esoteric programs from that era lurking in my backup storage. Good luck trying to compile any of these!

  • 2D BSP trees (C, x86 assembly): a Win32 application to draw 2D maps and compile them into BSP trees, and a DOS-based 3D renderer to explore them.
  • 3D BSP trees (C, x86 assembly): a 3D BSP tree compiler and software 3D renderer.
  • Motorola 68K debugger for Sega Saturn (C, SH-2 and 68K assembly): an interactive debugger with support for single-stepping, breakpoints, setting registers, etc. The interface ran on DOS. Code was uploaded to the Saturn's audio co-processor using a Pro Action Replay cartridge connected by cable to a special ISA card in the PC. I'm impressed with my younger self. If you're wondering whether I went to prom, the answer is 'no.'
  • Bitmapped graphics demo for Sega Saturn (C, SH-2 assembly): displays an 8bpp image on the Saturn using its video processor. Requires the same hardware setup as my 68K debugger. Saturn had impressively powerful 2D capabilities and I intended to write a game before moving on to other projects.
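The BSP idea behind the first two entries is simple to sketch: pick a wall as the splitting plane, sort the remaining walls in front of or behind it, and recurse; at render time, walking the tree relative to the eye point yields walls in depth order. A toy Python version (splitting of straddling segments omitted for brevity):

```python
# Toy 2D BSP build and traversal, in the spirit of those old DOS tools.
# Each wall is a segment ((x1, y1), (x2, y2)); the first segment in a list
# becomes the splitting plane and the rest are sorted to its front or back.

def side(seg, pt):
    """>0 if pt is on the left of seg's direction, <0 right, 0 on the line."""
    (x1, y1), (x2, y2) = seg
    return (x2 - x1) * (pt[1] - y1) - (y2 - y1) * (pt[0] - x1)

def build(segments):
    if not segments:
        return None
    splitter, rest = segments[0], segments[1:]
    front = [s for s in rest
             if side(splitter, s[0]) >= 0 and side(splitter, s[1]) >= 0]
    back = [s for s in rest if s not in front]
    return {"seg": splitter, "front": build(front), "back": build(back)}

def walls_front_to_back(node, eye):
    """Visit walls nearest-first from the eye point (classic BSP traversal)."""
    if node is None:
        return []
    near, far = (("front", "back") if side(node["seg"], eye) >= 0
                 else ("back", "front"))
    return (walls_front_to_back(node[near], eye) + [node["seg"]] +
            walls_front_to_back(node[far], eye))

walls = [((0, 0), (0, 10)), ((-5, 5), (-5, 6)), ((5, 5), (5, 6))]
tree = build(walls)
order = walls_front_to_back(tree, (2, 2))  # eye on the right of the first wall
```

The payoff is that depth sorting becomes a tree walk decided entirely by which side of each splitter the eye sits on, with no per-frame sorting.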

Hardware


PixArt PAJ7025R2 6dof Tracking Demo

The PAJ7025R2 is an infrared camera with an integrated DSP capable of tracking multiple IR light sources at high frame rates. Its predecessor was used in Nintendo's Wii Remote controllers. I originally wanted to use this sensor in conjunction with a HoloLens to track objects. I ordered some sample sensors and fabricated a simple PCB, then got busy and forgot about the project for a couple of years. I ended up hooking it up to an Arduino board and interfacing it with a Windows program to perform very simple pose detection. The project source code and schematics are on GitHub.
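The actual pose code is in the GitHub repo; as a flavor of the idea, here is a simplified Python sketch recovering 2D position, in-plane rotation, and approximate depth from just two tracked LED blobs, assuming a pinhole camera and a known LED separation (all numbers illustrative):

```python
# Minimal 2D pose from two tracked IR blob centers (a sketch, not the real
# project's method). With two LEDs a known distance apart, the blob pair
# gives position, in-plane rotation, and approximate depth.
import math

def pose_from_blobs(p1, p2, led_separation_mm, focal_px):
    cx = (p1[0] + p2[0]) / 2
    cy = (p1[1] + p2[1]) / 2
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    pixel_dist = math.hypot(dx, dy)
    angle_deg = math.degrees(math.atan2(dy, dx))
    # Pinhole model: apparent size shrinks linearly with distance.
    depth_mm = focal_px * led_separation_mm / pixel_dist
    return (cx, cy), angle_deg, depth_mm

center, angle, depth = pose_from_blobs((100, 200), (140, 200),
                                       led_separation_mm=80, focal_px=500)
```

With more than two markers in a known 3D arrangement you can recover full 6dof pose (e.g., via PnP), which is what makes a sensor like this interesting for object tracking.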

A table tennis paddle with infrared LED targets attached being tracked by a PixArt sensor connected via Arduino and USB to a Windows PC.

BartStation: A Homebrew 8-bit Video Game System

This was one of my senior projects in college. It uses an 8-bit Zilog Z80 CPU and an FPGA for VGA video output. Game code is stored on a flash ROM chip and a Sega Genesis control pad is used for input. I entered it into a regional IEEE competition in 2006 and won!

The BartStation and me at an IEEE Region 6 event in 2006.

Semiconductor Research


My doctoral work involved modeling processes that occur during semiconductor device fabrication. Fabricating integrated circuits or solar cells is like baking a very complicated inedible cake. Materials are patterned in alternating layers atop a substrate (usually silicon). Heat treatments at temperatures as high as 1400 °C are applied to drive materials into the substrate, grow films, repair damage, or control impurities and crystalline defects.

Conducting experiments and then measuring the results is costly and time-consuming. Wouldn't it be nice if physical models could be constructed and simulated on computers? That's the idea behind TCAD. Working with Prof. Scott Dunham at the University of Washington, I developed TCAD models for oxide precipitates and dislocation loops in silicon, two types of defects that can affect both wafer strength and the electrical performance of devices. I am no longer active in this field but you can read my dissertation and a paper on my oxide precipitate model below.
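To give a flavor of what these simulators solve, here is a toy 1D dopant diffusion model (Fick's second law via explicit finite differences). It is a drastic simplification of real TCAD, and the numbers are illustrative, not calibrated to any actual process.

```python
# Toy 1D dopant diffusion: dC/dt = D * d2C/dx2, solved with an explicit
# finite-difference scheme. Endpoints are held fixed, so the left boundary
# acts as a constant dopant source. Parameters are purely illustrative.

def diffuse(profile, D, dx, dt, steps):
    """Advance a concentration profile through `steps` time steps."""
    c = list(profile)
    r = D * dt / dx**2          # must be <= 0.5 for numerical stability
    for _ in range(steps):
        new = c[:]
        for i in range(1, len(c) - 1):
            new[i] = c[i] + r * (c[i - 1] - 2 * c[i] + c[i + 1])
        c = new
    return c

# A surface spike of dopant spreading into the bulk during an anneal.
c0 = [1.0] + [0.0] * 9
c = diffuse(c0, D=1.0, dx=1.0, dt=0.25, steps=50)
```

Real process simulators couple many such equations (diffusion, reaction, segregation, stress) over 2D/3D meshes, but the core numerical idea is the same.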