Chrome Dino Game with Palm Detection#
Introduction#
In this tutorial, we will demonstrate how to use the AsyncAccl Python API to control the Chrome Dino game in real time on the MX3 chip, using MediaPipe's palm detection model. Whenever a palm is detected in the camera feed, the dinosaur jumps.
Note
This tutorial assumes a four-chip solution is correctly connected.
To ensure the best results, this demo is designed to work with only one person in the frame. Make sure only one hand is visible on the screen to trigger the dinosaur’s jump.
Download the Model#
The MediaPipe Palm detection model can be downloaded from the official GitHub page. In this tutorial, we will use the TFLite model (lite) version of the Palm detection model.
For convenience, we have provided an exported version of the model, which is available in the compressed folder: Dino_Game_Python_MX3.zip.
Demo#
Compile the Model#
The MediaPipe Palm detection model includes a post-processing section in its graph. Therefore, it needs to be compiled using the neural compiler's autocrop option. After compilation, the compiler generates a DFP file for the main model (palm_detection_lite.dfp).
Since the post-processing section only reshapes and concatenates the output, it is handled in our post-processing function for simplicity.
The compilation step is generally required only once and can be performed using either the Neural Compiler Python API or the mx_nc command-line tool.
Hint
To skip the compilation step, you can use the pre-compiled DFP file included in Dino_Game_Python_MX3.zip.
Compile the model using the API:
from memryx import NeuralCompiler
nc = NeuralCompiler(num_chips=4, models="palm_detection_lite.tflite", verbose=1, dfp_fname="palm_detection_lite", autocrop=True)
dfp = nc.run()
Compile the model from the command line:
mx_nc -v -m palm_detection_lite.tflite --autocrop -c 4
After compilation, point the dfp variable in your Python code to the generated file:
dfp = "palm_detection_lite.dfp"
Alternatively, to use the pre-compiled DFP, point the dfp variable in your Python code to the provided file path:
dfp = "palm_detection_lite.dfp"
Installation#
Follow these steps to set up the environment for the Dino game demo.
Install the PyAutoGUI package to automate key presses:
pip install pyautogui
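For a quick sanity check after installing, the snippet below simply presses the spacebar once, which is the same call the demo uses later to make the dinosaur jump. Focus a text editor (or the Dino game) before running it:
import pyautogui

# Press the spacebar once -- the same key press the demo sends on a palm detection
pyautogui.press('space')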
If you’re on Linux, install wmctrl to keep the video feed window on top:
sudo apt-get install wmctrl
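For reference, wmctrl raises a window above all others by its title; the title below is only an example and should match whatever title your video feed window uses:
wmctrl -r "Palm Detection" -b add,above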
CV Initializations#
Begin by importing the necessary libraries, initializing the CV pipeline, and defining common variables:
import subprocess
import os
import time
import cv2 as cv
import numpy as np
import pyautogui
import sys
import signal
import threading
from queue import Queue
# Disable the PyAutoGUI fail-safe feature
pyautogui.FAILSAFE = False
# Global variables
jump_cooldown = 5 # Frames to wait between consecutive jumps
cooldown_counter = 0 # Frame counter for jump cooldown
hand_detected = False # To track the state of hand detection
chrome_process = None # To track the Chrome process
display_frame = None # Store frame for display in a separate thread
stop_display = False # To signal when to stop the display thread
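The input function defined below reads frames from a cv.VideoCapture object named cam, which is not created in the snippet above. A minimal initialization, assuming the default camera at index 0, looks like this:
# Open the default camera (index 0); change the index if you use a different device
cam = cv.VideoCapture(0)
if not cam.isOpened():
    sys.exit("Error: Could not open the camera.")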
Model Pre-/Post-Processing#
Pre- and post-processing are implemented in the helper class MPPalmDet, defined in mp_palmdet.py and included in the tutorial's compressed folder. You can easily integrate it into your project:
from mp_palmdet import MPPalmDet
Define an Input Function#
We will define an input function that captures frames from the camera and pre-processes them for the accelerator:
# Function to capture and preprocess the frame
def get_frame_and_preprocess():
    """
    Captures a frame from the camera, preprocesses it for the model, and returns it.
    """
    global display_frame

    got_frame, frame = cam.read()
    if not got_frame:
        print("Error: Could not capture frame from camera.")
        return None

    # Store the frame for the display thread
    display_frame = frame.copy()

    # Put the frame in the queue to be used later
    cap_queue.put(frame)

    # Preprocess the frame for the model
    return model._preprocess(frame)
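The input function stores a copy of each frame in display_frame, and the stop_display global together with the wmctrl step suggests that the full application shows the video feed from a separate thread. That thread is not reproduced in this tutorial; a minimal sketch of what it might look like (the window title and loop details are assumptions) is:
# Hypothetical display loop, run in its own thread, e.g. threading.Thread(target=display_loop).
# It shows the most recently captured frame until stop_display is set.
def display_loop():
    global stop_display
    while not stop_display:
        if display_frame is not None:
            cv.imshow("Palm Detection", display_frame)
        if cv.waitKey(1) & 0xFF == ord('q'):  # press 'q' to close the feed
            stop_display = True
    cv.destroyAllWindows()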
Define Output Functions#
Next, define an output function to process the accelerator’s output. This function checks for palm detection and simulates a jump in the Chrome Dino game by pressing the spacebar using PyAutoGUI. A cooldown mechanism prevents multiple jumps from being triggered:
# Function to handle the post-processing and game control logic
def postprocess(*accl_output):
    global cooldown_counter, hand_detected

    # Pop the matching frame to keep the queue in sync with the accelerator outputs
    frame = cap_queue.get()

    # Reshape and concatenate outputs from the accelerator
    accl_output_1, accl_output_2 = reshape_and_concatenate(accl_output)

    # Check if a palm is detected
    is_hand_detected = model._postprocess(accl_output_1, accl_output_2)

    # Handle jump logic with cooldown
    if is_hand_detected and not hand_detected and cooldown_counter == 0:
        pyautogui.press('space')          # Simulate a jump in the game
        cooldown_counter = jump_cooldown  # Start cooldown
        hand_detected = True              # Update hand detection state
    elif not is_hand_detected:
        hand_detected = False             # Reset hand detection if no hand is detected

    # Decrease the cooldown counter if it's active
    if cooldown_counter > 0:
        cooldown_counter -= 1
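The helper reshape_and_concatenate is part of the tutorial's full source (included in the download) and reproduces the reshape/concatenate operations that autocrop removed from the graph. Purely as an illustration of the idea (the actual output order, tensor shapes, and the argument order expected by MPPalmDet._postprocess depend on your compiled DFP and the helper class), a sketch could look like this:
# Hypothetical sketch only -- verify shapes and ordering against your compiled DFP.
# Each accelerator output is flattened into per-anchor rows, and the rows from the
# detection branches are concatenated, mirroring the cropped graph section.
def reshape_and_concatenate(accl_output):
    scores, boxes = [], []
    for out in accl_output:
        out = np.squeeze(out)                # drop the batch dimension
        if out.shape[-1] % 18 == 0:          # regression branch: 18 values per anchor
            boxes.append(out.reshape(-1, 18))
        else:                                # classification branch: 1 score per anchor
            scores.append(out.reshape(-1, 1))
    return np.concatenate(scores, axis=0), np.concatenate(boxes, axis=0)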
Connect the Accelerator#
Connect the input and output functions to the AsyncAccl API. The API will manage the pipeline automatically:
from memryx import AsyncAccl
dfp = "palm_detection_lite.dfp"
# Initialize model and queue
model = MPPalmDet()
cap_queue = Queue()  # shared between the input and output functions
# Initialize the accelerator with the model
accl = AsyncAccl(dfp=dfp)
# Connect input and output functions to the accelerator
accl.connect_input(get_frame_and_preprocess)
accl.connect_output(postprocess)
# Wait for the accelerator to process
accl.wait()
The accelerator will automatically invoke the input and output functions in a fully pipelined manner.
Usage#
Run the application by executing the following command. It will automatically open the Chrome Dino game. Click on the game screen and press the spacebar to start the game:
python app.py
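Opening the Chrome Dino game is handled inside app.py (note the chrome_process global and the subprocess import); that part of the script is not reproduced in this tutorial. Purely as a hypothetical illustration (the browser executable and game URL are assumptions to adjust for your system), a launcher might look like:
# Hypothetical launcher -- the executable name and URL are assumptions, not the tutorial's code.
def launch_chrome_dino():
    global chrome_process
    game_url = "chrome://dino"  # replace with your preferred Dino game URL if this is blocked
    chrome_process = subprocess.Popen(["google-chrome", game_url])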
Third-Party License#
This tutorial uses third-party software and libraries. Below are the details of the licenses for these dependencies:
Model: from the MediaPipe GitHub repository
License: Apache License 2.0
Summary#
This tutorial demonstrated how to use the AsyncAccl Python API to control a game in real time with a palm detection model running on the MX3 chip. All the code and resources used in this tutorial are provided in the Dino_Game_Python_MX3.zip download.