What is this demo?

BirdNET Live is an experimental Progressive Web App (PWA) that demonstrates that modern web browsers can run sophisticated machine learning models. Unlike the official BirdNET app, which runs natively on Android and iOS, this version runs entirely inside Chrome, Safari, or Firefox using TensorFlow.js.

This means you can identify birds on your laptop, tablet, or phone without installing an app store application, and—crucially—without sending any audio data to a server.


What is BirdNET?

BirdNET is a research platform developed jointly by the K. Lisa Yang Center for Conservation Bioacoustics at the Cornell Lab of Ornithology and the Chemnitz University of Technology.

The project trains artificial neural networks to identify more than 6,000 of the most common bird species worldwide, making it one of the most advanced tools for acoustic monitoring of birds.

Visit Official BirdNET Website

How the Detection Works

1. Audio Capture

Your browser's Web Audio API captures sound from your microphone and resamples it to 48,000 Hz.
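In practice the microphone's native sample rate may differ from 48,000 Hz, so the captured audio has to be resampled. A minimal sketch of linear-interpolation resampling follows; the function name and the plain `Float32Array` buffers are illustrative assumptions (a real app would more likely let an `AudioContext` or `OfflineAudioContext` handle this):

```typescript
// Resample a mono buffer from srcRate to dstRate (e.g. 44,100 -> 48,000 Hz)
// using linear interpolation. Illustrative helper, not the app's actual code.
function resample(input: Float32Array, srcRate: number, dstRate: number): Float32Array {
  const outLength = Math.round((input.length * dstRate) / srcRate);
  const output = new Float32Array(outLength);
  for (let i = 0; i < outLength; i++) {
    const pos = (i * srcRate) / dstRate; // fractional position in the source
    const i0 = Math.floor(pos);
    const i1 = Math.min(i0 + 1, input.length - 1);
    const frac = pos - i0;
    // Blend the two nearest source samples.
    output[i] = input[i0] * (1 - frac) + input[i1] * frac;
  }
  return output;
}
```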

2. Spectrogram Generation

The audio is sliced into 3-second chunks. These chunks are converted into Mel Spectrograms—visual representations of sound that show frequency intensity over time.
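The slicing step can be sketched as a small pure function. At 48,000 Hz, a 3-second chunk is 144,000 samples; the non-overlapping windows and the function name here are illustrative assumptions (the app may well use overlapping hops):

```typescript
const SAMPLE_RATE = 48_000;                          // Hz, after resampling
const CHUNK_SECONDS = 3;
const CHUNK_SAMPLES = SAMPLE_RATE * CHUNK_SECONDS;   // 144,000 samples

// Split a mono recording into consecutive 3-second chunks,
// dropping any trailing remainder shorter than one chunk.
function toChunks(samples: Float32Array): Float32Array[] {
  const chunks: Float32Array[] = [];
  for (let start = 0; start + CHUNK_SAMPLES <= samples.length; start += CHUNK_SAMPLES) {
    chunks.push(samples.subarray(start, start + CHUNK_SAMPLES));
  }
  return chunks;
}
```

Each chunk is then converted to a mel spectrogram before being handed to the model.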

3. Neural Network Inference

The spectrogram image is fed into a Convolutional Neural Network (CNN). This model has learned to "see" bird calls in these images, distinguishing them from noise and other animals.

4. Post-Processing

The model outputs probability scores for thousands of species. We apply a confidence threshold (e.g., 25%) and optionally a location filter to hide birds that are unlikely to be found in your area.
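This final filtering step can be sketched as plain array operations. The `Prediction` shape, the default threshold, and the allow-list of local species are illustrative assumptions, not the app's actual data model:

```typescript
interface Prediction {
  species: string;
  confidence: number; // model probability in [0, 1]
}

// Keep predictions above the confidence threshold and, if a local
// species list is supplied, only species expected in the user's area.
// Results are sorted with the most confident detection first.
function filterPredictions(
  predictions: Prediction[],
  threshold = 0.25,
  localSpecies?: Set<string>,
): Prediction[] {
  return predictions
    .filter((p) => p.confidence >= threshold)
    .filter((p) => !localSpecies || localSpecies.has(p.species))
    .sort((a, b) => b.confidence - a.confidence);
}
```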


Contact

Questions about BirdNET research, tools, or collaborations:

Email: ccb-birdnet@cornell.edu


Privacy Policy & Legal Notice

BirdNET Live on GitHub

The source code for BirdNET Live is available on GitHub. Feel free to inspect the code, report issues, or contribute improvements!

View Repository