Saturday, 20 December 2014

Introducing BioAcoustica

In a recent blog post I explained some of the work that had been done on the Wildlife Sound Database. That project has now been developed further to include annotations of audio samples and integrated analysis functions. Given the expansion of the project it has also been renamed BioAcoustica.

Recordings of wildlife sound often include sections of spoken introductory metadata detailing where and when the recording was made, the identification of the species being recorded, etc. They often also contain sections of extraneous noise, such as passing cars or mechanical noise caused by the recording equipment. These sections of the recording should be ignored by any analyses performed on the recording.

The aim of the annotation function of BioAcoustica is to allow the usable sections of audio to be clearly marked, as in the waveform below.

Call of the cicada Platypleura haglundi annotated to show regions of voice introduction (blue), extraneous noise (red) and the call of the cicada (green, two overlapping regions). (Source: http://bio.acousti.ca/node/11778)

By default BioAcoustica will perform a set of standard analyses on regions annotated as being calls of the organism.

Spectrogram from the short section of the call of Platypleura haglundi from the above annotation example.
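
As an aside, a spectrogram like the one above can also be produced locally with the seewave R package (installation instructions for seewave are in the posts below). The following is only a minimal sketch run from the Terminal - the WAV filename is a placeholder for a recording you have downloaded from BioAcoustica:

Rscript -e '
library(tuneR)                              # readWave()
library(seewave)                            # spectro()
wave <- readWave("platypleura_call.wav")    # placeholder filename for a downloaded recording
png("platypleura_spectrogram.png")          # write the plot to a file
spectro(wave, f=wave@samp.rate)             # draw the spectrogram
dev.off()
'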


Analysis makes use of infrastructure provided by the EU funded BioVeL (Biodiversity Virtual e-Laboratory) project. Requests for analysis are sent to the BioVeL portal which manages the queuing and execution of analysis tasks.

One of the aims of BioAcoustica is to provide an annotated dataset that allows usable sections of calls to be instantly available to researchers working on bioacoustics, automated identification, or any other project. For this reason we are currently working on an R package that will allow for the querying of annotations, download of the files, and extraction of relevant regions for use in your own analyses. We will also document the API we are using so it can be expanded to other analysis and development environments.

Tuesday, 2 December 2014

Getting the seewave R package installed on Mac OS X Yosemite

The seewave package for R first requires you to install the fftw library (on your system, not in R), instructions for this can be found here: Installing fftw on Mac OS X Yosemite.

At present it is not possible to use install.packages() to download seewave on the 3.1 branch of R. To see what version(s) of R you have on your system run the following command from the Terminal:

ls -l /Library/Frameworks/R.framework/Versions/

This will give an output similar to:

drwxrwxr-x  6 root  admin  204  2 Dec 11:31 3.0
drwxrwxr-x  6 root  admin  204  3 Nov 16:50 3.1
lrwxr-xr-x  1 root  admin    3  2 Dec 11:31 Current -> 3.0


This shows that my system has two versions of R, 3.0 and 3.1, installed and that the current default version is 3.0. If you do not have 3.0 installed you can download the R 3.0 branch from here.

To set the current version of R to 3.0 if you have it already installed you just need to change the symbolic link 'Current' to the correct version:

cd /Library/Frameworks/R.framework/Versions 
ln -sfn 3.0 Current
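
You can check that the switch has worked before launching R (the exact listing will vary):

ls -l /Library/Frameworks/R.framework/Versions/Current   # should now point to 3.0
R --version                                              # should report the 3.0 branch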

You can then use install.packages() in R to install the seewave package:

install.packages(c("fftw","tuneR","rgl","rpanel"), repos="http://cran.at.r-project.org/")

install.packages("seewave", repos="http://cran.at.r-project.org/")

Installing fftw on OS X Yosemite

FFTW is a C library for computing Discrete Fourier Transforms. The following instructions allow you to install the library on Mac OS X Yosemite. All commands are run from Terminal.

First of all ensure the Xcode Command Line Tools are installed:

xcode-select --install

Download the fftw source and change to its directory.
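
For example (the version number below is simply the release that was current when I did this - check fftw.org for the latest):

curl -O http://www.fftw.org/fftw-3.3.4.tar.gz   # download the source tarball
tar -xzf fftw-3.3.4.tar.gz                      # unpack it
cd fftw-3.3.4                                   # change to the source directory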

Configure, make and install the library (no need to use sudo). In this example we enable both single-precision (float) support and threads.

./configure --enable-float --enable-threads

(If this gives an error that the 'compiler cannot create executables' you will likely need to accept Apple's terms and conditions for Xcode - see the note below.)

make

make install
 

Mac OS X: compiler cannot create executables

This issue is often due to not having accepted Apple's latest terms and conditions. Running the following from the command line will guide you through the process:

sudo gcc -v

Sunday, 16 November 2014

Wildlife Sound Database

Over the past few months I have been working with some volunteers at the Natural History Museum to make the museum's collection of recorded insect sounds available online. This work-in-progress can be viewed online at the Wildlife Sounds Database. The collection reflects the research interests of the BMNH Acoustic Laboratory during the 1970s-1990s: the bulk of the collection relates to European Orthoptera (grasshoppers, crickets and their allies).

The Wildlife Sounds Database website


Platform
The Wildlife Sound Database is a modified instance of the Scratchpads virtual research environment (Smith et al, 2011).

Use of collection
This collection contains the raw data underpinning a number of scientific publications (e.g. Ragge, 1987; Ragge & Reynolds, 1988). The recordings may also be used for future taxonomic work (in some instances voucher specimens exist in the NHM Entomology collection) and as a training set for machine learning algorithms.

Exposing datasets
Recordings with compatible licences are shared automatically with the Encyclopedia of Life through the Wildlife Sound Database Collection. The dataset is also available as a DarwinCore Archive at http://sounds.myspecies.info/dwca.zip (Baker, Rycroft & Smith, 2014).

WildSounDB audio files flowing to Encyclopedia of Life

Current development
Current development of the Wildlife Sound Database is through the NHM funded project Developing the NHM Wildlife Sound Archive. This project will develop annotation tools to annotate sections of recordings, allowing differentiation between voice introductions and different types of calls, as well as between calls of different species on the same recording. In addition we will integrate analysis tools using the seewave package for R.

As well as technical developments this project will increase the number of recordings available through the addition of new projects to the Wildlife Sound Database.

The Godfrey Hewitt collection (University of East Anglia) will be digitised and made available (this is likely to contain recordings underpinning important works on hybrid zones and post-glacial recolonisation of Orthoptera).

The Godfrey Hewitt Collection: reel-to-reel tapes to be digitised
The Global Cicada Sounds Collection will include cicada recordings from around the world.

Saturday, 6 September 2014

Arduino the Documentary

An interesting documentary (2010) on the Arduino prototyping platform.

Monday, 1 September 2014

Arduino enthusiasts: ATMEL mini video series on the history of the AVR

ATMEL has released a few YouTube videos on the history of the AVR RISC (reduced instruction set computer) microcontrollers - the chips that these days power a huge number of devices, including the increasingly popular Arduino.



Monday, 4 August 2014

Triggering GoPro using a webcam

This project makes use of a cheap webcam and a Raspberry Pi to provide a motion detection unit capable of triggering a GoPro Hero 3+ Black Edition camera, connected to the Raspberry Pi over WiFi using a small WiFi adaptor.

Using this setup it is possible to achieve a number of things that cannot be achieved by using a motion detection system based solely on the webcam.
  1. The GoPro camera has much better resolution and clarity than the cheap webcam used here.
  2. Burst mode on the GoPro has rates of up to 30 frames per second, great for photographing wildlife.
  3. The motion detection area can be different to the area where the photograph is taken. The arrangement below used during testing shows a system that is triggered by the cat (Phoenix) coming up the stairs, but the GoPro is situated in a slightly different location. Imagine the webcam being used to detect motion in the area around a bird feeder, with the GoPro focussed entirely on the bird feeder.
Test setup, the Raspberry Pi and cheap webcam (left) detect the cat as it reaches the top of the stairs. The Raspberry Pi then triggers the GoPro (set in burst mode) to take a quick succession of photographs as the cat walks past.

For this project I used the Raspbian Linux distribution on the Raspberry Pi but similar instructions will apply on other distributions. For setting up the project connect the Pi to a wired network as both USB ports are used (one for webcam, the other for the WiFi adaptor).

Instructions
These assume you are running a *nix system.
  1. Turn on the GoPro and enable its WiFi functionality (GOPRO APP).
  2. Connect the WiFi adaptor to the Raspberry Pi. Connect to the Raspberry Pi from your computer (ssh -X pi@xx.xx.xx.xx) and configure the Pi to connect to the GoPro's WiFi network (easiest way is to run wpa_gui).
  3. Install the motion application and php: sudo apt-get install motion php5-cli
  4. Make it so motion can run without errors: chmod +r /etc/motion/motion.conf
  5. In the /etc/motion/motion.conf file change the following lines:
    target_dir             /home/pi/motion
    on_event_start         php /home/pi/GoProPHP/goprophp.php SHOOT
  6. Download and extract the GoProPHP tool into your home directory (or clone the project using git).
  7. You can test that everything is set up correctly by putting the GoPro into burst mode:
    php /home/pi/GoProPHP/goprophp.php BURST
  8. To launch motion when the Pi is powered on add motion after the comments (before the final exit 0 line) in /etc/rc.local
  9. Plug in the webcam and restart the Raspberry Pi. If all goes well you have your motion-detecting camera working.
The /etc/motion/motion.conf file has numerous options you can use to customise the behaviour of the system. Video can be recorded using the START and STOP commands of GoProPHP together with the on_event_start and on_event_end lines in motion.conf (see the sketch below).
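
For example, a minimal sketch of the relevant lines (assuming GoProPHP was extracted to /home/pi/GoProPHP as above - check the motion documentation for the exact option names in your version):

    # start recording video when a motion event begins
    on_event_start         php /home/pi/GoProPHP/goprophp.php START
    # stop recording when the motion event ends
    on_event_end           php /home/pi/GoProPHP/goprophp.php STOP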

Here is the first photo of Phoenix taken using the setup illustrated above; there is obviously potential for better lighting, but it illustrates how the two-camera solution may be useful.


Monday, 28 July 2014

Getting the seewave R package installed on Ubuntu

On Linux installing seewave requires the installation of a few additional libraries. Assuming you already have R installed you will need to run the following from the system Terminal:

sudo apt-get install libfftw3-3 libfftw3-dev libsndfile1 libsndfile1-dev r-cran-rgl bwidget

The current installation instructions for seewave do not include the installation of the BWidget package, which causes an error when you try to load the package in R: [tcl] can't find package BWidget. You may then proceed to install the seewave package within R:

install.packages("tuneR, "seewave");

Sunday, 29 June 2014

Lyme Regis Fossil Festival 2013: Ammonites Video


Recorded by John Cummings as part of the Wikimedian-in-Residence programme at the Natural History Museum (May 2013) featuring Paddy Howe of the Lyme Regis Museum.

Experimenting underwater with a GoPro

Having moved the book writing endeavour to Lyme Regis for a couple of weeks to spend some time with Rikey from Alice's Teddy Bear Shop and Paddy from The Fossil Workshop, it seemed a good time to put the new toy, a GoPro Hero 3+ Black Edition, to the test underwater.

Here are a couple of videos of rock pool life, with thanks to Leon (Rikey and Paddy's son) for assistant cameraman duties.




Rounding off the sea-life theme, I visited the Charmouth Heritage Coast Centre to meet Phil, a friend who used to work at the Natural History Museum, and to record some footage of a cuttlefish the centre had raised from eggs.


The scraping noise and final tipping of the camera were caused by an Edible Crab investigating, and then burrowing under the camera as you can see below.

Friday, 6 June 2014

Potential problem with Scrivener and Dropbox

"Unexpected tag  on line 1.  Expected tag ScrivenerProject" @ 1 0


On the Linux version of Scrivener, the 'conflicted copy' versions of the *.scriv files that Dropbox creates in the project folder may prevent Scrivener from opening that project. The only indication of an issue is in the Terminal (a good bet for debugging Linux GUI apps is to launch them from the Terminal).

The problem is easily fixed by removing the extraneous .scriv files - just make sure you keep the most recent version.

Prevention is as simple as not having the same Scrivener project open on two computers at the same time.

Saturday, 24 May 2014

Arduino for Biologists (I'm writing a book)

Here's a description:

Technology has become crucial to biological science, from sensor networks to satellite imagery. These technologies are often expensive and difficult to modify, even if they do suit your specific needs. In contrast, multi-purpose platforms, coupled with open-source software, increasingly offer the biologist a way to set up and use off-the-shelf components and code to create customised but affordable biological recording systems. The Arduino is a hugely popular, low-cost platform that allows you to create your own technological solutions to problems quickly and easily with little prior electronics or programming knowledge.
The Arduino system comprises a development board containing a small micro-controller that can be reprogrammed via the Arduino desktop software and a USB cable. The system is easily expanded to add Internet connectivity, data recording tools and your own custom electronics.
This book explains everything you need to know about Arduino and how to use it. It starts with an introduction to practical electronics and the Arduino platform, and moves on through a series of projects solving specific biological problems. Each of these projects can be used as described, or modified for your specific needs. In addition, information is provided on how to share and publish your own designs and integrate them with the global infrastructures that support biodiversity informatics.
This book is for anybody who wants to design and build their own technology-based research equipment, whether it is for use in the laboratory or the field. It is of particular relevance to those who wish to develop their own sensor networks, whether they are professional ecologists or citizen scientists.
Ed Baker's work focusses on acoustic sensor networks for monitoring singing insects, remote environmental monitoring, and how digital tools can help the process of science from data collection to publication.
There's a page here where I'll be putting links to relevant things: Arduino for Biologists.

Any suggestions/ideas/comments/requests welcomed!

Saturday, 17 May 2014

Living ammonites: recreating a Jurassic sea


Last year Paddy from Lyme Regis Museum (well worth a visit) took over the window of Alice's Bear Shop (well worth a visit) for Lyme Regis Fossil Festival. Paddy also runs the Fossil Workshop below the bear shop (you guessed it - well worth a visit).

Seeing this re-ignited a much talked about idea among the Buckland Club of doing something to recreate a Jurassic sea. The opportunity presented itself with Adrian, Maggie and Helena from the NHM bringing the underwater rover REX to this year's Fossil Festival (along with a replica ammonite and belemnite).

Maggie and I preparing the replicas for the sea

After attaching the creatures to some large weights Adrian ventured out to return them to the sea.


After waiting for the next day's morning high tide we launched REX and went to see if we could find the 'living' animals.

REX the remotely operated vehicle

Launching REX off of the Cobb at Lyme Regis

REX finds the 'living' ammonite. Now that's a Nature paper, surely?

Wednesday, 14 May 2014

BDJ Paper on Encyclopedia of Life

A new paper where I acted as editor in the Biodiversity Data Journal: The Encyclopedia of Life v2: Providing Global Access to Knowledge About Life on Earth.

Abstract:
The Encyclopedia of Life (EOL, http://eol.org) aims to provide unprecedented global access to a broad range of information about life on Earth. It currently contains 3.5 million distinct pages for taxa and provides content for 1.3 million of those pages. The content is primarily contributed by EOL content partners (providers) that have a more limited geographic, taxonomic or topical scope. EOL aggregates these data and automatically integrates them based on associated scientific names and other classification information. EOL also provides interfaces for curation and direct content addition. All materials in EOL are either in the public domain or licensed under a Creative Commons license. In addition to the web interface, EOL is also accessible through an Application Programming Interface.
In this paper, we review recent developments added for Version 2 of the web site and subsequent releases through Version 2.2, which have made EOL more engaging, personal, accessible and internationalizable. We outline the core features and technical architecture of the system. We summarize milestones achieved so far by EOL to present results of the current system implementation and establish benchmarks upon which to judge future improvements.
We have shown that it is possible to successfully integrate large amounts of descriptive biodiversity data from diverse sources into a robust, standards-based, dynamic, and scalable infrastructure. Increasing global participation and the emergence of EOL-powered applications demonstrate that EOL is becoming a significant resource for anyone interested in biological diversity.

PDF can be downloaded here: The Encyclopedia of Life v2: Providing Global Access to Knowledge About Life on Earth.

Tuesday, 11 March 2014

Open source data logger - towards an Open tool chain for biodiversity science

This is a paper I published recently as the first Open Hardware publication in the (still new) Biodiversity Data Journal: Open source data logger for low-cost environmental monitoring.

The device described is a basic environmental (temperature, humidity) data logger with Internet connectivity and also SD card storage. It could easily be run alongside traditional sampling methods such as malaise or pitfall traps, or combined with camera trapping to provide abiotic data for use alongside specimen/observation datasets.

Using devices such as this we can build a chain of Open tools that manage biodiversity data from the point of collection. For example, by combining this project with an open source camera trapping project (coming soon), we could collect observational and corresponding environmental datasets using Open Hardware (apart from the camera in this example), automatically upload these data to a Virtual Research Environment such as Scratchpads (Open Source), where species could be identified online by experts around the world and the raw dataset curated. A manuscript could be collaboratively authored either in the Scratchpad or the Pensoft Writing Tool and published as an Open Access data paper in the Biodiversity Data Journal. Over time I hope that other tools may be switched into or out of this tool chain as appropriate - but progress is being made.


Tuesday, 28 January 2014

Uploading Arduino sketch from Raspberry Pi (command line)

First of all we will need to download some packages (instructions assume the use of Raspbian) to allow us to compile Arduino compatible code (arduino-mk) and upload it (avrdude) to the Arduino.

sudo apt-get install arduino-core arduino-mk

Then add yourself to the dialout group to allow you to upload code to the Arduino board:

sudo usermod -a -G dialout pi (Change pi to be your username if different)

Logout and then log back in again to ensure you are recognised as a member of the dialout group.
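
Once logged back in you can confirm the change took effect:

groups   # 'dialout' should now appear in the list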

To compile an Arduino program it is necessary to create a file called Makefile that is used to specify parameters for the compilation.

touch Makefile

Open the file for editing:

nano Makefile

And paste in the following text:

ARDUINO_DIR = /usr/share/arduino
BOARD_TAG    = uno
ARDUINO_PORT = /dev/ttyACM*
ARDUINO_LIBS =
include /usr/share/arduino/Arduino.mk


You may need to change the BOARD_TAG to reflect the version of the Arduino you are using.

You can test that your sketch compiles by running make and ensuring that there are no errors.

To upload your compiled sketch to the Arduino:

make upload

This process allows for programming the Arduino without using a graphical user interface, and also allows updates to Arduino code to be propagated to devices using scripts (assuming the Arduino is connected to the Pi via USB).
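
As a sketch of that scripted-update idea (the repository directory here is hypothetical), a script along these lines could be run over SSH or from cron to push the latest sketch to the attached board:

#!/bin/bash
# update-arduino.sh - pull the latest sketch and flash the attached Arduino
set -e
cd /home/pi/my-sketch    # hypothetical directory containing the sketch and Makefile
git pull                 # fetch the latest version of the code
make                     # compile the sketch
make upload              # flash it to the Arduino over USB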

Monday, 27 January 2014

Running GUI applications from another computer on Raspberry Pi (or vice versa)

The Raspberry Pi is a great piece of kit; however, it's not particularly suitable for CPU- or memory-hungry applications (e.g. Firefox). There may also be times when you want to run an application you have on another computer on your Pi without having to install it.

As long as you are able to SSH onto a computer where that application is installed (and it can run the X Window System - Linux will almost certainly do this out of the box), you can use a feature of the X Window System to run the application on the remote computer but route the graphical window to your Pi, where it will behave - pretty much - as a standard graphical program.

To do this you must first be in a graphical environment on your Pi; for Raspbian this is started from the command line by typing startx. From here start the Terminal emulator, and log in to the remote computer over SSH using the -X switch:

ssh  -X username@192.168.0.2

Replace username with your username (default on Raspbian is pi) and 192.168.0.2 with the IP address of the remote computer.

You should now have what looks like a standard SSH terminal to the remote computer, however if you run a graphical program on the remote system the output will be displayed on the Pi. If you are logging into an Ubuntu/Debian or otherwise Gnome-based computer try running gedit as a quick test.

This also works in reverse, so you can run graphical programs on the Pi and have them display on your Linux desktop.

Running the Midori browser on my Raspberry Pi with the output redirected to an Ubuntu desktop using 'ssh -X'

Saturday, 25 January 2014

Using public/private keys to log in to Raspbian

This assumes working from a *nix environment, and that you haven’t previously created a public/private key pair (e.g. for using Github). If you are unsure type cat ~/.ssh/id_rsa into the terminal - if you get a load of characters ending in "-----END RSA PRIVATE KEY-----" you already have a key that you can use - go straight to Step 2.

Step 1: Generating a public/private key pair
This is as easy as:

ssh-keygen -t rsa -C "name@email.com"

(Obviously replace name@email.com with your own email address).

The system may ask you whether you want to create a passphrase for this key; if you do, you will be asked to enter this passphrase the first time you use the key in any given session. If you do not want to set a passphrase just hit Enter.

Step 2: Getting your key to the Raspberry Pi
This is again a simple one line command that logs into Raspbian using a password and adds your key to the list of authorised keys on the system.


cat ~/.ssh/id_rsa.pub | ssh pi@10.0.2.2 "mkdir .ssh; touch .ssh/authorized_keys; cat >> .ssh/authorized_keys"

You will need to change 10.0.2.2 to the IP address used by your Raspberry Pi. You will be prompted for the password of the 'pi' user on your Raspberry Pi. If you then use the exit command to leave the Pi and return to your *nix system you should now be able to log on simply by typing:

ssh pi@10.0.2.2

If you used a passphrase to protect your keys you will be prompted to enter this the first time you use the key in any session.
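
If the Pi still prompts for a password, the most likely cause is permissions: OpenSSH will ignore the authorized_keys file if it, or the .ssh directory, is too widely writable. Tightening the permissions is harmless and usually fixes it:

ssh pi@10.0.2.2 "chmod 700 ~/.ssh; chmod 600 ~/.ssh/authorized_keys"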

Thursday, 23 January 2014

Citizen Cyber Summit 2014 (Feb 20-22 2014)

An interesting event hosted by the Royal Geographical Society and University College London: Citizen Cyber Summit 2014.

I will be there (although not presenting) along with a number of Natural History Museum colleagues and several people from the ViBRANT Citizen Science Workshop I helped organise about a year ago.

Battle hardening Raspberry Pi 1: the 'Fork Bomb'

I have been thinking recently of using a number of Raspberry Pis as a platform for various biodiversity science projects. This would (possibly) require allowing people external to the project to run code on the devices, so I have been looking into what needs to be done to 'battle harden' the standard Raspbian distribution for use in this, or similar, projects.

Let's just say that the Pis may well end up in a number of inaccessible locations, and it might end up being very difficult to hard reset them. The first thing that comes to mind when letting people run code on such a system is the 'fork bomb': creating an ever-increasing number of processes (accidentally or otherwise) until the system runs out of resources.

Perhaps the best known *NIX fork bomb is the following:

:(){ :|: & };:

This innocuous-looking code defines a function named : that takes no parameters; its body calls the function and pipes the output into another call of the same function, run as a background process ({ :|: & };). The final : runs the function for the first time.

The solution to this is to specify a maximum number of processes in the file /etc/security/limits.conf - adding the following line sets a limit for all users apart from root:

*       hard             nproc           1024

On my 512MB Raspberry Pi this easily prevents the fork bomb from causing havoc - although in most circumstances a lower limit (e.g. 512) is likely to be sufficient.
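
After logging out and back in you can confirm the limit applies to your session (and, if you are feeling brave, re-run the fork bomb to check it is now contained):

ulimit -u   # maximum number of processes available to this shell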

Monday, 13 January 2014

Darwin Core Archives

In a presentation I gave not so long ago (Building Highways in the Informatics Landscape) I suggested that Darwin Core Archive (DwC-A) was the lingua franca of biodiversity informatics - a position that I still stand by. However, it's a lingua franca with different dialects, and implementation is not quite as simple as it perhaps could (should?) be. In a recent paper (Linking multiple biodiversity informatics platforms with Darwin Core Archives) in the new Biodiversity Data Journal I, along with Simon Rycroft and Vince Smith, set out some of the challenges in making a 'several dialects' DwC-A that satisfies the needs of all current DwC-A consumers of the Scratchpad project.
