The objective of this overview is to give a brief insight into the work being done on Zooniverse by three different nature-focused projects. Two are attempting to track wildlife populations in Australia and Wisconsin, and one is researching human evolution through the study of chimpanzees in the wild. I will begin with a quick overview of the projects, then outline the process I undertook, move on to what I learned from the projects, and finish by considering whether I could use the crowdsourcing platform Zooniverse for my own work.
Overview of the Projects & Potential Implications of Contributed Work
Despite my best efforts, it proved very difficult to find a crowdsourced project relating to my minor (Economics). As such, I decided to find and contribute to a project that I found interesting. I came across two very similar projects at the beginning, and in this overview I will draw parallels between these two and a third, also nature-based, project from Zooniverse. The first two very similar projects are Snapshot Wisconsin and Western Shield.
Snapshot Wisconsin is a project that tries to understand wildlife populations in Wisconsin through trail-camera monitoring. This government initiative gives trail cameras to volunteers and asks them to retrieve the SD card data at least four times a year, to be uploaded to Zooniverse. The Zooniverse project then asks online volunteers to tag each animal they see in a series of three stills, taken when the camera was triggered by motion and heat detection. The animals are tagged from a large list of Wisconsin wildlife. Using this data, maps can be made of common and rare species across the state, and changing animal populations can also be tracked. The project also asks you to tag the number of animals in the stills and to select from a short list what they may be doing (foraging, moving, etc.).
Western Shield is an Australian project that also asks users to tag animals, this time from a list of Western Australian species. The project was set up by the government to protect small and medium-sized mammals from introduced foxes and cats in Western Australia. Although predators are vital for an ecosystem to thrive, these introduced ones can reach fenced-off areas of land and damage struggling species if not controlled. The project has set up cameras both in areas where fox or cat culls are underway and in areas without any such measures (controls), to compare animal numbers in each. In this project you can mark the number of animals, but not their actions.
The third project I looked at was Chimp & See. This ambitious project aims to find out more about the origins of humanity from the behaviour of chimpanzees captured by camera traps in 15 African countries. The idea behind the project is that the major leap in Homo sapiens' evolution was the use of tools, hunting and meat-eating. Capturing these behaviours in chimps will, it is hoped, give some idea of the events that caused such a leap in behaviour to occur. This project differs from the two I have already written about in three simple but important ways:

1. You are given 15-second video clips of the camera-trap footage rather than three still images to flick through (though you can choose to see nine still images if you wish).
2. The action selectors are much more precise: tool-using, mating, fighting, etc.
3. The project was more interactive and educational than the other two, which made working on it less of a chore.
First, I signed up to Zooniverse and came across Snapshot Wisconsin. SW grabbed my attention because its wildlife is not too dissimilar to our own Irish wildlife. I set about tagging animals from my sets of still images and quickly found that there is an abundance of deer in Wisconsin. If you don't think it's a deer, then look closer: it's still a deer. The team at SW apparently knew this when designing the program, because they only show the action list I previously mentioned when you click on deer. A badger, a fox and a feral pig also came onto the scene for their share of the limelight, but deer rule the program.

This is when I decided to try the other project with a very similar specification, to see if there was more to be found. I opened up Western Shield and was struck by how underdeveloped the application seemed in comparison to the first project. There were fewer animals to choose from, with broader groupings ('rodents', 'reptile or frog' rather than specific animals) and uncapitalised names for the animals (a small difference, but one that made a significant aesthetic change). That said, there was more variety in the stills I was given to mark: kangaroos, possums and quokkas were very interesting to see captured on camera. As I previously mentioned, these two projects had very similar interaction processes. While Zooniverse is a powerful tool, the similarity between all of these projects might betray the limitations of its functionality.
The third tool had a more interesting workflow. You were given a field guide from the outset so that you could make educated guesses about the species, behaviour, age and sex of the animal you were identifying. You could also ID and name chimps after identifying them, which is a nice addition, though it is quite difficult to do most of the time because of the poor quality of the cameras. These projects were a fun experience overall, but there were a few issues that I will elaborate on in the next section.
What did I learn and what could have been better?
The projects were good at drawing attention to the issues they were trying to address: to take part in a project you have to acquaint yourself with its aims. As an educational tool, I found Zooniverse quite useful; I could see the applications being popular in classrooms, for instance. However, the work itself was simply not engaging. A lot of the images had nothing in them, and had been taken because a change in light had triggered the motion capture, or a leaf had blown across the screen. This resulted in a lot of drudgery, and if the work isn't very stimulating, how can the publishers expect to receive large numbers of responses? This ties in with the second issue, a more theoretical than concrete one. The developers state that your work will be cross-referenced with the work of other respondents; thus you are encouraged to 'give every image your best guess', even if you may be shooting in the dark. If the research teams are relying on the power of the crowd for their results, they should be worried that there is so little interaction on the application from unique users (most of the time, the home screen read '0 people are talking about this' for all projects). I learned that in building a Zooniverse tool, I would be careful to make it as engaging as possible (by making people feel as though they are part of something important, as Chimp & See does), and that if you need a high volume of respondents, more emphasis should be placed on the ongoing nature of the project and some marketing should be employed.
Could I use this for my own work?
The brief for this assignment asked that we incorporate our minor into the work we do. Despite my searching, I could not find anything at all (on Zooniverse or otherwise) related to economics that the masses can contribute to. Perhaps this is because economics is still shrouded in a frumpy and scholarly cloak, and people simply wouldn't want to contribute their knowledge or data to such projects. With the advance of popular economics, though, I doubt this is the reason. If economists are looking for datasets to test hypotheses against, then surely we can hope to see crowdsourcing in some form other than the census at some point. This again raises issues of exposure: wouldn't it be a certain type of person answering these crowdsourcing calls from the economists? An unbiased conclusion can't be drawn from datasets answered by only a certain type of person. Therefore I think that unless you count national censuses and datasets retrieved from larger repositories as crowdsourcing (which I don't think many will), the larger aspects of unbiased economics will have to stay away from crowdsourcing initiatives for the moment.
I could certainly use Zooniverse for other work I may do; I could see it being useful for labour-intensive tasks like transcription and document tagging. If the content is engaging enough, Zooniverse is a great tool for exercises like these. Overall, I certainly see the benefits of crowdsourcing in general, and I now believe Zooniverse to be a powerful tool for researchers.