idfg-bthomas's blog

What is an observation?

We define an observation as any identification of a taxon at a point in time and place, whether it's a trailcam picture, an animal mark-recapture event, a waterbird survey or an incidental sighting.  Some are very complex, including hundreds of attributes; others are as simple as a historical observation of Bison in Cassia County in the 1880s.

To be able to query everything we collect regardless of survey and protocol differences, we standardize every sighting into a core set of fields.  We call this our Core Observation Entity.

Observation Core Entity

  • Species
    • Species Confidence (0-100% with labels for certain thresholds)
    • Count (Integer)
    • Count Type (absolute, estimate, minimum)
    • Sex (unknown, male, female, both, not applicable)
    • Life State (unknown, alive, dead)
    • Life Stage (immature, yearling, nymph, flowering, dormant….)
  • Date Time (from – to range with timezone)
    • Date Confidence (In seconds with labels for certain thresholds)
  • Location (geojson… lat/lng also accepted)
    • Location Confidence (In meters with labels for certain thresholds)
    • Location Resource (text, how location was derived… GPS, Google Maps Click, Lat/Lng Entered)  {may be dropped}
  • Observer(s)
    • Full Name (some part of First or Last name required)
    • Occupation
    • Employer
    • Phone
    • Email
    • Physical address
    • Background (Long Text)
    • Observer Confidence (0-100% with labels for certain thresholds)
  • Observation Methods (seen, heard, scat)
  • Reporter (username)
  • Report Date (timestamp)
The following are also auto-calculated for all observations:
  • County
  • GMU
  • Region
  • HUC6 (proposed)
  • 5km Grid (proposed)
  • 6km Hex (proposed)
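Put together, a single core observation can be sketched as a plain record.  This is only an illustration; the field names below are not our actual column names:

```python
# A minimal sketch of one core observation record.
# Field names here are illustrative; they are not the production schema.
observation = {
    "species": "Bison bison",
    "species_confidence": 100,        # 0-100%
    "count": 3,
    "count_type": "minimum",          # absolute, estimate or minimum
    "sex": "unknown",                 # unknown, male, female, both, not applicable
    "life_state": "alive",            # unknown, alive, dead
    "datetime_range": ("1880-06-01T00:00:00-07:00", "1880-08-31T23:59:59-07:00"),
    "location": {"type": "Point", "coordinates": [-113.6, 42.2]},  # GeoJSON
    "location_confidence_m": 5000,    # meters
    "observers": [{"full_name": "J. Doe"}],
    "observation_method": "seen",     # seen, heard, scat
    "reporter": "bthomas",            # username
}
```

Values such as county, GMU and region are then derived from the location rather than entered by the observer.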

Common Optional Components

  • Site
  • Survey
  • Project
  • Field Methods (under construction)


You can see these core observations, their default settings and their user interfaces on our bulk import form (login required) or the basic observations form.

New Observations and Survey Upload Form

Do you collect lots of data?  Not about to sit down and upload your observations one by one?

We don't blame you!

Since our start, we've always encouraged our partners and collaborators to share their data with us, but there hasn't been an easy way to do so online. 

We've just changed that with our new Observations Import form.  While not fully automated, the form lets you attach your spreadsheets, databases or shapefiles and configure some sensible defaults.  We'll take it from there, and you'll be able to track the status of your import, follow a processing log and answer any questions about observation details we come across.

You'll find the form here:

*Note:  If you have incidental and/or one-time survey data yet to be compiled into a spreadsheet, you also have the option of populating our MS Excel template which includes our core database values.  Click here to download a spreadsheet with built-in drop-down lists.  Click here to download a copy of the same spreadsheet, but without the built-in drop-down lists.


Creating observations using the Species API

Our Species Platform is designed not just to be a web page, but also an Application Programming Interface (API) that allows other websites and programs to read, create and edit observations.

Creating an Observation requires having an existing Fish and Game Account, logging in successfully, and finally creating the new observation.  For this example I'll be using the Firefox Poster extension.  Let's get started!

Step 1 - Login

First we need to authenticate against Fish and Game Accounts and receive a token to include in our future interactions with the REST API.  We can do both steps at once by posting to the login form with a return path set to receive our token:

Set the content type:


And send a standard HTML form payload with the username, password and rememberMe flag:



Post this and in the response you'll receive a string that we'll use as a token for all future requests.

Step 2 - Create Observation

Now we need to construct the request to create our observation.

Let's start by adding the token.

Add Token to Header

This token is only for session validation, not authentication.  Forms-based cookies are still needed for authentication, so if you're building a stand-alone iOS or Android application you'll also need to add the cookie to the request.


Now we need to build the payload for the observation.  We can do this as an HTML form, XML or JSON object.  For our example we'll use JSON.

Post to:

Content Type:


Finding the correct content for the post is a little tricky.  It takes some familiarity with the data and with our API, which is built using Drupal Services.  The documentation will get you started and the examples are useful, but ultimately we'll break this down line by line in a future post.

In this example we have standard inputs (field_count) with one pattern, select lists and radios (field_count_type) with a slightly different pattern, dates (field_datetime) doing something different still, files (field_file) weirder still, a geofield (field_location) with escaped GeoJSON, and node references (field_species) which include the primary key of the referenced content.  The good part is that once you understand these oddities, the same oddities apply to creating any other data stored in Species.


    "type": "observation",
    "field_count": {
        "und": [
                "value": "1"
    "field_count_type": {
        "und": [
    "field_life_stage": {
        "und": [
    "field_life_state": {
        "und": [
    "field_location": {
        "und": [
                "geom": "{ \"type\": \"Point\", \"coordinates\":  [-112.0, 43.5] }"
    "field_location_resource": {
        "und": [
                "value": "Google Maps Click"
    "field_location_use": {
        "und": [
    "field_observation_method": {
        "und": [
    "field_sex": {
        "und": [
    "field_species": {
        "und": [
                "nid": "[nid:80612]"
    "field_species_confidence": {
        "und": [
            "value": "100"
     "field_datetime": {
        "und": [
                "value": {
                   "date": "07/16/2013",
                   "time": "11:14am"
    "field_file": {
        "und": [
                "fid": ""
    "field_photo": {
        "und": [
                "fid": ""
If the post is successful you'll receive a JSON response with the id of the new observation:


You can browse to this location to retrieve an XML version of the new observation:

Append .json to view it as JSON:

Or remove rest from the URL to view and edit it as HTML:

This is the first in a series of posts on creating, reading and manipulating content in Species through the API.  We'll add links to future posts here, or follow the speciesapi tag.

Introducing Ask Fish & Game - An online Q&A for F&G

Ask Fish & Game is a new feature on the IDFG website, designed to handle the questions we receive from the public.



This new feature is designed to:

  • make answers easier for the public and our staff to find by establishing a central, searchable Q&A repository
  • increase the visibility of recent/popular questions by embedding questions contextually across our website (view a live example in the sidebar of the Wildlife Summit)
  • establish a more efficient, trackable workflow, ensuring that questions that are asked are answered

Video Screencast Overviews

Ask Fish & Game - An Overview

A review of the new questions and answers functionality added to the Idaho Fish & Game website. The "Ask Fish & Game" section allows anyone to ask IDFG a question, and the answer may be posted for others to see. Questions may be viewed by category or tag, and these lists may be embedded contextually on the IDFG website or any other website.


Ask Fish & Game - Management Interface

How to answer, assign and edit questions received at Ask Fish & Game. Designed for IDFG staff, this video reviews what to expect and how to use the management features.

Over the next few weeks we will be involving more IDFG staff in answering questions.  When a question is assigned to you, you'll receive an email with the subject “[Ask Fish & Game] A Question has been assigned to you” and links to view and answer the question.

Launching a copy of a Drupal Website locally using Acquia Dev Desktop (DAMP)

The Acquia Dev Desktop enables you to test a Drupal installation in a completely self-contained local stack (the "DAMP" stack: Desktop Apache, MySQL and PHP).  It is great for building and troubleshooting new features before deploying them.  You will need a copy of a Drupal codebase and a SQL dump file of the corresponding database to follow these steps.

  1. Download and Install Acquia Dev Desktop
    Full installation instructions - works on Windows and Mac
  2. Copy an existing Drupal Codebase
    Extract the codebase to a folder on your machine. 
    Default location: C:\Users\{username}\Sites\{drupal installation name}
  3. Install Database and Reference Website
    Open Acquia Dev Desktop and choose More... from the sites dropdown

    In the Settings dialog that opens, click Import...

    In the Import site dialog, point the Site path to your codebase, point the Dump file to the SQL dump, and provide a new database name and site name.
    Click Import and wait a few seconds to a few minutes, depending on the size of the SQL dump.
  4. Launch your copy of the website
    On the Acquia Dev Desktop, open the dropdown below Go to my site, select your site, and click Go to my site.

Upload Limits in Drupal/IIS

Modifying your php.ini settings to the following will allow increased file upload limits:

  • upload_max_filesize = 50M
  • post_max_size = 50M

However, if you are using Internet Information Services (IIS), also check that there isn't a requestLimits element set in web.config. Drupal will show the larger size allowed via php.ini, but large files won't upload.

Here is the setting to change; on my system it defaulted to 10 MB:

<?xml version="1.0" encoding="UTF-8"?>
            <requestLimits maxAllowedContentLength="10485760" />

I changed maxAllowedContentLength to 52428800 and 50 MB uploads worked immediately.
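Both numbers are just megabyte counts written out in bytes, which is easy to verify:

```python
# maxAllowedContentLength is specified in bytes.
default_limit = 10 * 1024 * 1024   # 10 MB, the 10485760 default above
new_limit = 50 * 1024 * 1024       # 50 MB, matching the 50M php.ini values

print(default_limit)  # 10485760
print(new_limit)      # 52428800
```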

Calculating Extents in the Field Calculator

Here's a little shortcut.  You can calculate extents in the Field Calculator dialog using:

Dim Output As Double
Dim pGeom As IGeometry

Set pGeom = [shape]
Output = pGeom.Envelope.XMin

Choose Advanced, paste the code above, and set your field equal to Output.

NOTE: If you want to add the values in Decimal Degrees or Web Mercator there's an app for that.  Ask Tim Williams how to install his Calculate Extent button.

Serving pptx, docx and xlsx on Apache webservers to IE 6, 7 & 8

Apache web servers running on certain versions of Linux may serve the newer Microsoft document formats incorrectly as zip files in IE 6, 7 & 8.

Chrome, Firefox and Internet Explorer 9 (even when running in IE7 and IE8 testing modes) have no problem downloading the files.

The solution is to add the following to your .htaccess file to correct the MIME types:

AddType application/vnd.ms-word.document.macroEnabled.12 docm
AddType application/vnd.openxmlformats-officedocument.wordprocessingml.document docx
AddType application/vnd.openxmlformats-officedocument.wordprocessingml.template dotx
AddType application/vnd.ms-powerpoint.template.macroEnabled.12 potm
AddType application/vnd.openxmlformats-officedocument.presentationml.template potx
AddType application/vnd.ms-powerpoint.addin.macroEnabled.12 ppam
AddType application/vnd.ms-powerpoint.slideshow.macroEnabled.12 ppsm
AddType application/vnd.openxmlformats-officedocument.presentationml.slideshow ppsx
AddType application/vnd.ms-powerpoint.presentation.macroEnabled.12 pptm
AddType application/vnd.openxmlformats-officedocument.presentationml.presentation pptx
AddType application/vnd.ms-excel.addin.macroEnabled.12 xlam
AddType application/vnd.ms-excel.sheet.binary.macroEnabled.12 xlsb
AddType application/vnd.ms-excel.sheet.macroEnabled.12 xlsm
AddType application/vnd.openxmlformats-officedocument.spreadsheetml.sheet xlsx
AddType application/vnd.ms-excel.template.macroEnabled.12 xltm
AddType application/vnd.openxmlformats-officedocument.spreadsheetml.template xltx


Simple mySQL Backups on Windows

Quick and easy backups from MySQL on Windows, courtesy of:

The article has everything you need, just archiving the main points here for posterity:

Create mysql-backup.bat:

@echo off
echo Running dump...
"C:\Program Files\MySQL\MySQL Server 5.1\bin\mysqldump" -u[username] -p[password] --result-file="D:\mySQLbackup\content-backup.%DATE:~0,3%.sql" [database]
echo Done!

Add a scheduled task to run it nightly; the script above creates one file for each day of the week.
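The one-file-per-weekday trick comes from %DATE:~0,3%, which slices the first three characters of the DATE environment variable; on a US-locale machine that is the weekday abbreviation. The same slice expressed in Python:

```python
# %DATE% on a US-locale Windows machine looks like "Mon 10/03/2011".
# %DATE:~0,3% takes 3 characters starting at offset 0.
date_value = "Mon 10/03/2011"
weekday = date_value[0:3]
print(weekday)  # Mon -> the dump becomes content-backup.Mon.sql
```

Each weekday's backup therefore overwrites the one from a week earlier, keeping a rolling set of seven files.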

Email your backup administrator to add D:\mySQLbackup to their tapes.

Air Conditioner Failure - Servers Down Oct. 3, 2011 6:40AM - 11:20AM MST (-0600 GMT)

Air conditioners in Idaho Fish and Game's Server Room failed late Sunday resulting in shutdown of the Fish and Game servers and network at 6:40AM on Monday October 3.

The network and all servers are now back online.  If you experience any issues with our applications please let us know.

We apologize for the outage and any inconvenience this may have caused. 

Funding is short. If you've come to rely on our services and would like to contribute towards a redundant system/cloud hosting to avoid outages like this in the future we're all ears.