Blogs

Windfarm: USGS Releases Interactive Map of U.S. Wind Turbines

From the USGS website: "To remedy the lack of information, the USGS created this publicly available national dataset and interactive mapping application of wind turbines.  This dataset is built with publicly available data, as well as searching for and identifying individual wind turbines using satellite imagery. The locations of all wind turbines, including the publicly available datasets, were visually verified with high-resolution remote imagery to within plus or minus 10 meters."

Screenshot of southern Idaho in the windFarm application:

For more information: http://www.usgs.gov/blogs/features/usgs_top_story/mapping-the-nations-wind-turbines/

windFarm Application: http://eerscmap.usgs.gov/windfarm/

Updated Idaho Topo Maps Available & Using the TerraGo Toolbar

The Idaho 2013/2014 series of US Topo maps has been completed. The latest series has improvements over the 2011 series, including Forest Service roads and trails, structures, improved NHD, PLSS, and more. The image base for the new maps is the 2013 NAIP. GeoPDF maps can be downloaded FREE of charge at the USGS Map Store.

A GeoPDF is "an extension to Adobe's PDF 1.3 and higher versions enabling GIS functionality within standard PDF files." For more information about GeoPDFs, click here. In these GeoPDFs you can toggle the layers (geographic names, boundaries, transportation, hydrography, contours, imagery) on/off. You can also use the Search tool to look for specific named places on the map.

You can also download the FREE TerraGo Toolbar to access geospatial maps and imagery, measure distance and area, display/find coordinates, add "GeoStamps", and capture GPS information. For more information about TerraGo and to watch training videos, visit their website http://www.terragotech.com/products/field-data-collection/terrago-toolbar.

Here is an example of a GeoPDF with the TerraGo toolbar add-in. You can toggle map layers on/off in the left-hand "Layers" table of contents. The TerraGo toolbars can be enabled through the main menu or by going to View > Extended > TerraGo GeoPDF or TerraGo GeoMark.

What is an observation?

We define an observation as any identification of a taxon at a point in time and place, whether it is a trail-cam picture, an animal mark-recapture, a waterbird survey, or an incidental sighting. Some are very complex, including hundreds of attributes; others may be as simple as a historical observation of bison in Cassia County in the 1880s.

In order to query all of what we collect regardless of survey and protocol differences, it is necessary to standardize every sighting into a core set of fields. We call this our Core Observation Entity; its fields are listed below, followed by a brief example sketch.

Observation Core Entity

  • Species
    • Species Confidence (0-100% with labels for certain thresholds)
    • Count (Integer)
    • Count Type (absolute, estimate, minimum)
    • Sex (unknown, male, female, both, not applicable)
    • Life State (unknown, alive, dead)
    • Life Stage (immature, yearling, nymph, flowering, dormant….)
  • Date Time (from – to range with timezone)
    • Date Confidence (In seconds with labels for certain thresholds)
  • Location (geojson… lat/lng also accepted)
    • Location Confidence (In meters with labels for certain thresholds)
    • Location Resource (text, how location was derived… GPS, Google Maps Click, Lat/Lng Entered)  {may be dropped}
  • Observer(s)
    • Full Name (some part of First or Last name required)
    • Occupation
    • Employer
    • Phone
    • Email
    • Physical address
    • Background (Long Text)
    • Observer Confidence (0-100% with labels for certain thresholds)
  • Observation Methods (seen, heard, scat)
  • Reporter (username)
  • Report Date (timestamp)
The following are also auto-calculated for all observations:
  • County
  • GMU
  • Region
  • HUC6 (proposed)
  • 5km Grid (proposed)
  • 6km Hex (proposed)
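
For illustration only, a single incidental sighting expressed with these core fields might look roughly like the following Python sketch (the field names and values are illustrative, not the actual database schema):

# Illustrative sketch only -- one incidental sighting expressed with the core fields
core_observation = {
    "species": "Bison bison",
    "species_confidence": 90,                  # percent
    "count": 3,
    "count_type": "estimate",
    "sex": "unknown",
    "life_state": "alive",
    "datetime": {"from": "1880-06-01", "to": "1880-08-31", "timezone": "America/Boise"},
    "date_confidence": 7776000,                # seconds (roughly 90 days)
    "location": {"type": "Point", "coordinates": [-113.6, 42.3]},  # GeoJSON point
    "location_confidence": 5000,               # meters
    "observers": [{"full_name": "J. Smith", "observer_confidence": 75}],
    "observation_methods": ["seen"],
    "reporter": "jsmith",
    "report_date": "2013-07-16T11:14:00-06:00",
}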

Common Optional Components

  • Site
  • Survey
  • Project
  • Field Methods (under construction)

 

You can see these core observations and their default settings and user interfaces on our bulk import form https://fishandgame.idaho.gov/species/observations/import (login required) or the basic observations form https://fishandgame.idaho.gov/species/observations/add.

New Observations and Survey Upload Form

Do you collect lots of data?  Not about to sit down and upload your observations one by one?

We don't blame you!

Since our start, we've always encouraged our partners and collaborators to share their data with us, but there hasn't been an easy way to do so online. 

We've just changed that with our new Observations Import form. While not fully automated, this form allows you to attach your spreadsheets, databases, or shapefiles and configure some sensible defaults. We'll take it from there, and you'll be able to track the status of your import, follow a log of processing, and answer questions about observation details we come across.

You'll find the form here:

https://fishandgame.idaho.gov/species/observations/import

*Note: If you have incidental and/or one-time survey data yet to be compiled into a spreadsheet, you also have the option of populating our MS Excel template, which includes our core database values. Click here to download a spreadsheet with built-in drop-down lists. Click here to download a copy of the same spreadsheet, but without the built-in drop-down lists.

 

Idaho State Ranks - in need of Bird Data

We are assessing the state ranks (SRank) for various species to include in the upcoming State Wildlife Action Plan. Below is a list of birds for which we could use additional information. Please feel free to submit your entire dataset if it includes multiple species beyond this list.

Please direct any questions to Suzin Romin, suzin.romin@idfg.idaho.gov, or Nikki Wade, nikki.wade@idfg.idaho.gov.

Submit your data here: https://fishandgame.idaho.gov/species/observations/import (login required).

If you have incidental and/or one-time survey data yet to be compiled into a spreadsheet, you also have the option of populating our MS Excel template, which includes our core database values. Click here to download a spreadsheet with built-in drop-down lists. Click here to download a copy of the same spreadsheet, but without the built-in drop-down lists.

Common Name Sci Name SRank
Pygmy Nuthatch Sitta pygmaea S1
Pinyon Jay Gymnorhinus cyanocephalus S1
White-winged Crossbill Loxia leucoptera S1
Scott's Oriole Icterus parisorum S1B
Forster's Tern Sterna forsteri S1B
Great Egret Ardea alba S1B
Virginia's Warbler Vermivora virginiae S1B
Common Tern Sterna hirundo S1B
Black Swift Cypseloides niger S1B
Northern Mockingbird Mimus polyglottos S1B
Upland Sandpiper Bartramia longicauda S1B
Harlequin Duck Histrionicus histrionicus S1B
Blue Grosbeak Guiraca caerulea S1B
Black Tern Chlidonias niger S1B
Common Loon Gavia immer S1B,S2N
Trumpeter Swan Cygnus buccinator S1B,S2N
Bohemian Waxwing Bombycilla garrulus S1B,S3N
Eurasian Wigeon Anas penelope S1N
Juniper Titmouse Baeolophus ridgwayi S2
Boreal Owl Aegolius funereus S2
White-headed Woodpecker Picoides albolarvatus S2
Three-toed Woodpecker Picoides dorsalis S2
Black-throated Sparrow Amphispiza bilineata S2B
Long-billed Curlew Numenius americanus S2B
Cattle Egret Bubulcus ibis S2B
Yellow-billed Cuckoo Coccyzus americanus S2B
Lesser Goldfinch Carduelis psaltria S2B
Grasshopper Sparrow Ammodramus savannarum S2B
Red-necked Grebe Podiceps grisegena S2B
Lark Bunting Calamospiza melanocorys S2B
Merlin Falco columbarius S2B,S2N
Hooded Merganser Lophodytes cucullatus S2B,S3N
Black-backed Woodpecker Picoides arcticus S3
Northern Saw-whet Owl Aegolius acadicus S3
Northern Pygmy-Owl Glaucidium gnoma S3
Loggerhead Shrike Lanius ludovicianus S3
Northern Goshawk Accipiter gentilis S3
Great Gray Owl Strix nebulosa S3
Black Rosy-finch Leucosticte atrata S3
Lesser Scaup Aythya affinis S3
Swainson's Hawk Buteo swainsoni S3B
Wilson's Phalarope Phalaropus tricolor S3B
Flammulated Owl Otus flammeolus S3B
Sage Thrasher Oreoscoptes montanus S3B
Gray Flycatcher Empidonax wrightii S3B
Gray-crowned Rosy-finch Leucosticte tephrocotis S4B,S3N
Species currently not ranked    
Pacific-slope Flycatcher Empidonax difficilis  
White-tailed Ptarmigan Lagopus leucura SNA
American Golden-Plover Pluvialis dominica SNA
Snowy Plover Charadrius alexandrinus SNA
Broad-winged Hawk Buteo platypterus SNA
American Oystercatcher Haematopus palliatus SNA
Gyrfalcon Falco rusticolus SNA
Yellow Rail Coturnicops noveboracensis SNA
Horned Grebe Podiceps auritus SNA
Green Heron Butorides virescens SNA
White-throated Sparrow Zonotrichia albicollis SNA
Golden-crowned Sparrow Zonotrichia atricapilla SNA
Brown Thrasher Toxostoma rufum SNA
Clay-colored Sparrow Spizella pallida SNA
Swamp Sparrow Melospiza georgiana SNA
Ovenbird Seiurus aurocapillus SNA
Curve-billed Thrasher Toxostoma curvirostre SNA
Hoary Redpoll Carduelis hornemanni SNA
Ancient Murrelet Synthliboramphus antiquus SNA
McCown's Longspur Calcarius mccownii SNA
Anna's Hummingbird Calypte anna SNR

 

Update the data source in an ArcMap project using Python (arcpy.mapping)

Need to update a path in your ArcMap project to reflect the new location?  Here's how...

There are several methods for updating data sources using arcpy.mapping; check out this help topic from Esri. The example listed below is just one of several methods.

Attach the following Python code to a script in the toolbox of the MXD you wish to update. This will allow you to update the directory the data layer is referencing, but will NOT allow you to change the data layer source name itself. To change the data layer source name, use the 'replaceDataSource' method instead.

###################

import arcpy
mapdoc = arcpy.mapping.MapDocument("CURRENT")
mapdoc.findAndReplaceWorkspacePaths("<the directory you are changing from>" , "<the directory you are changing to>")
mapdoc.save()
del mapdoc

###################

TIPS / Notes:

--You could get fancy and set the 'directory you are changing from' and the 'directory you are changing to' as input arguments. Then the script could be used for resetting any project.
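
For example, a minimal sketch of that idea, assuming the code is attached to a script tool with two string parameters:

import arcpy

# Hypothetical script-tool version: the two directories come in as parameters
old_dir = arcpy.GetParameterAsText(0)  # the directory you are changing from
new_dir = arcpy.GetParameterAsText(1)  # the directory you are changing to

mapdoc = arcpy.mapping.MapDocument("CURRENT")
mapdoc.findAndReplaceWorkspacePaths(old_dir, new_dir)
mapdoc.save()
del mapdoc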

--If you don't know the directories your data are referencing, you can loop over the map document's layers and print lyr.dataSource to list the data sources used in the mxd (see the sketch below).
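
A minimal sketch of that listing, assuming the code runs inside the map document:

import arcpy

mapdoc = arcpy.mapping.MapDocument("CURRENT")
# Print the source path of every layer that actually supports a data source
for lyr in arcpy.mapping.ListLayers(mapdoc):
    if lyr.supports("DATASOURCE"):
        print lyr.dataSource
del mapdoc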

--If you know you have broken links in your mxd, here's how to get a list printed to the Python shell window:

import arcpy
mapdoc = arcpy.mapping.MapDocument("CURRENT")
brokenlist = arcpy.mapping.ListBrokenDataSources(mapdoc)
for lyr in brokenlist:
    print lyr.name
del mapdoc

 

 

XY Coordinate Validation Rules for Idaho

The following extents are based upon Hydrologic Units Fourth Code (Hydro_HUC4) projected in NAD83 Idaho Transverse Mercator meters.

 

UTM XY Coordinate Validation Rules for Idaho GIS data per HUC4 intersecting Idaho

Northing:

The value for the Northing field must be an integer between 1,000,000 and 9,999,999.

In Idaho, we can narrow that down further (if we want) to ?

 

Easting:

The value for the Easting field must be an integer between 100,000 and 999,999.

In Idaho, we can narrow that down further (if we want) to ?

 

Latitude and Longitude Coordinate Validation Rules for Idaho GIS data per HUC4 intersecting Idaho

Latitude:

  • North: 49.154108
  • South: 40.584933

Longitude:

  • West: -119.349171
  • East: -109.303506
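
As an illustration, a minimal Python sketch of how these bounds might be applied in a validation check (the constants are the values listed above; the function names are hypothetical):

# Bounding values taken from the HUC4-based extents listed above
IDAHO_LAT_MIN, IDAHO_LAT_MAX = 40.584933, 49.154108
IDAHO_LON_MIN, IDAHO_LON_MAX = -119.349171, -109.303506

def latlng_in_idaho_extent(lat, lng):
    """Return True if a latitude/longitude pair falls inside the HUC4 bounding box."""
    return (IDAHO_LAT_MIN <= lat <= IDAHO_LAT_MAX and
            IDAHO_LON_MIN <= lng <= IDAHO_LON_MAX)

def utm_fields_valid(northing, easting):
    """Return True if Northing/Easting pass the integer range rules above."""
    return (1000000 <= int(northing) <= 9999999 and
            100000 <= int(easting) <= 999999)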

 

ArcGIS Server, ArcGIS Online - Questions, Answers, and Resources

ArcGIS Server Services

  • Should we use fewer Services with more layers or more services with fewer layers?
    • From an application performance perspective.
    • From a server demand perspective.
      Is server demand currently an issue?
       
  • Should we use dedicated services with everything an application needs or general services that can be used by many applications or both?
     
  • How many services are too many and is that likely to be an actual problem for us in the foreseeable future?
     
  • Is there any advantage to separating the services that run our applications from services we are sharing?
     
  • What data are we, and should we be, sharing through services?
     
  • What can we do to increase the efficiency of our services?
     
  • What kind of monitoring can we set up?
     
  • What testing can we do?
     
  • Should we move some of our data and services to the State server or ArcGIS Online?

Resources:
http://gis.stackexchange.com/questions/17486/lots-of-layers-in-one-or-multiple-services-and-why

http://www.techques.com/question/26-17030/How-many-layers-should-be-in-a-MapService---what-are-the-tradeoffs

http://resources.arcgis.com/content/enterprisegis/10.0/services_performance

 

ArcGIS Server Caching

  • How are our current caches set up?
     
  • What is being cached?
     
  • How will they be used?
     
  • What else do we want to cache?
     
  • For what layers do we want to build caches and for what do we cache dynamically?
     
  • Can we do a combination of predefined and dynamic caching?
    i.e., build a cache to a certain scale, then cache dynamically beyond that; or cache the most likely layer combinations and dynamically cache the less likely combinations.
     
  • Does it make sense to cache small simple layers like IDFG Regions beyond the most general zoom levels?
     
  • How many cached services can we support, and how large can they be?

Resources:
http://help.arcgis.com/en/arcgisserver/10.0/help/arcgis_server_dotnet_he...
http://resources.arcgis.com/en/help/main/10.1/index.html#/Strategies_for...
 

ArcGIS Online

  • What do we have on there now?
     
  • Where do the data, services, and application code actually reside?
     
  • What are we likely to use it for in the future?
     
  • Who will be the “Named Users”?
     
  • How many named users do we want to license?
     
  • Should we move some of our data and services to the State server or ArcGIS Online?
     
  • Do we want to upload data, create services, and build caches now while it’s free? Will they delete everything and make us start over?

Observations: Finding and Reviewing

To “View & Export Observations” that have been submitted and entered into the IDFG-IFWIS database, go to https://fishandgame.idaho.gov/species/observations/list for Animals, or to https://fishandgame.idaho.gov/species/observations/plants/list for Plants. From this page you can begin filtering and querying the data to fit your needs. Filters currently included are by: Common Name, Scientific Name, Category [of species type], and Region. An updated form will also include filters for Observer and Reporting Organization. If you require a specific filter that you believe would be useful to many, please let us know.

These filters should be fairly intuitive, but a helpful note is to remember to “Reset” the filter when you want to change your search criteria; otherwise, the filter will continue to narrow down on the search criteria that are already entered. You may also increase the number of observations returned per page by changing the Items/page filter.

Creating observations using the Species API

Our Species Platform is designed not just to be a web page, but also an Application Programming Interface (API) that allows other websites and programs to read, create and edit observations.

Creating an Observation requires having an existing Fish and Game Account, logging in successfully, and finally creating the new observation.  For this example I'll be using the Firefox Poster extension.  Let's get started!

Step 1 - Login

First we need to authenticate against Fish and Game Accounts and receive a token to include in our future interactions with the REST API.  We can do both steps at once by posting to the login form with a return path set to receive our token:

https://fishandgame.idaho.gov/ifwis/accounts/user/login?returnUrl=/speci...

Set the content type:

application/x-www-form-urlencoded

And send a standard html payload for username, password and rememberMe flag:

username=myusername&password=mypassword&rememberMe=true

Login

Post this and in the response you'll receive a string that we'll use as a token for all future requests.
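
If you would rather script this step than use Poster, a minimal sketch with Python's requests library might look like the following (the username, password, and returnUrl values below are placeholders for the ones described above):

import requests

LOGIN_URL = "https://fishandgame.idaho.gov/ifwis/accounts/user/login"

session = requests.Session()          # keeps the forms-authentication cookie for later requests
resp = session.post(
    LOGIN_URL,
    params={"returnUrl": "..."},      # set to the token-returning path shown above
    data={"username": "myusername",
          "password": "mypassword",
          "rememberMe": "true"},      # sent as application/x-www-form-urlencoded
)
token = resp.text.strip()             # the string we'll use as the token in Step 2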

Step 2 - Create Observation

Now we need to construct the request to create our observation.

Let's start by adding the token.

Add Token to Header

This token is only for session validation, not authentication. Forms-based cookies are still needed for authentication, so if you are building a stand-alone iOS or Android application, you'll also need to add the cookie to the request.

 

Now we need to build the payload for the observation.  We can do this as an HTML form, XML or JSON object.  For our example we'll use JSON.

Post to:

https://fishandgame.idaho.gov/species/rest/node.json

Content Type:

application/json

Finding the correct content for the post is a little tricky. It requires quite a bit of familiarity with the data and with our API, which is built using Drupal Services. The documentation will get you started, and examples are useful, but ultimately we are going to have to break this down line by line in a future post.

In this example we have standard inputs (field_count) with one pattern, select lists and radios (field_count_type) with a slightly different pattern, dates (field_datetime) doing something different still, files (field_file) weirder still, geofield (field_location) with escaped GeoJSON, and node references (field_species) which include the primary key of the referenced content. The good part is that once you understand these oddities, they are the same oddities used to create other data stored in Species.

Content:

{
    "type": "observation",
    "field_count": {
        "und": [
            {
                "value": "1"
            }
        ]
    },
    "field_count_type": {
        "und": [
          "Absolute"
        ]
    },
    "field_life_stage": {
        "und": [
          "Mature"
        ]
    },
    "field_life_state": {
        "und": [
          "Alive"
        ]
    },
    "field_location": {
        "und": [
            {
                "geom": "{ \"type\": \"Point\", \"coordinates\":  [-112.0, 43.5] }"
            }
        ]
    },
    "field_location_resource": {
        "und": [
            {
                "value": "Google Maps Click"
            }
        ]
    },
    "field_location_use": {
        "und": [
          "Unknown"
        ]
    },
    "field_observation_method": {
        "und": [
          "Seen",
          "Heard"
        ]
    },
    "field_sex": {
        "und": [
          "Unknown"
        ]
    },
    "field_species": {
        "und": [
            {
                "nid": "[nid:80612]"
            }
        ]
    },
    "field_species_confidence": {
        "und": [
          {
            "value": "100"
          }
        ]
    },
     "field_datetime": {
        "und": [
            {
                "value": {
                   "date": "07/16/2013",
                   "time": "11:14am"
                }
            }
        ]
    },
    "field_file": {
        "und": [
            {
                "fid": ""
            }
        ]
    },
    "field_photo": {
        "und": [
            {
                "fid": ""
            }
        ]
    }
}
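
Putting the two steps together, here is a minimal, hypothetical sketch of the same POST in Python. It reuses the session and token from the login sketch in Step 1, assumes the payload above has been saved to a local observation.json file, and assumes the token header is the Drupal Services default X-CSRF-Token (the original post only shows the header name in a screenshot):

import json

OBSERVATION_URL = "https://fishandgame.idaho.gov/species/rest/node.json"

# `session` and `token` come from the Step 1 sketch above.
# The payload is the observation object shown above, loaded from a file
# here to keep the sketch short.
with open("observation.json") as f:
    payload = json.load(f)

resp = session.post(
    OBSERVATION_URL,
    data=json.dumps(payload),
    headers={"Content-Type": "application/json",
             "X-CSRF-Token": token},  # assumed header name (Drupal Services default)
)
print resp.status_code, resp.text     # on success, the response includes the new node id
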
If the post is successful, you'll receive a JSON response with the new id of the observation:

 

You can browse to this location to retrieve an XML version of the new observation:

Append .json to view it as a JavaScript object:

Or remove rest from the URL to view and edit it as HTML:

This is the first in a series of posts on creating, reading, and manipulating content in Species through the API. We'll add links to future posts here, or follow the speciesapi tag.