Thursday, June 25, 2015

Geoprocessing with Python

Are we really halfway through the semester already? Indeed, time is flying, but I can definitely say we are having way too much fun. Right? This was a very cool Module. Now we are learning how to use the power of Python (not Monty, though that would be cool too) to do some ArcGIS Geoprocessing. ArcGIS and Python are closely integrated: ArcPy modules, classes, and functions give access to all the Geoprocessing tools in ArcGIS. It was fun using some of them for this Lab.

Our task was to write a script that performs three geoprocessing functions: add XY coordinates to the hospitals.shp layer using the AddXY tool; create a 1000 meter buffer around the hospital features using the Buffer tool; and dissolve the hospital buffers into a separate, single feature using the Dissolve tool. Also, after each of the three tasks above was performed, we needed to print the messages from that tool using the GetMessages() function. Here's the rundown on how I accomplished this for this week's lab:


I began by reviewing my data in ArcCatalog.

1. I copied the data from the R-drive and extracted it into my Module 6 folder.
2. I opened a blank map so I could access the Arcpy window and manipulate the data I was given.
3. I set the environment (workspace) to my Module6 Data folder and prepared to begin working on my code.
4. From doing the Exercise, I surmised that to create the script I would need something like the following code:
arcpy.Buffer_analysis("hospitals.shp","S:/GISProgramming/Module6/Results/hospitals_buffer.shp","1000 METERS","","","ALL")

5. I spent time adjusting the above and researching ArcGIS Help to ensure I had this part correct.
6. I set the Local Variables for in_data & in_features and ran this part of the code.
7. Finally, I was convinced that my code would create the buffer and dissolve the hospital buffers into a separate, single feature.
8. Next, I needed to use the AddXY tool to insert an X and Y column at the end of my attribute table for the hospital layer. I found the format for AddXY_management.
9. Since I had set the environment to my workspace ("S:/GISProgramming/Module6/Data"), I could Execute the AddXY tool with the following code: arcpy.AddXY_management(in_features)
10. The above steps gave me a script that would perform the three geoprocessing functions: Add XY coordinates to the hospitals.shp layer; Create a 1000 meter buffer around the hospital features using the Buffer tool; and Dissolve the hospital buffers into a separate, single feature.
11. However, the Lab called for a separate line of code for the Dissolve feature, so I had to look up the Dissolve tool in order to use it separately.
12. The ArcGIS Help entry for the Dissolve tool (Data Management) gave the syntax:

Dissolve_management (in_features, out_feature_class, {dissolve_field}, {statistics_fields}, {multi_part}, {unsplit_lines})

13. This was a tremendous help to me. I used the syntax from above to create my code.

The below code dissolves the hospital buffers into a separate, single feature:

fc = arcpy.Dissolve_management("S:/GISProgramming/Module6/Results/hospitals_buffer.shp", "S:/GISProgramming/Module6/Results/hospitals_dissolved.shp")

14. By this time I had all three Geoprocessing functions working correctly, however, I did not have the code for printing the messages.
15. The last part for me: Find the code example for printing messages.
16. This felt like the longest part of this lab for me. I did my research and eventually found the example code at: http://resources.arcgis.com/en/help/main/10.2/index.html#//03q30000006s000000

GetMessage (arcpy)
import arcpy
fc = arcpy.GetParameterAsText(0)
arcpy.GetCount_management(fc)
# Print the first and last message returned by the last
#  tool executed (GetCount)
message_count = arcpy.GetMessageCount()
print(arcpy.GetMessage(0))
print(arcpy.GetMessage(message_count - 1))

17. The example above prints the first and last message from the last tool executed. I modified it, and after a while I had:
# Prints Geoprocessing message
fc = arcpy.GetParameterAsText(1)
message_count = arcpy.GetMessageCount()
print(arcpy.GetMessage(0))
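
Putting the pieces together, here's a rough sketch of how the whole script hangs together. The paths match my workspace above, and I'm showing arcpy.GetMessages() as the catch-all message printer; treat this as an outline rather than my exact submitted file:

import arcpy

# Set the workspace and allow outputs to be overwritten on re-runs
arcpy.env.workspace = "S:/GISProgramming/Module6/Data"
arcpy.env.overwriteOutput = True

in_features = "hospitals.shp"
buffer_out = "S:/GISProgramming/Module6/Results/hospitals_buffer.shp"
dissolve_out = "S:/GISProgramming/Module6/Results/hospitals_dissolved.shp"

# 1. Add XY coordinates to the hospitals layer
arcpy.AddXY_management(in_features)
print(arcpy.GetMessages())

# 2. Create a 1000 meter buffer around the hospital features
arcpy.Buffer_analysis(in_features, buffer_out, "1000 METERS")
print(arcpy.GetMessages())

# 3. Dissolve the buffers into a single feature
arcpy.Dissolve_management(buffer_out, dissolve_out)
print(arcpy.GetMessages())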

I added the message-printing code after each Geoprocessing function and ran the script a few times to make sure it worked properly. When I was satisfied my code was working properly, I saved my script as Mod6_gilcastillo.py and exited PythonWin. Here are my results:



Wednesday, June 24, 2015

Homeland Security - Prepare MEDS









This week we began our study of the Department of Homeland Security (DHS). Our task is to accomplish a Prepare MEDS exercise. MEDS stands for Minimum Essential Data Sets, an idea initiated by DHS to ensure the Nation is prepared for any domestic hazard.

The National Preparedness Goal is a result of Homeland Security Presidential Directive-8 (HSPD-8) and is designed “to achieve and sustain risk-based target levels of capability to prevent, protect against, respond to, and recover from major events, and to minimize their impact on lives, property, and the economy through systematic and prioritized efforts by Federal, State, local and Tribal entities, their private and nongovernmental partners, and the general public.”  This information and more is available at:   National Preparedness Guidelines

The idea is that GIS can be very helpful in the event of a Natural or Man-made disaster and having the right types of information can save lives.  The datasets we are talking about are:

DHS Minimum Essential Data Sets (MEDS):
- Orthoimagery
- Elevation
- Hydrography
- Transportation
- Boundaries
- Structures
- Land Cover
- Geographic Names

DHS is a huge Federal Agency composed of a number of  Agencies:



An individual's safety has always depended on how well that person is prepared. It is no different when we are discussing municipalities. As we discuss GIS information, we are concerned about:  
1. The quality, accuracy and currency of all types of data about a particular location; 
2. Efficient and effective methods to optimize data sharing and interoperability between agencies and jurisdictions; 
3. Geospatial analysis to provide situational awareness at all stages of a homeland security operation.  
Therefore, DHS has promulgated guidelines that a comprehensive geospatial database be prepared, ready, available, and accessible to meet a community's needs for prevention, preparation, response, and recovery relating to any catastrophic event. This is one of the most important components of a successful homeland security operation.

Our task is to compile a MEDS for the Boston area prior to the running of the Marathon - yes, we are traveling back in time. We are tasked to assemble the data to prepare our map for use in a Homeland Security planning and operations analysis. I collected and re-named the layers indicated above (DHS MEDS) and ensured the projections were consistent. DHS requires that the MEDS data be in the North American Datum of 1983 (NAD 83).
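
As a side note, if I had to re-project layers in Python rather than through ArcToolbox, I imagine the call would look something like the following (the UTM zone and file names are just illustrative placeholders, not the lab data):

import arcpy

# Target coordinate system for the MEDS data (illustrative choice of zone)
nad83 = arcpy.SpatialReference("NAD 1983 UTM Zone 19N")

# Re-project one of the MEDS layers into the required datum
arcpy.Project_management("Boundaries.shp", "Boundaries_NAD83.shp", nad83)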

There was a lot going on in this MEDS Prepare exercise, and while I don't want to leave anything out, I'll just hit the highlights. I started the project by preparing my environment and setting the default Geodatabase to my BostonData.gdb. This is a good way to start any project and a habit I am glad to be acquiring. After adding data to my map, I needed to Join a table to my Transportation layer. Working with tabular data is interesting because I normally think of GIS as strictly the images that comprise my map, but tabular data can be highly useful in describing or grouping data to make the presentation more meaningful. I took the Census Feature Class Code (CFCC) table and Joined it to my Transportation layer to better describe the types of roads in the Boston area, and then grouped many of those roads together to streamline my legend. Another "very cool" task here was using the Scale Range to hide certain labels at small scales, so that when you are "zoomed out" the map doesn't appear cluttered. Then, when you "zoom in" to a much larger scale, say street level, the labels magically appear. I thought this was brilliant.
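
For reference, the join step could also be scripted. Something along these lines should approximate what I did through the GUI (the layer, table, and field names are placeholders):

import arcpy

# Make a layer from the transportation feature class so it can accept a join
arcpy.MakeFeatureLayer_management("Transportation", "roads_lyr")

# Join the CFCC lookup table to the roads layer on the shared CFCC code field
arcpy.AddJoin_management("roads_lyr", "CFCC", "CFCC_lookup", "CFCC")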

Another interesting task was using the Extract by Mask tool, found at: Spatial Analyst Tools > Extraction > Extract by Mask. Essentially, you use one layer as a mask (extent) to extract the cells of a raster that fall within it. I also learned that to create a color map, the input raster dataset must be a single-band raster with integer values and a pixel depth of 16-bit unsigned or fewer.
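
In script form, the same step might look roughly like this (it requires the Spatial Analyst extension; the raster and mask names are placeholders):

import arcpy
from arcpy.sa import ExtractByMask

arcpy.CheckOutExtension("Spatial")

# Use the study-area boundary as the mask to pull out just the Boston land cover
extracted = ExtractByMask("LandCover", "StudyAreaBoundary")
extracted.save("LandCover_Boston")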

One last area that was interesting to me was converting the schema.ini file that contained all the GNIS information. The Geographic Names Information System (GNIS) was developed by the U.S. Geological Survey in cooperation with the U.S. Board on Geographic Names. GNIS data contains information about physical and cultural geographic features in the U.S. and associated areas, both current and historical (not including roads and highways). The database holds the federally recognized name of each feature and defines the location of the feature by state, county, USGS topographic map, and geographic coordinates. The GNIS is the official vehicle for geographic names and is used by the Federal Government as the source for applying geographic names to Federal maps and other printed and electronic products. The GNIS is also used to provide names data to government agencies and to the public, and to provide the Geographic Names data layers to The National Map. In other words, GNIS is a very important system in the GIS world. The problem with the schema.ini file was that the information was not in rows and columns that made sense. When I opened the schema.ini file the first time, I could not recognize headings, names, etc. the way it was formatted. So, I had to change the format from CSV Delimited to Delimited and add a parenthetical "pipe." This is how I made the change in Notepad: Format=Delimited(|). The symbol in the parentheses is a "pipe"; this character shares the backslash key, so you hold "Shift" and hit the backslash key.
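
For the record, the relevant part of the schema.ini file ended up looking something like this (the bracketed section header names the text file the setting applies to):

[MA_FEATURES_20130404.txt]
Format=Delimited(|)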

But that's not all, folks. For some reason, after I made the change to the .ini file, my data still wasn't neatly formatted as it should have been. I tried closing and opening the file a few times and finally had to exit my remote session (disconnect from the remote server) and walk away for about 30 minutes. On a positive note, I had time to grab another cup of coffee and stretch my legs. When I returned and re-connected to the remote server, "abracadabra," the file opened in perfect form. This allowed me to use ArcCatalog to right-click MA_FEATURES_20130404.txt and select Create Feature Class > From XY Table to create my Geographic Names layer.
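
The scripted equivalent of that last step would presumably be something along these lines (the output name is a placeholder, and I'm assuming the standard GNIS decimal-degree coordinate fields):

import arcpy

# Build an XY event layer from the GNIS text file, then save it as a feature class
arcpy.MakeXYEventLayer_management("MA_FEATURES_20130404.txt", "PRIM_LONG_DEC", "PRIM_LAT_DEC", "gnis_events")
arcpy.CopyFeatures_management("gnis_events", "Geographic_Names")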

One final note. I also took a look at the USGS: The National Map website to get a better idea of what is involved if I had to obtain all the MEDS information (the datasets listed above) completely on my own. Wow! That would indeed be quite a time-consuming task. I downloaded some of the data, just for fun, and it took me quite a while to get to the data; select the appropriate data; request the data be sent to my email account; then download and extract the files on my computer. My map is not yet complete as we will continue to work on the MEDS process next week. But here's a look at my MEDS layers as depicted in the Table of Contents (TOC). Remember, this is not yet complete, so please don't judge me.


Thursday, June 18, 2015

GIS5103 Programming - Geoprocessing in ArcGIS

Here we are in week five-- time is flying fast. Learning the ins-and-outs of Python has been a challenge, but it has also been fun. This week we learned how to:


- Create a toolbox

- Create a tool using ModelBuilder

- Set Model Parameters

- Export a script from ModelBuilder

- Update a model-derived script to work outside of ArcMap

- Create a script tool


The Model, Script and Tool were all created to Clip the Soils layer to the Basin layer, then Select "Not Prime Farmland" from the Soils layer attribute table, and finally Erase the soil selection from the basin polygon. Working with ModelBuilder was quite enjoyable, though exporting my Model as a Script left me with more work than I had anticipated. ModelBuilder does a great job of visually presenting you an overall picture of the task or tasks you are attempting to accomplish.

A few notes on my process for creating my SoilErase model:
1.       Before creating the model, I checked that “Overwrite the outputs of geoprocessing operations” was selected.
2.       Next, I created a toolbox
o   ArcCatalog > Toolboxes > My Toolboxes > New > Toolbox
3.       To create the model, I selected from my newly created toolbox, New> Model and named it SoilErase
o   I added the Basin and Soils shapefiles by dragging and dropping directly to the blank model.
o   I selected the Clip tool from the toolbox and dragged-dropped to the model canvas as well.
o   I used the connector tool to make the Soils shapefile the input and the Basin shapefile the clip feature

4.       Once I had the beginning of the model I ran it a few times to ensure it was going to execute.         
o   I had difficulties getting the Erase portion to work properly and I actually created three models.
o   My third model executed correctly and placed new layers on my map:
§  soils_Clip.shp
§  soils_Clip_Select.shp
§  soils_Clip_Select_Erase.shp

5.  The model executed correctly as far as I could tell. Working on the Python script was another matter. I had to modify the script, as outlined in the Lab instructions, each time I re-created the model. I also used the debugger to check for other errors.


So, when you export the model to Python, the code may or may not be nearly complete. This is the real detective aspect of "ModelBuilder-to-Python code," and this is why I had to re-create my model several times. I built my Model using the Clip, Select and Erase tools multiple times to determine where my errors were. I reset the Environment, spelled out the file pathways, and added a line of code to defeat the "Output Feature Class already exists" error. The line of code to resolve this error is:

                                                       arcpy.env.overwriteOutput = True  
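
For context, a stripped-down version of the Clip-Select-Erase sequence might look like this outside of ModelBuilder (the paths and the selection clause, including the FARMLNDCL field name, are illustrative, not my exact exported script):

import arcpy

arcpy.env.workspace = "S:/GISProgramming/Module5/Data"   # placeholder path
arcpy.env.overwriteOutput = True

# Clip the soils layer to the basin boundary
arcpy.Clip_analysis("soils.shp", "basin.shp", "soils_Clip.shp")

# Select the "Not Prime Farmland" soils from the clipped layer
arcpy.Select_analysis("soils_Clip.shp", "soils_Clip_Select.shp", "FARMLNDCL = 'Not prime farmland'")

# Erase the selected soils from the basin polygon
arcpy.Erase_analysis("basin.shp", "soils_Clip_Select.shp", "soils_Clip_Select_Erase.shp")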

By the time I was done, I could run the script in ArcGIS successfully and see the layers added to my map. When I ran the script as a tool outside of ArcGIS, I was not sure of the results; that is, I could not verify the new layers were added to my map even though they existed in my file folder. Here's what my final output looked like:


Wednesday, June 17, 2015

Homeland Security- DC Crime Mapping

I can't believe we are on week 5! This week, we are looking at how to analyze data using the Kernel Density tool. The subject of this week's Lab is to analyze DC crime data by examining various categories of crime and mapping the proximity of crimes by type and distance from various police stations.

I created a graph that shows all the reported crimes in the DC area:


To create this graph, my first step was to summarize the data and create a database file (dbf).
                 - Open the Attribute table, right click the column "Offense" and select Summarize
After I created the graph I experimented with different settings and selections.
                 - I decided to use the “palette” color and named the other items accordingly.
Analyzing this graph was a quick way to see how many of each type of offense was reported.
                 - I could clearly see that the top three Offenses were Theft, Theft from Auto and Burglary.
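
A scripted version of that summary step would probably use the Frequency tool, roughly like this (the file and field names are placeholders):

import arcpy

# Count how many crimes were reported for each offense type
arcpy.Frequency_analysis("crime.shp", "offense_summary.dbf", "Offense")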

But this was only one step in the entire process to create the two maps for this week's Lab. Creating the first map, I also had the chance to use the Ring Buffer Tool. I created rings at distances of 0.5 miles, 1.0 miles and 2.0 miles. This told me how many total crimes were reported within these distances. The results were:

                                             0.5 Miles there were 668 crimes reported
                                             1.0 Miles there were 856 crimes reported
                                             2.0 Miles there were 556 crimes reported


I then used the formula ([Count_] / 2080) * 100 to compute the percentage of the total reported crimes falling within each ring distance.

                          At 0.5 miles the percentage of total crimes reported was: 32%
                          At 1.0 miles = 41%
                          At 2.0 miles = 27%
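
Just to double-check the arithmetic, a few lines of Python reproduce those figures from the ring counts above:

# Ring counts from the buffer analysis above
counts = [("0.5 miles", 668), ("1.0 miles", 856), ("2.0 miles", 556)]
total = sum(count for ring, count in counts)   # 2080 reported crimes in all

for ring, count in counts:
    print("{}: {:.0f}% of total crimes reported".format(ring, 100.0 * count / total))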

I looked at the data using other tools and finally created the following map:




What I learned from my analysis is that the highest crime rate is at three separate District HQs: 3D with 13% of total crime; 7D with 12% of total crime and 6D with 10%.  In fact, 6 of the top 7 crime rates are at District HQs with the only exception being the Asian Liaison unit that has 8% of the total crime. The data seems to indicate that more crime occurs in the vicinity of HQs and less crime occurs where there are Liaison units or smaller Substations.  However, more important to me is that I gained valuable experience doing this analysis using the tools I have studied in this course.

Compiling the second map was very interesting because I learned to use a new tool: the Kernel Density tool (ArcToolbox > Spatial Analyst Tools > Density > Kernel Density). I won't try to explain the math that the tool uses; suffice it to say, this is an awesome tool and I had a blast using it. Here's the ArcGIS Help link that explains the Kernel Density tool much better than I ever could:
                
                                         The Awesome Kernel Density Tool
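
If I were to run it from a script, I believe the call would look something like this (Spatial Analyst required; the layer name, population field, and cell size are placeholders):

import arcpy
from arcpy.sa import KernelDensity

arcpy.CheckOutExtension("Spatial")

# Density surface of reported crimes (no population field, 100-unit cell size)
density = KernelDensity("crime.shp", "NONE", 100)
density.save("crime_density")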



And here's my second map for this week's lab:


Friday, June 12, 2015

Natural Hazards: Hurricane Sandy 2012

This week we continued our study of natural hazards by compiling two maps: one that depicts the path of Hurricane Sandy and another that is a damage assessment of the same hurricane. The excerpt below is from "Tropical Cyclone Report Hurricane Sandy (AL182012) 22-29 October 2012":


"Hurricane Sandy was a classic late-season hurricane in the southwestern Caribbean Sea.  The cyclone made landfall as a category 1 hurricane (on the Saffir-Simpson Hurricane Wind Scale) in Jamaica, and as a 100-kt category 3 hurricane in eastern Cuba before quickly weakening to a category 1 hurricane while moving through the central and northwestern Bahamas...  The system restrengthened into a hurricane while it moved northeastward, parallel to the coast of the southeastern United States, and reached a secondary peak intensity of 85 kt while it turned northwestward toward the mid-Atlantic states.  Sandy weakened somewhat and then made landfall as a post-tropical cyclone near Brigantine, New Jersey with 70-kt maximum sustained winds.  Because of its tremendous size, however, Sandy drove a catastrophic storm surge into the New Jersey and New York coastlines.   Preliminary U.S. damage estimates are near $50 billion, making Sandy the second-costliest cyclone to hit the United States since 1900."

Hurricane Sandy was certainly a major Natural Hazard event in 2012, and it makes sense for us to study this event today. Our objectives for this week's lab were to utilize several tools to create and display the storm path and conduct a damage assessment in the area where the storm made landfall. Some of the tools I used were:

- Select by Attribute
- Add X-Y data
- Points to Line Tool
- Adding Graticules


Using the Points to Line tool was interesting. The tool can be found at:

ArcToolbox > Data Management Tools > Features > Points to Line Tool
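
In Python, I believe the equivalent call boils down to a single line (the input and output names here are placeholders):

import arcpy

# Connect the Sandy track points into a single line feature
arcpy.PointsToLine_management("Sandy_Track_Points.shp", "Sandy_Track_Line.shp")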

I took the points provided with the Sandy Track Events file and utilized the Points to Line tool to create the path that Sandy followed up to the New Jersey shore. Then I symbolized the storm intensity using an appropriate color ramp and added the storm symbol to my map. Another interesting thing I did was add graticules to my map. This was indeed a first for me and, fortunately, a relatively straightforward task to accomplish:

From Data Frame Properties > Grids > New Grid, select Graticule (divides the map by meridians and parallels) and there you have it. Of course, you must be in the Layout view to see the graticules (another lesson learned).

To create the Damage Assessment map, I used several aerial images to make a Raster Mosaic.

I added the imagery to DamageAssessment.gdb by right-clicking on the GDB > New > Mosaic Dataset. After adding all the layers such as the NJ_Counties, NJ_Municipalities, NJ_State and NJ_Roads, it was time to create some data!
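
Scripted, the mosaic creation would presumably reduce to something like this (the spatial reference and image folder are placeholders):

import arcpy

gdb = "DamageAssessment.gdb"
sr = arcpy.SpatialReference(4269)   # NAD 1983 geographic, as an example

# Create an empty mosaic dataset, then load the aerial images into it
arcpy.CreateMosaicDataset_management(gdb, "PostStorm", sr)
arcpy.AddRastersToMosaicDataset_management(gdb + "/PostStorm", "Raster Dataset", "C:/SandyImagery")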

One of the main objectives for this week's lab was to make comparisons of the before and after aerial imagery where Sandy came ashore in New Jersey. The Effects toolbar contains a "Swipe" tool that allows you to peel the selected layer back and see underneath it. Cool! Only, you have to be in Data View to use the tool...I was in Layout view when I started this part and the tool was disabled; another good lesson learned.

I created the Damage Assessment Table after digitizing several homes from the post storm aerial and I did remember to save my edits!  Here's a look at my final two maps.



Thursday, June 11, 2015

GIS Programming- Debugging & Error Handling

This week in GIS5103 - Programming, we worked on three Python scripts using debugging and error-handling techniques. The main focus was learning to use the debugging tool in PythonWin. I found the "step" function to be extremely useful. Additionally, using the "step over" function prevents the internal libraries from popping up when an error is detected.

step over button 
This method for checking your script is great because the debugging tool will indicate your error and display an error code that can be very useful in finding the solution to get your code to run from start to finish. We examined three scripts this week. Script 1 was intended to print out the names of all fields in the Module4/Data/parks.shp attribute table, but there were two errors/exceptions I had to find first. Using the debugging tools from above, I was able to correct the errors and successfully run the script:



The second script was supposed to print out the names of all layers in each data frame of the Module4/Data/TravisCountyAustinTx.mxd map, but it had eight errors/exceptions of its own. Using the tools above and going line-by-line, I found all eight errors and was able to run the script:


The third and final script had a Part A and a Part B. Part A had an exception we were not to fix; instead, we would use the try-except method. The script was intended to run Part A, encounter an error, and print an error statement; Part B would then run successfully, printing out the name, spatial reference, and scale of the data frame.

There were many steps I used to complete script 3. Below are the steps I followed to apply the try-except statement:

1.  Initially, I ran my code normally, not using the Debugger tool, to see what error was present.
2.  After I detected which line of code had an error, I reviewed the try-except portion of the lecture video.  
o   I was concerned with the placement of the try-except block of code.
o   After reviewing both the video and the textbook, I had a good idea of where to place my try-except code.
   
3.  Initially, I placed the block of code before and after line 13 (mapdoc = arcpy.mapping.MapDocument() )
4.  My challenge was to determine the beginning and ending points for the try-except block of code, so I used a line-by-line approach until I got my script to run correctly; a simplified sketch of the final structure is below.
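
Here is a simplified sketch of the structure I ended up with. Part A and Part B are condensed and the .mxd path is a placeholder, so treat this as an outline rather than the graded script:

import arcpy

try:
    # Part A: this line contains the exception we were told to leave in place
    mapdoc = arcpy.mapping.MapDocument()
except Exception as e:
    print("Error in Part A: " + str(e))

# Part B: runs regardless of the error above
mapdoc = arcpy.mapping.MapDocument("S:/GISProgramming/Module4/Data/TravisCountyAustinTx.mxd")
for df in arcpy.mapping.ListDataFrames(mapdoc):
    print("Name: {0}, Spatial reference: {1}, Scale: {2}".format(df.name, df.spatialReference.name, df.scale))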


My lesson learned was to take note of how each block of code (each step or function) relates to its placement in the PythonWin window. I found it helpful to add empty lines between each action in the script as I used the step-over function.



Friday, June 5, 2015

Python Fundamentals II - A Dice Rolling Game

We continue our journey through the world of Python. This week we picked up where we left off last week with Python Fundamentals. Learning a new language takes time and practice. I put in quite a bit of time this week, and I now have a better understanding of Python functions, methods, and syntax, but I still have a long road to travel.

For this week in GIS 5103 Programming, I created a script that runs a dice rolling game based on the players' name lengths and then creates a list of 20 random numbers. It also has a conditional statement with a nested while loop that removes all instances of an "unlucky" number (I selected 6) from the list. Below I explain one of the steps in my script-writing process.

Removing numbers from my list:
1.      To remove a number from a list, the method is: listname.remove(value)
2.      For my script, I created a list (which I named "list"), so my remove command looked like: list.remove(value)
3.      I previously defined my "unlucky" number as the variable "k", so I wanted "k" removed from the list: list.remove(k).
4.      However, remove() only removes the first occurrence, so I had to define a loop.
5.      To define the loop, I used:
k = 6                      # my "unlucky" number
q = 1
while q <= 20:             # the list holds at most 20 numbers
    if k in list:          # only remove a 6 if one is still in the list
        list.remove(k)
    q = q + 1
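
Putting the whole game together, a minimal sketch of the flow might look like the following. The player names, dice-roll rule, and number range are my own illustration, not the exact script I turned in:

import random

players = ["Quentin", "Ara", "Mel"]           # placeholder player names
for name in players:
    roll = random.randint(1, len(name))       # roll depends on name length (assumed rule)
    print("{} rolled a {}".format(name, roll))

# Build a list of 20 random numbers between 1 and 10
numbers = [random.randint(1, 10) for _ in range(20)]
print(numbers)

# Conditional with a nested while loop: strip out every "unlucky" 6
k = 6
if k in numbers:
    while k in numbers:
        numbers.remove(k)
print("List with all {}s removed: {}".format(k, numbers))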

Notes on my progress:
1.      I had numerous issues with this script: I created several infinite loops (remember to use "break" or control the loop with a sentry variable at the top).
2.      My script disappeared from my S-drive for unknown reasons; I had saved my script on my home desktop and could open it using Notepad. This must become second nature for me.

3.      There are many ways to write a script; I have a lot to learn



Tuesday, June 2, 2015

GeoHazards: Tsunami Evacuation

This week in GIS4048 Applications, we continue our journey through hazardous terrain. This week we are discussing tsunamis. Specifically, we were charged to build an evacuation map for the Fukushima II Nuclear Power Plant that was flooded by the tsunami that hit Japan on March 11, 2011. This tsunami, which hit the northeastern coastline of Japan, was the product of a magnitude 9.0 earthquake.

A Tsunami is an incredibly destructive force that:

- Causes human death (average death rates are estimated at 50% of the population but have been reported up to 80%).
- Leaves people homeless and destroys other property.
- Causes psychological trauma.
- Spreads contaminants to water and food sources.

Our objective is to provide an analysis of both the Radiation hazard and the runup evacuation zones caused by this Tsunami. There were many tasks to accomplish. Below I outline two major tasks.

Task#1 Fukushima Radiation Exposure Zones
To create the Fukushima Radiation Exposure Zones, I had to develop my map so that I could look at the distances to determine the possible population totals that might be affected by the nuclear radiation from Fukushima II. To accomplish this I needed several layers of data. I included the following layers:

a. JpnBndOutline
b. NuclearPwrPlnt
c. JapanCities
d. Roads
e. NEJapanPrefectures

With these layers, I could find Fukushima II by manually selecting from the NuclearPwrPlnt attribute table the plant I was concerned with. From this selection I made a layer with the one point: Fukushima II Nuclear Power Plant.

At this point I needed Population information; I added Japan Cities and selected those Cities that were within 50 miles of my NucPwrPlnt.

Next, I used the Multiple Ring Buffer tool to place rings at 3, 7, 15, 30, 40 and 50 miles from the Fukushima II Nuclear Power Plant.

ArcToolBox>Analysis Tools > Proximity > Multiple Ring Buffer

After this, I used Select By Location to find the cities that fell within the prescribed distances above.
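
For the curious, the buffer-and-select portion could be scripted roughly like this (the workspace and file names are placeholders for the lab data):

import arcpy

arcpy.env.workspace = "C:/TsunamiLab"   # placeholder workspace

# 3, 7, 15, 30, 40 and 50 mile rings around Fukushima II
arcpy.MultipleRingBuffer_analysis("Fukushima_II.shp", "radiation_zones.shp", [3, 7, 15, 30, 40, 50], "Miles")

# Find the cities that fall inside the ring set
arcpy.MakeFeatureLayer_management("JapanCities.shp", "cities_lyr")
arcpy.SelectLayerByLocation_management("cities_lyr", "INTERSECT", "radiation_zones.shp")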

The last step for me was to ensure I had selected an appropriate symbology for the radiation zones.
                        - On my first attempt, I forgot to deselect <all other values>.
                        - Subsequently, I reviewed several color ramps before selecting the red-to-green ramp.
                        - I selected Category = Unique value and then deselected the borders for each color.

Here is my result:




Task#2 Model Builder: Develop Tsunami Runup Zones 

This was a very interesting portion of this week's lab. Model Builder is a great tool; you just have to stay organized and save your model when done. To get to the Model Builder you open ArcCatalog then add a ToolBox to your GeoDataBase (right click on the .gdb and select New > Toolbox). Once you have added the Toolbox, you can right click on it and select New > Model  

After I completed and successfully ran my model, I went back to it to do some editing; however, I could not open it. I must not have saved it properly (though I thought I had, because it appeared in ArcCatalog). So, I re-accomplished the model, getting twice the practice on this part of the lab, and then I ran the model and SAVED my work.
Lesson Learned: Ensure my work is always SAVED.