Thursday, July 30, 2015

Creating Custom Tools

We have made it to Module 10! This is a good feeling. This week, we worked on writing code to create a tool and then share it for anyone to use. There are a number of advantages to using a script tool versus a stand-alone script. For instance:

1. A tool is an integral part of geoprocessing. It can be accessed from ArcCatalog and the Search window in ArcMap. It is also possible to use the tool in ModelBuilder and in the Python window, or to call it from another script.
2. A tool is fully integrated with the application it was called from. This means any environment settings are passed from the application, such as ArcMap, to the tool.
3. A tool includes a tool dialog box.
4. The use of tools makes it possible to write messages to the "Results" window.
5. Documentation can be provided for tools, which can be accessed like the documentation for system tools.
6. Sharing a tool makes it easier to share the functionality of a script with others.
7. A well-designed tool means a user requires no knowledge of Python to use the tool's functionality.
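
To expand on the first point: once a toolbox exists, it can be imported and its tools called from another script like any system tool. A quick sketch, with a hypothetical toolbox path, alias ("mytools") and tool name ("MultiClip"):

```python
import arcpy

# Hypothetical toolbox, alias and tool name -- substitute your own.
arcpy.ImportToolbox("S:/GISProgramming/Module10/MyToolbox.tbx", "mytools")
# The tool is then callable as <ToolName>_<alias>:
arcpy.MultiClip_mytools("S:/inputs", "S:/clip.shp", "S:/outputs")
```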

I made a tool, from a stand-alone script, that clips multiple shapefiles. This was really cool when I finally got it to work properly. Here's what the Script tool window looks like:

I worked on my script using the "trial and error" method along with debugging, but initially I was focused on the wrong area. My script did not look like it was running properly, but that was because it contained code I had placed in it to run as a tool, not as a stand-alone script. So the errors I received when I ran the stand-alone script were not errors when I ran the tool from the toolbox. Below is a quick overview of how to create a tool.


1. To start the process of turning a stand-alone script into a tool, you must create a toolbox. You can do so using ArcCatalog.
   - Right click on the folder where you want to create the toolbox and tool.
   - Select “New”, then “Toolbox.”

2. Once you have a new toolbox, you can right click on it, select “Add”, and then select “Script.”

3. After you select “Script”, an “Add Script” template pops up for you to fill in.
   - Add a “Name”, “Label” and a “Description”. Then, add the script for your tool.

4. At this point, you are on the last page of the Add Script template. Here you will add the parameters:
   - Click on the first line under “Display Name” and type the name of your first input parameter. Under “Data Type”, select the appropriate type from the pull-down menu.
   - Do the same for as many parameters as you require; include an output location.
   - Under “Parameter Properties”, select the appropriate values.
   - For “Direction”, “Input” can suffice, since the script itself defines where to save the results.
   - Save when done.

5. You can edit the tool by right clicking and selecting “Edit”, or you can use another editor such as PythonWin.

One of the more interesting steps in creating a tool, and the one that taught me my lesson, was this:

To make a script into a tool, you must not have hard-coded file locations.

   - File locations in the script would normally look like: “S:\GISProgramming\Module10\Data”
   - For the tool to work properly, the file locations should come from arcpy.GetParameter(#), where # is the index (starting at 0) of the parameter on the tool template.
   - This includes defining the workspace.
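
Put together, a clip script reworked for a tool might look roughly like this (a sketch with my own parameter order; yours is whatever you set on the tool template):

```python
import arcpy

# Parameters come from the tool dialog instead of hard-coded paths.
# Indexes 0, 1, 2 match the order of parameters on the tool template.
input_folder = arcpy.GetParameterAsText(0)
clip_feature = arcpy.GetParameterAsText(1)
output_folder = arcpy.GetParameterAsText(2)

arcpy.env.workspace = input_folder
for fc in arcpy.ListFeatureClasses():
    arcpy.Clip_analysis(fc, clip_feature, output_folder + "/" + fc)
```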

For me, the Big Lesson I learned about writing a script for a tool was this:

6. The script may not run properly as a stand-alone script once you have made the changes above so that it runs properly as a tool. This is because the file paths are now defined on the tool template rather than explicitly in the script, as they would be in a stand-alone version.
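
One pattern that softens this (my own workaround sketch, not from the assignment) is to fall back to a hard-coded default whenever the tool dialog supplies nothing, so the same script runs both ways:

```python
import arcpy

# GetParameterAsText returns an empty string when the script is run
# stand-alone, so "or" falls back to a hard-coded test path.
workspace = arcpy.GetParameterAsText(0) or "S:/GISProgramming/Module10/Data"
arcpy.env.workspace = workspace
```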

I spent a great deal of time adjusting my script, going back and forth changing lines to get it to run without an error. What I needed to do was make a single change and then run the tool to check my script. As I said, this was a good lesson learned. Here's what my final tool looked like when it ran properly:


Friday, July 24, 2015

GIS5103 - Working with Rasters

Here we are finishing up with week 9. I can't believe we are this close to the completion of GIS 5103, Python Programming!

For this week's lab we worked with several geoprocessing tools to manipulate raster output and isolate areas matching particular parameters for slope and aspect. We created a script that produces a single raster image highlighting areas with the following characteristics:

- Forest landcover (classifications 41, 42 & 43)

- Slope between 5–20°

- Aspect between 150–270°


I took the data we had and wrote the basic parts of the script. I was able to import arcpy, set the environment, and enable overwrite. My challenge started when I had to create new rasters using the Spatial Analyst module (arcpy.sa). After importing it (from arcpy.sa import *) I had all the tools I needed to complete this lab, but somewhere I went left when I should have gone right, or something like that.

I established the code to check out the extension (arcpy.CheckOutExtension("spatial")) so that I could move on to create temporary rasters and then combine them into the raster defined by:


- Slope less than 20 

- Slope greater than 5 

- Aspect less than 270 

- Aspect greater than 150
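
In outline, my understanding of the intended workflow looks something like this (raster names and paths are placeholders, not my graded script):

```python
import arcpy
from arcpy.sa import *

arcpy.env.workspace = "S:/GISProgramming/Module9/Data"
arcpy.env.overwriteOutput = True

if arcpy.CheckExtension("spatial") == "Available":
    arcpy.CheckOutExtension("spatial")
    # Temporary rasters, one per criterion
    forest = Con(Raster("landcover"), 1, 0, "VALUE IN (41, 42, 43)")
    slope = Slope("elevation")    # defaults to degrees
    aspect = Aspect("elevation")
    good_slope = (slope > 5) & (slope < 20)
    good_aspect = (aspect > 150) & (aspect < 270)
    # Combine: 1 only where all three conditions hold
    final = forest & good_slope & good_aspect
    final.save("S:/GISProgramming/Module9/Results/final_raster")
    arcpy.CheckInExtension("spatial")
```
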

I had errors such as:

Traceback (most recent call last):
  File "S:\GISProgramming\Module9\Scripts\Mod9_gilcastillo.py", line 47, in <module>
    outraster.save("S:/GISProgramming/Module9/Data/Elevation.gdb/slope_per") #Results/slope_per")
RuntimeError: ERROR 010240: Could not save raster dataset to S:\GISProgramming\Module9\Data\Elevation.gdb\Slope_elevat4 with output format FGDBR.

And,

File "C:\GIS\ArcGIS\Desktop10.2\arcpy\arcpy\geoprocessing\_base.py", line 498, in <lambda>
    return lambda *args: val(*gp_fixargs(args, True))
ExecuteError: Failed to execute. Parameters are not valid.
ERROR 000860: Input raster: is not the type of Composite Geodataset, or does not exist.
Failed to execute (Slope).

Colorful, yes; helpful, not really. I tried several changes (sometimes many at the same time, but more often small changes to format and syntax, one at a time, using the debugger). One of the last changes I tried was to create a new gdb. But not even this helped me correct the errors in my script so that it would progress to the end. Here's what my script finally produced:



Monday, July 20, 2015

SunPy: Python for Solar Physicists

The article I found was “SunPy: Python for Solar Physicists”. You ask, why solar physicists? Well, physics to me is an incredibly challenging field, and anything that can make research and discovery easier in this field is a very good thing. The article shows how SunPy can help physicists with their analysis of solar data. It also shows how Python code can take the huge quantities of data collected by solar satellites and make the analysis of this data possible. As an example, “NASA’s Solar Dynamics Observatory (SDO) satellite records over 1 TB of data per day all of which is telemetered to the ground and available for analysis”: that’s data from only one satellite! But how do you take all the disparate data, including different wavelengths, spatial scales, and high time cadences, and crunch it all together? It seems, “SunPy is the data analysis toolkit which provides the necessary software for analyzing solar and heliospheric datasets in Python.” SunPy is free and open-source, and this too is one of the positive aspects of complex code for complex tasks. This article was presented at SciPy 2013 and here’s the link to the presentation:

The Python code allows you to analyze and then animate solar data. SunPy wants to provide a solid base of data types and a framework around which people can do science, and I believe SunPy has succeeded. This library allows you to conduct awesome science using a modest amount of Python code.


Friday, July 17, 2015

GIS Programming – Working with Geometries

This week in Python, we are learning about using Python to manipulate spatial data in ArcMap. In our exercise for week 8 we worked with geometry objects and multipart features. One of the interesting items I worked with was tokens. Tokens are shortcuts that give you access to specific geometry properties. For example:

import arcpy
from arcpy import env

env.workspace = "S:/GISProgramming/Module8/Data"
fc = "dams.shp"
cursor = arcpy.da.SearchCursor(fc, ["SHAPE@XY"])
for row in cursor:
    x, y = row[0]
    print("{0}, {1}".format(x, y))

Where "SHAPE@XY" is a token that returns a tuple of X, Y coordinates representing the centroid of the feature. There are many other tokens I had the chance to use in this week's exercise and lab. The goal of the lab was to write a script that creates a .txt file and then writes to it the coordinates and Object IDs for vertices in a shapefile. The first two parts of the lab went very smoothly for me, but I ran into some difficulties trying to get the complete data to write to the text file and print out properly. I think my challenge began with the "for loop" attempting to use the .getPart() method to obtain the point data. My loop should have looked something like the example in our assignment:

for row in cursor:
    for point in row[1].getPart(0):
        print point

However, it did not quite work for me. Below is a screen shot of my results. I'll continue to work on this and perhaps by midnight or some day in the not too distant future I can get this to run correctly.
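
For the record, the full pattern I believe the Lab was after looks something like this (the shapefile and output paths are placeholders):

```python
import arcpy

fc = "S:/GISProgramming/Module8/Data/rivers.shp"  # placeholder shapefile
output = open("S:/GISProgramming/Module8/Results/vertices.txt", "w")

# OID@ gives the Object ID; SHAPE@ gives the full geometry object
with arcpy.da.SearchCursor(fc, ["OID@", "SHAPE@"]) as cursor:
    for row in cursor:
        for point in row[1].getPart(0):  # vertices of the first part
            output.write("{0} {1} {2}\n".format(row[0], point.X, point.Y))
output.close()
```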



Wednesday, July 15, 2015

Participation Exercise Part 2- Mapping Assessment Values

This is the second part of this week's Participation Exercise. We continue our study of the duties and responsibilities of the Property Appraiser's Office. This time, I am working on determining if there are any anomalies with the appraised values of homes in West Ridge Place, a subdivision in Pensacola, Florida. This is a great assignment and I am happy to have this work.

As they say, "A picture is worth a thousand words." So, I compiled a map that had the data I could use to make a recommendation to the local Property Appraiser. My goal was to display values for the homes in the West Ridge Place subdivision in a manner that allows for easy comparison. I put together a map consisting of Land Values, the shapefile for the subdivision, Parcels, Streets and Easements. I used the command "Join" table and "Select by Location" among others to devise the map below:




The fundamental question to answer when viewing this map is: are there parcels that do not appear to conform to the assessed value of properties in this subdivision? As I look at this map I see two properties (accounts) for which the answer could be "Yes." The two properties should be reviewed to see if the assessed values conform to good practices. There may be good reasons for variance in property value, such as improvements that raise the value (say, a $6,500.00 dog house) or items that detract from it. However, at first glance the property in yellow, in the lower left quadrant (071S312000001001), appears to be undervalued compared to the majority of properties in the subdivision, while the property in red, to the north and slightly off center (071S312000013001), appears to be overvalued. Indeed, I believe it would be appropriate to review these two properties.

But since a picture is really worth... a lot of words; allow me to show you what I mean:


The two Parcels in question to be reviewed.

This was a good exercise to experience what our local appraiser's office does on a daily basis. I enjoyed doing the research and compiling the map for this particular type of property investigation. I can't wait to really be involved in this area of GIS work!

Participation Exercise Part 1- Urban Planning

This week in GIS4048 - GIS Applications - besides our lab on Urban Planning, we are also working on a Participation Exercise. The objective of the Participation Exercise is to explore how a county or municipal appraiser's office conducts business relative to assessing parcels for tax purposes. For the exercise, I looked up a recent home sale in my area and foraged for the information I will show you below.

Since I live in Brevard County, I selected the Brevard County Appraiser's web site to do my research: Dana Blickley, CFA, Brevard County Property Appraiser.
It is a very nice web site that is currently being updated (they have a beta site with a few more features), but generally this site has all the info you need.
I did run into one substantial challenge: I could not find a single home sale posted to this site for June 2015. So I wrote to the Appraiser's office to see if I was missing an obvious search method that would help me find what I needed: the highest priced home that sold in June 2015. Here's what I got back:

Jim Brandenburg jim.brandenburg@bcpao.us

Good morning Mr. Castillo;

Several things come to mind.
1)      Sales often lag behind for a variety of reasons. First, the transfer may not get registered with the Clerk’s office immediately. It may not be processed and made available to the Property Appraiser immediately and we also have certain lag time internally. The net result is that, right now, our most recent sale is from mid-May, 2015.
2)      We also distinguish between “qualified” and “non-qualified” sales… a qualified sale is an unencumbered, single parcel transfer at market price. A non-qualified sale can be just about anything else, from a person transfers a property to a friend or relative for a dollar all the way up to a large commercial sale involving multiple parcels, liens, payments in kind, problems with the legal descriptions… etc.  It seems likely that you would be interested in qualified sales only.

Well, lo and behold, I wasn't off my rocker, and I do not lack the skills to do a decent search for current home sales; there just wasn't any data for me to find. So, I elected to use a property from June 2014, and here's the information I collected:

First question, Q1:  Does your property appraiser offer a web mapping site? If so, what is the web address? If not, what is the method in which you may obtain the data?
A: Indeed my property appraiser has a great web site, and I posted the link above. But just in case, here it is again:





Question 2: What was the selling price of this property? What was the previous selling price of this property (if applicable)?   Take a screen shot of the description provided to include with this answer.

A: Let's start with the link to the beautiful property in question: Brevard County Parcel ID 27-37-03-02-*-12 and Tax ID 2735118 (to find the parcel, enter the Parcel ID or Tax ID in the QUICK SEARCH block after the web site opens). This fabulous home sold for $1.26M on 06/26/2014. Prior to that, the home sold for $1.25M on 08/14/2009. Regarding the description of this home, just in case the link above does not work, here's the image and abbreviated description as well as other pertinent information:




Next question please: What is the assessed land value?  Based on land record data, is the assessed land value higher or lower than the last sale price?  Include a screen shot. 
A: The assessed value is given as "Assessed Value Non-School" and "Assessed Value School". 
The table below shows this, but the assessed values for 2014 are: $828,720 (non-school) and $828,720 (school). Though here they are the same value, I did see other homes where the values differed slightly. These values are significantly lower than the June 2014 sale price, but this may be because the assessed value includes a Homestead exemption or two and other variations that decrease the value for tax purposes.

  
The web site had many interesting tidbits of information. I liked that you could select a layer view and choose which layers you wanted to see. Additionally, there were Advance Search techniques and Settings that anyone could manipulate. I copied this link to a Detailed Report that I found interesting. I hope you enjoyed this Participation Exercise; I certainly did!


GIS4048-Urban Planning

This week was very busy. For GIS 4048, we worked on Module 9, Urban Planning. However, there was also the matter of a Participation Exercise that I will discuss in a later post or two. For now, let's discuss this lab: Urban Planning: GIS for Local Government, Scenarios 1 & 2. As you might guess from the title alone, this is a rather long and involved lab. The overall objective was to become familiar with how a local municipality conducts business relative to property appraisals and zoning. This turned out to be more fun than I anticipated. I found this lab highly interesting because I am currently volunteering with the Brevard County Survey Department, located in the government office complex in Viera (an unincorporated section adjacent to Melbourne, Florida). One of the functions I have observed while volunteering is the request to vacate certain easement restrictions on various parcels. Usually, it seems, this happens because someone wants to put in a new swimming pool and it may encroach on a neighboring parcel or an easement for access.

For this lab we had to research surrounding parcels for a local developer and provide a report. The customer, Mr. Zuko, wanted to know the type of zoning, and the owners of the surrounding properties to the parcel in question. The best part of this lab was learning how to use Data Driven Pages. Data Driven Pages make it possible to provide much more information in a map book utilizing several pages of maps that can tell a story or convey a great deal of information in a uniform and consistent manner.  Below is one of the pages from my Map Book showing the customer's Parcel:



My entire Map Book for this Module is available for review at:   

The process for creating the Map Book was similar to what I have done in the past to create a complete map. I did the research on the Marion County Property Appraiser web site; I collected information about this parcel (14580-000-00) and I put together all the details in the above Map Book. The big difference was in using the Data Driven Pages tool. Using dynamic text and making the locator map that you see in the lower right-hand corner was also a very cool experience. Another report I "delivered" to this customer was the written Parcel Report. I created this by using the Create Report feature at: attribute table > Table Options > Reports > Create Report.
Here's the report I created:


The entire report is available at:


There were several questions along the way that I had to address. The easiest way for me to present the questions is by listing both the question and my answer for each of the Scenarios:


Scenario 1: Marion County, FL

1. When was the data certified?

A: On the web page that has you “agree to continue” to the “Search engine”, the first sentence states: “Certified (2014) data represents certified assessed values provided to the Tax Collector and used in generating the 2014 tax bill. The 2014 Assessment Rolls were certified to the Tax Collector on October 16, 2014.” The answer is, the data was certified on October 16, 2014.


2. Who is the owner of the parcel and what is the acreage?

A: Once again, exploring this web site and reviewing the “Property Information” portion at the top of the record, the answer is:

HAWKER INVESTMENT TR

C/O JTP FILMS INC

801 N BRAND BLVD # 665

GLENDALE CA 91203

is listed as the owner of this parcel. The acreage for this parcel can also be found at the top of the page to the far right. Acreage is listed as: Acres: 19.46


3. What two types of zoning are listed for this parcel? Include the classification description.

A: On this web site, a heading for Zoning is found under “Land Data – Warning: Verify Zoning”. The zoning listed here for this property is: A1, A1, A1, A3, A3

However, to find the definitions of A1 and A3 zoning, I had to “hunt” the web site. I began with the "Home" page and cycled through "Meet the Property Appraiser", "Duties of Property Appraiser", "Property Search" (agree to terms, again), "Map It" and finally, "Sales Search", where I found a link to Zoning that opened a “Zoning Codes” page. Here I found that:

A1 is for General Agriculture and A3 is for RES Agriculture Est. Though I could have just Googled “zoning definitions”, I am not aware of a standard or universal definition of A1 or A3 zoning, so I was reluctant to take that shortcut and instead went on the “hunt.”


4. What is the value of the dog house on this property?

A: Really? The value of a dog house? I found a great drawing of the buildings and the layout on the property. At the very bottom of the page was a section that reads: “Planning and Building, County Permit Search” and there, the second line item from the bottom, is “Dog House.” This must be one fantastic dog house. The permit search area states the Amount as: $6,500, so according to the Permit issued, the value of this dog house is $6,500.00.


5. How many records were selected?

A: I used Selection > Select By Location, set the first drop-down to "select features from", and checked the box next to Parcels_Join ("Only show selectable layers in this list" was already checked). I set the source layer to Zuko_Parcel and the spatial selection method to "are within a distance of the source layer feature". Finally, I entered .25 miles for the search distance. The selection found 67 out of 643 parcels or features. Therefore, 67 records were selected.


6. List the zoning type(s) with description found in the parcel area.

A: I found mostly A1 or A3 zoning types. A1 is for General Agriculture and A3 is for RES Agriculture Est. However, I did find other types of zoning including: B1, Neighborhood Business, B2, Community Business, B5 which is, Heavy Business, P-MH, Mobile Park, R1, Single family Dwelling and R4, which is for Residential Mixed Use. I made the below table and placed it on my Zoning Map for reference. 



Scenario 2: Gulf County, FL 


7. How many parcels are owned by Gulf County?

A: To get to this point, I started an editing session to annotate the newly created polygon, Object ID number 16911, to show that it too was owned by "Gulf County". Then, I used the Selection by Attribute expression "OWN_NAME" LIKE "GULF COUNTY" as provided in the Lab instructions. The Selection revealed that there are 75 parcels owned by Gulf County.
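
The same attribute selection could also be scripted. A sketch, where "parcels" is an assumed layer name and the expression is adapted from the Lab's (with the single quotes and wildcard that file-geodatabase SQL typically expects):

```python
import arcpy

# "parcels" is an assumed layer name in the map document
arcpy.SelectLayerByAttribute_management(
    "parcels", "NEW_SELECTION", "\"OWN_NAME\" LIKE 'GULF COUNTY%'")
# Count the selected records
print arcpy.GetCount_management("parcels").getOutput(0)
```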


8. How many land parcels are greater than 20 acres?

A: I manually reviewed the parcels and found only a few that were greater than or equal to 20 acres. After running the query provided in the Lab instructions (“Use the “Query Builder…” button to create the following expression: Acres >= 20”), I found 12 parcels that were both Gulf County owned and equal to or greater than 20 acres.

Overall, this was a great Lab and I can't wait to have a job doing this kind of work every day!

Friday, July 10, 2015

Explore/Manipulate Spatial Data

This has been a difficult week. As we approach the Final for GIS Programming, I am having feelings of apprehension regarding my Python coding skills. This week we worked on writing a script that would create a new geodatabase, copy data from one folder into the new geodatabase and populate a dictionary.

The reading included Chapter 6, "Exploring Spatial Data", and Chapter 7, "Manipulating Spatial Data", from our textbook by Paul A. Zandbergen. This was all good information with appropriate sample code. Yet, I still had hurdles I could not surmount.

Creating a new geodatabase wasn't that bad:

>>>arcpy.CreateFileGDB_management("S:/GISProgramming/Module7/Results", "gcastillo.gdb") # Creates new gdb.

Copying data from one folder into the new geodatabase wasn't even that bad:

>>>arcpy.CopyFeatures_management(fc, "S:/GISProgramming/Module7/Results/gcastillo.gdb/" + fcdesc.basename)
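
Putting those two pieces together, the copy step presumably sits inside a loop over the feature classes; a sketch with my own variable names:

```python
import arcpy

arcpy.env.workspace = "S:/GISProgramming/Module7/Data"
arcpy.CreateFileGDB_management("S:/GISProgramming/Module7/Results",
                               "gcastillo.gdb")

# Copy every feature class in the workspace into the new geodatabase
for fc in arcpy.ListFeatureClasses():
    fcdesc = arcpy.Describe(fc)
    arcpy.CopyFeatures_management(
        fc, "S:/GISProgramming/Module7/Results/gcastillo.gdb/"
        + fcdesc.basename)
```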


But, I have not been able to populate the dictionary properly and then get it to print out. The "Good People" from the GIS5103 Module 7 Discussion Board even posted the following:
                           
---To populate dictionary---
county_seats.update({row[0]: row[2]})  # one-line form
# or, equivalently:
name = row[0]
population = row[2]
county_seats.update({name: population})
print county_seats
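
Stripped of the arcpy cursor, the dictionary pattern itself is plain Python and can be checked with made-up rows (the names and populations below are invented for illustration):

```python
# Each tuple stands in for a cursor row: (name, geometry, population).
# These county seats and populations are invented for illustration.
rows = [
    ("Alachua", None, 10000),
    ("Brevard", None, 20000),
]

county_seats = {}
for row in rows:
    name = row[0]        # field 0: county seat name
    population = row[2]  # field 2: population
    county_seats.update({name: population})

print(county_seats)  # {'Alachua': 10000, 'Brevard': 20000}
```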

Yet, as many times as I ran my script, this is all I could muster:


I am posting this now because I really do not like to wait til the last possible minute, and in fact, I did not procrastinate on this assignment. I just could not get the correct result. Here's to better luck next time...cheers...

Thursday, July 9, 2015

Location Decisions - Where to Live

Almost there....stay on target...we are at the penultimate week for GIS4048. Well, of course, then there's the final project...but we won't discuss that right now.

For this Module we are working on how to use the Weighted Overlay Tool and conduct a thorough analysis for a couple that is moving to Alachua County.  But this is not just any couple; more on this later.

Our goal was clear: find the perfect location for this couple's new home. So, here is where the challenge comes in. The criteria are:

The new home location must be close to North Florida Regional Medical Center (NFRMC)
The new home location must be close to UF
The new home location must be a neighborhood with a high percentage of people 40 to 49 years old and
The new home location must be a neighborhood with high home values

So, where to begin?

I put together my map beginning with a base map of Alachua County. After this I learned how to use the Euclidean Distance tool and applied it to NFRMC and UF. I had to reclassify the layers (Reclassify tool) to make it possible to interpret and compare the results later. I also calculated the percentage of population aged 40-49, and homeownership, since these were also my "clients'" criteria. I summarized the data and presented my map to them:




The above map could be used to discuss the four criteria with my "clients", but we are not done with this hypothetical scenario. It seems my "clients" discovered traffic in Alachua County and want to be very close to work to shorten their commute (they should see the traffic in Los Angeles; I learned to drive there). She is a doctor and he is a professor. They are smart and determined. It will be hard to please them, but I am going to try.

So, the next phase of this map creation was to add Weighted Overlays. This was really interesting to me. The Weighted Overlay Tool was very cool. This tool allows you to take layers and give more "influence" to certain data or criteria. I sat down and had a deep discussion with my "clients":

          The couple is very concerned about the commute to work. Prof. UF knows he will have a set schedule that will only vary with the classes he is teaching; in fact, he expressed that when not in class, he could work from home. Mrs. Doctor may have to go to the hospital at various times of the day depending on medical emergencies and patient obligations. Therefore, living very close to NFRMC is the most important factor, followed by homeownership. The age group was not a concern; in fact, younger or older neighbors made no difference to them by this point. Taking these factors into consideration, the weighting for my final map was:

Reclass_Own=20
reclas_Age=10
reclas_UF=10
reclas_hosp=60

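Conceptually, the Weighted Overlay boils down to a weighted sum of the reclassified layers. A map-algebra sketch of just that part (the real tool also handles restricted values and scale remapping; the output path is a placeholder):

```python
import arcpy
from arcpy.sa import *

arcpy.CheckOutExtension("spatial")
# Weights mirror the percentages above and must total 100%.
suitability = (0.20 * Raster("Reclass_Own")
               + 0.10 * Raster("reclas_Age")
               + 0.10 * Raster("reclas_UF")
               + 0.60 * Raster("reclas_hosp"))
suitability.save("S:/GISApplications/Module10/Results/suitability")
arcpy.CheckInExtension("spatial")
```
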
I also used ModelBuilder to crunch the data with the tool.

In the end, I believe my "clients" will find this map very useful in determining where they will live, work and play.



Thursday, July 2, 2015

Homeland Security: Protect and The Boston Marathon

We are wrapping up our study of MEDS (Minimum Essential Datasets) and the Department of Homeland Security. This week we completed our maps using the military template by compiling layers of data on critical infrastructure around Boston for the 2012 Marathon. The overall task was to identify critical infrastructure, such as hospitals, schools, dams and airports, and to analyze various line-of-sight locations around the finish line where security cameras could be placed. There were quite a few tasks to accomplish for this lab: analyze data; add the MEDS data we compiled last week to our scenario map for this week; create a buffer around the event site; create a security buffer around critical infrastructure; identify and secure ingress and egress routes; generate hillshade; create surveillance points; generate viewshed; and create line-of-sight graphs and a view in 3D. This last part was very cool and I learned quite a bit doing it, so I would like to list my actions and steps for this last part:

1- To create the line-of-sight profile graph, I had to first find where the tool was.
   a. I asked myself, “Where is this? Is this on the Draw toolbar...???” I had to Google ArcGIS Help to confirm where this “tool” was located.
   b. I selected the Draw Arrow (next to the Drawing drop-down window directly under the "Draw" title). I thought, “this can’t be that hard.”
   c. I thought wrong.

2- Initially, the option was not available for me to select. That is, I could not select the “Profile Graph” on the 3D Analyst toolbar.
   a. Finally, the Create Profile Graph became available for me to select and was not greyed out.
   b. I think I had to be in Layout view, not Data view.

3- The blue handles appeared when I double-clicked on the point. However, I did not need to double-click: when I did this a second time (for step 9 below), I clicked once and the blue handles appeared and a box was placed around the line.
4- After making my box around the surveillance point of my choice, the graph popped up and I was able to select properties and enter a title and subtitle. I did this for several points.
5- I exported the graphs for later use and saved what I had before moving on to the next step.

6- To create the view in 3D, I had to use ArcScene.
   a. The first time I did this I did not notice that the ArcScene icon was on the 3D Analyst toolbar, so I opened ArcScene from my desktop.
7- I added the layers as instructed and I recall thinking, "I vaguely remember doing something with the Base Heights tab a long time ago..."
8- I almost missed the step, “Make sure the Factor to convert layer elevation values to scene units is set to Custom 1.0 and click OK”, but I caught it just before exiting.
9- Then it was back to ArcMap to select, copy and paste the line-of-sight from each surveillance point. This was difficult to get to work at first; however, after I did this a couple of times, all was good.
10- This was an excellent learning point, going back and forth between ArcMap and ArcScene.
   a. I saved my work and tried to export it as a 3D file (finishline_lineosight_gc.wrl), but this did not seem to be what I wanted.
   b. So, I tried again and selected export as 2D and then as a .jpg (finishline_lineosight_gc.jpg). Much better this time.

11- After this it was time to compile my map.

This was a very involved lab that took two weeks to complete. I learned a lot these past weeks and I am sure I will continue to advance in my skills and knowledge.