April 20, 2007

Data Dictionary Integration & Uploading

The two items in the title are very different pieces of work, but both are vital to the project.

The data dictionary has now been integrated into version 0.5! You will now be presented with a button to show/hide the list of attributes that the data dictionary associates with each shape file. You may then check which attributes you want and click the "Get Selected in KML" button to have the site generate a KML file for the given boundaries and selected attributes.
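
Under the hood, the handler only needs to intersect the user's checked boxes with each row's columns when building the KML descriptions. A sketch (the attrs[] parameter and function name are mine, not the actual version 0.5 code):

```php
<?php
// Minimal sketch of the "Get Selected in KML" handler. The parameter
// name (attrs[]) and the helper below are illustrative, not the
// actual version 0.5 code.
$selected = isset($_GET['attrs']) ? $_GET['attrs'] : array();

// Build a KML description for one database row using only the
// attributes the user checked off in the data dictionary list.
function description_for_row($row, $selected)
{
    $parts = array();
    foreach ($selected as $attr) {
        if (isset($row[$attr])) {
            $parts[] = htmlspecialchars($attr . ': ' . $row[$attr]);
        }
    }
    return implode('<br/>', $parts);
}
?>
```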

We now have an upload page that allows users to submit their own files to the browser. A user must provide their name, team, a password ('password'), a file, and a title for the file. Additionally, they may enter a latitude/longitude coordinate pair or an address. If neither lat/lon nor an address is provided, we automatically assign the file the polygon matching the user's team. For lat/lon, we make sure the coordinates are at least within a reasonable range of Detroit. The address is graciously geocoded for free by geocoder.us. These entries are shown on the visual browser in a shapefile block called Usersub; if we add Usersub to the data dictionary, a more appropriate title can be displayed. One problem: for some unknown reason we can't actually upload files yet. Once we can, we can also generate links to the files for download. Yeah!
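
For the record, the location logic boils down to something like this. A sketch only: the Detroit bounding box numbers and the geocoder.us CSV handling are my assumptions, not the deployed code.

```php
<?php
// Sketch of the upload page's location logic. The Detroit bounding
// box and the geocoder.us response handling are assumptions, not
// the code as deployed.
$lat     = isset($_POST['lat'])     ? trim($_POST['lat'])     : '';
$lon     = isset($_POST['lon'])     ? trim($_POST['lon'])     : '';
$address = isset($_POST['address']) ? trim($_POST['address']) : '';

// "Reasonable range of Detroit" as a rough lat/lon window.
function near_detroit($lat, $lon)
{
    return $lat > 42.0 && $lat < 42.7 && $lon > -83.5 && $lon < -82.7;
}

// geocoder.us offers a free CSV service; the response starts with
// "lat,lon,..." (format assumed from their docs).
function geocode($address)
{
    $url = 'http://geocoder.us/service/csv/geocode?address=' . urlencode($address);
    $csv = file_get_contents($url);
    if ($csv === false) return null;
    $fields = explode(',', $csv);
    return array((float)$fields[0], (float)$fields[1]);
}

if ($lat !== '' && $lon !== '') {
    if (!near_detroit((float)$lat, (float)$lon)) {
        die('Those coordinates are nowhere near Detroit.');
    }
} elseif ($address !== '') {
    $result = geocode($address);
    if ($result === null) die('Could not geocode that address.');
    list($lat, $lon) = $result;
} else {
    // Neither given: fall back to the polygon matching the user's team.
}
?>
```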

Posted by shawse at 02:10 PM | Comments (0)

March 23, 2007

Organizing Output

Another day, another version... yes, version 0.4 of the visual browser is up and running (http://localhost/visbrow/v1/map_explorer4.php). In this version, when you press the button you are presented with a link to download the data for all the returned shape files in a single KML file. Then the first shape file's title is presented, below it a link to download just that shape file's data as a KML file (named after the shape file), and then the name of each of its data points along with a centroid (a new feature in and of itself). We then continue on to the next shape file, and so on.
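
The centroids come straight from PostGIS, so the new feature is mostly one query. A sketch (table and column names are stand-ins for our real schema):

```php
<?php
// Ask PostGIS for each feature's name plus its centroid as WKT.
// (A sketch -- 'parcels' and 'the_geom' stand in for the real schema.)
$db  = pg_connect('dbname=visbrow');
$res = pg_query($db, "SELECT name, AsText(Centroid(the_geom)) AS centroid
                      FROM parcels");
while ($row = pg_fetch_assoc($res)) {
    echo $row['name'] . ' -- ' . $row['centroid'] . "\n";  // e.g. POINT(-83.05 42.33)
}
?>
```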

The next step is to implement the web data dictionary so we can integrate it as well and then, when we get it, integrate GeoServer (which works on the slow machine).

Posted by shawse at 12:47 PM | Comments (0)

March 22, 2007

Getting stuff into Google Earth

I spent this morning working on a little script that will take a bounding box and a list of shape files and then plop all the data into a file called Visbrow.kml, ready for download. Each shape file has its own folder within the KML, and all the shapefile data is put into the description field for each individual item.
The handy little script is here: localhost/visbrow/v1/2kml.php
An example URL would be this: localhost/visbrow/v1/2kml.php?shapes=historic_point,parcels&bounds=-83.057637,42.333508,-83.044763,42.3420922 which would get you a KML file with all the parcel and historic_point shape file data (each in its own folder) for that bounding box. But, once again, you have to be logged in as Jen and run the XAMPP webserver to use it. When is our live machine going to be available??
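
For the curious, the guts of 2kml.php boil down to a loop like this. Only a sketch: the connection string, the the_geom column, and the one-table-per-shape-file assumption are mine, and a real version needs input escaping.

```php
<?php
// Rough shape of 2kml.php: one KML <Folder> per shape file, with every
// column of every row stuffed into the Placemark description.
// (A sketch -- schema names are placeholders, and there is no input
// escaping here, which the real script would need.)
$shapes = explode(',', $_GET['shapes']);   // e.g. historic_point,parcels
list($minx, $miny, $maxx, $maxy) = array_map('floatval', explode(',', $_GET['bounds']));

$db   = pg_connect('dbname=visbrow');
$poly = "POLYGON(($minx $miny,$minx $maxy,$maxx $maxy,$maxx $miny,$minx $miny))";

header('Content-Type: application/vnd.google-earth.kml+xml');
header('Content-Disposition: attachment; filename=Visbrow.kml');
echo "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n";
echo "<kml xmlns=\"http://earth.google.com/kml/2.1\"><Document>\n";

foreach ($shapes as $shape) {
    echo '<Folder><name>' . htmlspecialchars($shape) . "</name>\n";
    $res = pg_query($db, "SELECT * FROM $shape
                          WHERE the_geom && GeomFromText('$poly', 4326)");
    while ($row = pg_fetch_assoc($res)) {
        $desc = '';
        foreach ($row as $col => $val) {
            if ($col != 'the_geom') $desc .= "$col: $val<br/>";
        }
        echo "<Placemark><description><![CDATA[$desc]]></description></Placemark>\n";
    }
    echo "</Folder>\n";
}
echo "</Document></kml>\n";
?>
```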

Current plan of attack:

  1. Integrate 2kml into the visual browser so users can download stuff with a button click
  2. Build a web version of the data dictionary
  3. Integrate the web data dictionary with the visual browser so users can select desired fields for download into their KML file. Come to think of it, I could even let them name the KML file.
Questions, comments, or concerns about this list?

Posted by shawse at 12:38 PM | Comments (1)

March 08, 2007

Historic Points

Yeah!!!
We can now successfully query a database and return the data to the map viewer page without refreshing the page! The current prototype, found here, requires you to be logged on as Jen so you can run the XAMPP webserver and query the database. If you click the "Submit Query" button, the form submits a request to a PHP script called shapelister.php, which processes the bounding box, queries the database, and returns the resulting database rows. The prototype is currently checking against a copy of the historic_point shape file. So, click the button, and all the historic points that exist within the bounds of the viewer will be returned with their names and longitude/latitude coordinates.
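
The server side of that round trip is small. Roughly (a sketch with placeholder connection details and column names, not the actual shapelister.php):

```php
<?php
// shapelister.php, approximately: take the viewer's bounding box,
// query historic_point, and print matching rows for the page's
// XMLHttpRequest to display. (A sketch, not the actual script.)
list($minx, $miny, $maxx, $maxy) = array_map('floatval', explode(',', $_GET['bounds']));

$db   = pg_connect('dbname=visbrow');
$poly = "POLYGON(($minx $miny,$minx $maxy,$maxx $maxy,$maxx $miny,$minx $miny))";

$res = pg_query($db, "SELECT name, X(the_geom) AS lon, Y(the_geom) AS lat
                      FROM historic_point
                      WHERE the_geom && GeomFromText('$poly', 4326)");

while ($row = pg_fetch_assoc($res)) {
    echo "{$row['name']}: {$row['lon']}, {$row['lat']}\n";
}
?>
```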

The next step is to load the data from our shape files into a new table listing all of our geometries and their respective shape files. Once GeoServer is running, we could also return the data as layers to display in the viewer and provide links to download the datasets as KML files.
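
That new table could be as simple as the sketch below; the names are placeholders, not a settled schema.

```php
<?php
// One catalog table holding every geometry plus the shape file it
// came from. AddGeometryColumn registers the column with PostGIS.
// (A sketch -- names are placeholders, not a settled schema.)
$db = pg_connect('dbname=visbrow');
pg_query($db, "CREATE TABLE all_geometries (
                   id        serial PRIMARY KEY,
                   shapefile varchar(64),    -- e.g. 'historic_point'
                   name      varchar(128)    -- the feature's display name
               )");
pg_query($db, "SELECT AddGeometryColumn('all_geometries', 'the_geom', 4326, 'GEOMETRY', 2)");
?>
```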

Posted by shawse at 12:06 PM | Comments (0)

February 22, 2007

EPSG and WGS and SRIDs, Oh My!

After some pain, sweat, and tears I have learned a lot about projections. For future reference: WGS84 is the coordinate system commonly used for latitude and longitude, also known as EPSG:4326, and its SRID is that same number. The historic points shape file, on the other hand, uses SRID 2253 and NOT 26990 (long story). Fortunately, we can use the ArcGIS Toolbox to reproject the data to WGS84, which will save us the headache of identifying each shape file's SRID. We can now get our shape files to map to the proper projection and then query across the data points using a bounding box (provided by the map explorer).
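
(Should we ever want to skip the ArcGIS step, PostGIS can also reproject on the fly with its transform() function. A sketch, with placeholder table and column names:)

```php
<?php
// Reproject on the fly: historic_point is stored as SRID 2253, but
// transform() hands it back as WGS84 lat/lon. (A sketch -- table and
// column names are placeholders.)
$db  = pg_connect('dbname=visbrow');
$res = pg_query($db, "SELECT name, AsText(transform(the_geom, 4326)) AS wgs84
                      FROM historic_point");
while ($row = pg_fetch_assoc($res)) {
    echo $row['name'] . ': ' . $row['wgs84'] . "\n";
}
?>
```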

It appears that I am going to do some Perl script writing, however, to get all the data into the same place (actually three places, but anyway...). As it is, our importer spits out way too much data, so I need to write a Perl script (my kingdom for Bash) to separate the wheat from the chaff. Because some of our shape files have point data (historic points), others line data (roads), and still others polygons (parcels), we need to store the data in three database tables and query them separately; I can then take all the results and compile them together. The import script can detect which geometry type each record is and select the appropriate table from there.
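
The routing logic itself is trivial. The real script will be Perl, but here it is sketched in PHP like the rest of the site, with placeholder table names:

```php
<?php
// Pick the destination table from the WKT prefix of each record.
// (The actual import script will be Perl; this PHP sketch just shows
// the routing logic, and the table names are placeholders.)
function table_for_geometry($wkt)
{
    if (preg_match('/^(MULTI)?POINT/i', $wkt))      return 'shape_points';
    if (preg_match('/^(MULTI)?LINESTRING/i', $wkt)) return 'shape_lines';
    if (preg_match('/^(MULTI)?POLYGON/i', $wkt))    return 'shape_polygons';
    return null;  // anything else gets skipped
}

// e.g. table_for_geometry('POINT(-83.05 42.33)')  => 'shape_points'
?>
```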

While you can currently see a version of the map explorer that shows the WKT polygon geometry to query the database against, the live machine has no database to query. As a consequence, I set up a Dreamweaver site that uses the XAMPP tool to run a temporary webserver on the Novell machine. This way the PHP script has access to the database that stores our data, and I can run tests against it. The small problem with this is that you need to be logged in as Jen to make it work.

All in all, progress is being made. My next task is to work with the temp webserver and see if I can get the PHP script to effectively query the database.

Posted by shawse at 12:07 PM | Comments (0)

February 12, 2007

VisBrow - the beginning

At last, it has begun. Most of my time was spent reading docs to start wrapping my mind around the problem and the resources available.

I was thinking that a simple process for the app could be:

So far I can:

Next up:

Posted by shawse at 01:54 PM | Comments (0)

January 20, 2007

Some thoughts after meeting with Jennifer, project coordinator

So we've identified two products we'd like to produce for the Workshop at this point: 1) a data finder help sheet and 2) a website to browse the (created) map images and, hopefully, the data elements included in those map files.

Some of the things the website needs to do are:

  1. show images of maps (samples) to act as identification points and link to the report pages
  2. deal with the interrelationships between the variables: the image (representing the mxd file), attributes, layers, and subjects

I'm thinking that the two most important elements (the first to tackle) will be getting the image and the attributes together. The title of the map should also be there somewhere, since people tend to attach the title to the image if there is one.

Adding pointers to layers would be an added bonus.

Posted by greenjen at 04:41 PM | Comments (0) | TrackBack

January 19, 2007

Charette Cache

I just put the finishing touches on our first set of Charette Cache Loaders.
The first is the "Perim Path" path, which sweeps over the extended neighborhoods of interest for the Charette (as indicated by the provided maps).
The second is the "Core Area Cache" path, which sweeps over the core of downtown Detroit to grab better images.

Both require the user to set the touring options to certain settings, such as the "camera range" (the distance between the camera and the ground). For Perim Path the camera range must be between 1.5 km and 2 km, whereas Core Area Cache works best around 300 m. The larger camera range makes the images grainier, but they are sufficient for a general idea of this extended area.

They are by no means perfect, although they do the job well enough. Any suggestions for improvement are welcome.

Posted by shawse at 11:22 AM | Comments (0) | TrackBack