Category Archives: geospatial

Related to spatial geometries and libraries

Less is (almost always) more

This morning, I came across a tweet linking to a National Geographic article on data visualization and eye candy, which I emailed to a few friends (when we want to carry on a more in-depth conversation, Twitter just doesn’t cut it):

http://news.nationalgeographic.com/2015/09/150922-data-points-visualization-eye-candy-efficiency/

“Find a metaphor, start simple, and build up from there.”

One, who works at an economics research organization, replied with:

“We have lots of tools that allow relatively informed users to explore data and answer questions that they have, but few of our visualizations actually stick to a single story line.  We’re trying to improve the balance of exploratory tools vs. simple, compelling stories.”

My response:

That’s highly similar to the approach often taken with interactive mapping interfaces – either attempting to duplicate desktop GIS functionality or showing a particular facet of data with spatial attributes. Finding the balance between them is tricky. Generally, though, end users want to answer one or two questions.

The trails web app I linked to recently – https://helenamontanamaps.org/Html5Viewer/?viewer=trails – is about as far as I’d ever go toward the GISFunc side of things anymore (there are a few gee-whiz, that’s-cool features, like mixed transparency for the base layers…but are they really necessary in most cases? No way).

http://mapbrief.com/2015/01/09/daring-to-build-a-citizen-centric-homepage-the-philadelphia-story/ is one of the best pieces I’ve read on user-focused functionality.

Incidentally, I read that NatGeo article on my phone and many of the visualizations were too small to be intelligible. For some reason, this one on landslides stood out to me as good on the phone (although on my desktop monitor the map background is barely visible):

landslides

A couple of days ago, one of these correspondents sent me a link to a draft page showing all kinds of data related to load times, download sizes, CMSes used, and number of subscribers for a raft of news publications. I can’t share that page in its current state, but will say that I wrote back encouraging him to make it way simpler. Just ’cause Tableau gives you a kitchen-sink toolbox doesn’t mean you have to use it.

 

Real-time mapping with SignalR and a client app

A couple months ago I watched a live seminar given by @bradygaster and @jongalloway on SignalR. During their presentation, they demoed a web mapping app where all the attendees could share their locations with anyone viewing the web page. It occurred to me that capturing GPS data from a phone would be a good test of SignalR’s capabilities – namely how quickly and reliably it could transfer rapid bursts of data. I put together a web app with a Bing Map and a barebones Windows Phone client app that sends GPS data (lat, lon, altitude and speed) to a SignalR hub running in the web app.

Every time data is received, the web map displays a red dot. Every 30th time, it displays a custom marker showing altitude and speed. If the data is coming from the phone app, this translates into a marker roughly every 300 meters. Using the apps while driving was interesting in that it graphically pointed out gaps in cellular data coverage between Helena and Missoula, Montana. While running or biking it was also interesting – to a certain extent.

rtr2

I actually left a screen capture video running during one bike ride. On the descent I reached 48 mph. Yet reviewing even that portion of the video later was like watching paint dry. So if you’re thinking it would be exciting to watch your friends run in real time on a map, well. That isn’t to say that something like this doesn’t have applications. On the commercial side – while the developers of SignalR make no warranties as to its suitability – an app running in the background could show delivery driver or utility worker locations. Or as your SWAT team is moving into position…uhm, in light of current events we won’t go there. Need to know where your kids are?

Personally, I’d like to be able to fire up an app and email a link to one or more friends so they know where to go to help me get an elk off some rocky, heavily timbered mountain. (I actually did send a screenshot from the phone app to a friend this year, mostly because I wanted someone to know where I was, since it was -10°F. In that situation, of course, the first and best thing I did was build a healthy fire.)

In the interest of getting this post done, I’m not going to walk through every project step. I’d encourage anyone with Visual Studio to clone https://github.com/stonetip/realtimeRun, which contains the complete project code aside from NuGet packages (just do a restore). A few things:

  1. Run VS as Administrator, particularly if you’d like to create a virtual directory for the web app. This allows testing from a device, which is nice if you actually want to walk around within range of your wi-fi router.
  2. While it’s great to use the Location tab/map in a Windows Phone emulator, it’s a pain to get the emulator to see a local machine on your network. I recommend setting up a test site on Azure and deploying to it (you’ll need to configure the URL in the Windows Phone app to point there). Then the emulator will work AND you can play with the app anywhere you have a wi-fi or cellular data connection.
  3. SignalR supports WebSockets. IIS8 does too, but you’ll need to enable it under Windows Features:

    enable_ws

    If using Azure, you’ll need to visit the CONFIGURE section for the website inside the Portal and enable it there. If you don’t feel like doing either, no big deal. The other transports are fine – SignalR will intelligently pick one.

 

The Web App

Creating a SignalR web app is fairly easy (provided you’ve added the requisite NuGet packages). In an OWIN-based web app like this one, a single line added to the Startup.cs file will kick it off:

app.MapSignalR();

The next step is to create a Hub class. In my case, this is a simple class that receives data from a client and passes data to all clients that have the client method “broadcastLatLon”:

using Microsoft.AspNet.SignalR;

namespace RealtimeRun
{
    public class MapHub : Hub
    {
        // The Send method is used on a client to deliver data to the server
        public void Send(string lat, string lon, double? altitude, double? speed)
        {
            // In return, the server broadcasts the data to all clients.
            // Those that have the broadcastLatLon method will use the data on the client
            Clients.All.broadcastLatLon(lat, lon, altitude, speed);
        }
    }
}

That’s it for the backend code. Everything else in the web app is done in the client page, in this case map.aspx. One thing you’ll need is to provide a Bing Maps API key (info here). You could, of course, adapt this sample to use Google Maps, HERE Maps or any other web mapping service. We’re just dealing with client and server methods that send text strings – nothing mysterious or requiring a master’s in GIS.

Note that in this app I use a secondary config file, AppSecrets.config, to avoid putting my key directly in Web.config and having it exposed in the GitHub repository (see http://www.mattburkedev.com/keep-your-azure-secrets-safely-out-of-git/). BTW, if you’re using Visual Studio with Git, I highly recommend the GitHub VisualStudio.gitignore file. It’s chock full of exclusionary goodness. The Web.config file contains a commented-out section showing what the contents of AppSecrets.config should look like.

The map.aspx file shows a map along with a convenient test button (which will place a random marker within a few kilometers of your location each time it’s clicked) and a line showing the coordinates, altitude and speed:

web_app

Once you get the app up and running, try clicking the daylights out of it. This will give you an immediate visual indication that things are working. I won’t go into more detail on the map.aspx file. The code is well-commented.
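For reference, scattering a random marker within a few kilometers of a point is only a couple of lines of math. Here’s a sketch in Python (the page itself would do this in JavaScript, and the function name is mine), using the rough rule that one degree of latitude is about 111.32 km:

```python
import math
import random

def random_point_near(lat, lon, max_km=3.0):
    """Pick a random point within roughly max_km of (lat, lon).

    Flat-earth approximation -- fine for a few km, away from the poles."""
    r = max_km * math.sqrt(random.random())  # sqrt gives uniform density over the disk
    theta = random.uniform(0, 2 * math.pi)
    dlat = (r * math.sin(theta)) / 111.32  # ~111.32 km per degree of latitude
    # Longitude degrees shrink by cos(latitude)
    dlon = (r * math.cos(theta)) / (111.32 * math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon
```

Each click of the test button effectively does something like this and drops a marker at the result.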

Update

I was asked what would happen if more than one person (or even one person using multiple browsers or devices) was sending data. All of it would show up on the map. That’s because this sample doesn’t differentiate between clients. SignalR does provide support for distinct users. Check out Working with Groups in SignalR for more on this.

 

The Phone Client

This is about as barebones a client as you can get…just enough UI to let you know it’s working. A couple of things you need to do:

  1. Open WMAppManifest.xml and on the Capabilities tab, make sure that ID_CAP_LOCATION is active.
  2. In App.xaml.cs, add these variables and change or add these two event handlers:
    public static Geolocator Geolocator { get; set; }
    public static bool RunningInBackground { get; set; }
    
    private void Application_RunningInBackground(object sender, RunningInBackgroundEventArgs args)
    {
        RunningInBackground = true;
    }
    
    private void Application_Activated(object sender, ActivatedEventArgs e)
    {
        RunningInBackground = false;
    }
    

Next we’ll turn our attention to the MainPage.xaml layout. All that’s needed is one UI element inside the LayoutRoot:
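A minimal sketch of that element – the TextBlock name matches the one referenced in MainPage.xaml.cs; the Text and layout attributes are just plausible guesses:

```xml
<Grid x:Name="LayoutRoot" Background="Transparent">
    <TextBlock x:Name="TblockLatLonBlock"
               Text="Waiting for GPS..."
               TextWrapping="Wrap"
               Margin="12" />
</Grid>
```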


Just a simple TextBlock to tell us the GPS is working by displaying coordinates, altitude and speed. In MainPage.xaml.cs, after adding or restoring the NuGet packages, there’s not a lot to do. The SignalR part is minimal: a couple of private vars, a start method (which gets invoked via the Loaded event in the MainPage ctor), and a send method, which I imaginatively named SendMessage.

public async Task StartSignalRHub()
{
    try
    {
        _hubConnection = new HubConnection(SiteUrl);

        _hub = _hubConnection.CreateHubProxy("MapHub");

        await _hubConnection.Start();
    }
    catch (Exception err)
    {
        Debug.WriteLine(err.Message);
    }
}

public void SendMessage(string lat, string lon, double? altitude, double? speed)
{
    try
    {
        _hub.Invoke("send", lat, lon, altitude, speed);
    }
    catch (Exception err)
    {
        Debug.WriteLine(err.Message);
    }
}

Similarly, the geolocation portion consists of a start method to fire it up (which gets invoked either upon loading or navigating to the page):

public void StartGeolocation()
{
    if (App.Geolocator != null) return;

    App.Geolocator = new Geolocator { DesiredAccuracy = PositionAccuracy.High, MovementThreshold = 10 }; // 10 meters (to limit data transmission)
    App.Geolocator.PositionChanged += Geolocator_PositionChanged;
}

private void Geolocator_PositionChanged(Geolocator sender, PositionChangedEventArgs args)
{
    var roundedLat = Math.Round(args.Position.Coordinate.Latitude, 6);
    var roundedLon = Math.Round(args.Position.Coordinate.Longitude, 6);
    var altitude = args.Position.Coordinate.Altitude;
    var speed = args.Position.Coordinate.Speed != null && Double.IsNaN((double)args.Position.Coordinate.Speed) ? 0 : args.Position.Coordinate.Speed;

    Debug.WriteLine("{0}, {1} altitude: {2}, speed: {3}", roundedLat, roundedLon, altitude, speed);

    SendMessage(roundedLat.ToString(CultureInfo.InvariantCulture),
        roundedLon.ToString(CultureInfo.InvariantCulture), altitude, speed);

    Dispatcher.BeginInvoke(() =>
        TblockLatLonBlock.Text =
            String.Format("{0}, {1} altitude: {2}, speed: {3}", roundedLat, roundedLon, altitude, speed));
}

Note in StartGeolocation that the MovementThreshold has been set to 10 (meters). I did this because I wanted to generate a reasonable amount of data but not dump tons of points onto the map. Similarly, in the map.aspx page, a label is generated only every 30 points, so if the data is coming from the phone app that means about every 300 meters. As mentioned before, it’s usually easier to test and debug if you’ve published the website to Azure – or anywhere capable of running the web app that has a resolvable domain name. If you’ve done the voodoo that allows your local machine to be visible to your Windows Phone emulators, that’s great too. I just never have time to mess with that.

Whatever you use, the SiteUrl string needs to be set to the root of that site, e.g. “http://myMapTest.azurewebsites.net/”, so that the phone app has somewhere to send its data. When everything is set up, open a browser window to the map.aspx page. With the phone app as the startup project, launch it for debugging and try it out. Under Additional Tools in the emulator, if you select Location, you can set a series of points and then replay them. For example, I’ve created a reenactment of the historic canoe route from my former digs on Juanita Dr. in Kirkland, WA to Spud’s fish ‘n’ chips. (The return trip usually was much slower, but paddling hard alleviated our guilt over chowing down on all those fries):

juanita_dr

Hit play and watch the epic voyage unfold in the web browser: spuds

That’s “Code” Comfort

I’ve been working on a mobile app that will support adding vector-based mapping polygons. Using California and Massachusetts to test MULTIPOLYGON states against a map extent spanning the lower 48, I imported the data, ran the code and, when I saw the visual, thought something was messed up:

How could the code be putting California at the top of the image? I went back to my source map and, sure enough, the northern tip of California really does go as high as central Massachusetts. It occurred to me that it could be an artifact of the Lambert Conformal Conic projection for the contiguous U.S. that I’m using. But it turns out the northern boundary of California is 42° – and so is the southern boundary of western Massachusetts (or at least that seems to have been the intention at the time, given the quality of surveying and the conditions facing surveyors – see the Granby Notch story: http://www.nytimes.com/2001/01/26/nyregion/driveway-another-state-blunder-1642-creates-headaches-today-for-homeowners-who.html).

File:Masscolony.png

Incidentally, several states share 42° as an official boundary:

File:42nd parallel US.svg

Sometimes our notions of things, based on preconceptions, distort reality. I can see where my perception of California as sitting in the warm southwestern part of the country, far from chilly, northeasterly Massachusetts, would make it hard to believe they could have boundaries near the same latitude. At least I know my routines work. That’s “code comfort.” Still lots of work to do!

Burn Map

https://maps.google.com/maps/ms?msid=217066668344035014417.0004df0ab0046cd029a58&msa=0

The area burning is a dense Ponderosa forest known as the Black Forest. I never went around there when living in the Springs from 1992–96, but I remember it from childhood (much earlier). It was really pretty. But you look at the maps and can see why all the houses there have been destroyed. The map that shows lost houses (http://gazette.com/article/1502250) is somewhat marred spatially because they used lot centroids rather than pinpointing the structures. When it’s all over, I’m going to suggest to their fire marshals (as well as “fire community” people I know) that when pre-mapping lots they have their GIS people identify the center of the largest visible structure. Why? Because when analyzing the data post-mortem, or trying to educate the public, it will show which structures were the most vulnerable.

Of course it isn’t as simple as saying,  “Being right under the trees will guarantee destruction. Being in a clearing will not.” Embers travel for miles. And often it’s some structural or landscaping flaw, e.g. dry vegetation right up against the house and under the eaves that causes ignition. An informative public presentation would show time lapse fire and wind travel superimposed on aerial imagery, along with contextual images of damaged structures.

My condolences to anyone who has lost a home in that area.

The “Evil” GEOMETRYCOLLECTION

Recently we encountered a bug in SQL Server’s spatial aggregation algorithms. Sets of polygons that should have been combined strictly into WKT (well-known text) POLYGONs or MULTIPOLYGONs were being stored as GEOMETRYCOLLECTIONs instead. It turns out the algorithm was inserting one or two LINESTRINGs (of two points each) into the collection, forcing it to be stored this way. The bug was supposedly fixed a couple of years ago, but I’m going to submit a new bug report and sample dataset to Microsoft, since I’ve double-checked that we’re using the latest version of the DLL, etc.

So why do I think the GEOMETRYCOLLECTION is “evil” (or at least don’t think highly of it)? Primarily because it caused our customer to question both the integrity of our software and the use of SQL Server to store AND manipulate spatial data, instead of relying purely on an ESRI ArcGIS solution. We were unable to open the SQL Server datasets via ArcCatalog or ArcMap (using the Database Connection in 10.1). On a broader note, I tend to side with ESRI in not supporting more than one data type in a layer, so I’m perplexed that the creators of the WKT standard even thought up such a datatype. I can see where it might make sense to group related geometric objects, but I believe there are better ways to do that.

I’m tempted (hey, in a post that mentions the word evil, the term seems appropriate) to say that this never would have happened in the first place if GEOMETRYCOLLECTION weren’t a possibility. But I know that even if the bug gets fixed, Microsoft and others will still adhere to the WKT specification for storing data, and ESRI will continue to invalidate coverages that contain multiple datatypes. So we will improve on the hack below to ensure that what gets output in the end is valid by our customer’s definition.


// Well, semi-verified anyway. At least it won't contain any LINESTRINGs.
public SqlGeometry VerifiedPolygonGeometry(SqlGeometry inputGeometry)
{
    SqlGeometry outputGeometry = null;

    try
    {
        var goodReservoirGeometriesList = new List<SqlGeometry>();

        int numGeometries = (int)inputGeometry.STNumGeometries();

        // STGeometryN is 1-based
        for (int i = 1; i <= numGeometries; i++)
        {
            var geometry = inputGeometry.STGeometryN(i);

            if (geometry.STGeometryType() != "LineString")
            {
                goodReservoirGeometriesList.Add(geometry);
            }
        }

        var agg = new GeometryUnionAggregate();

        agg.Init();

        foreach (SqlGeometry geometry in goodReservoirGeometriesList)
        {
            agg.Accumulate(geometry);
        }

        outputGeometry = agg.Terminate();
    }
    catch (Exception exception)
    {
        Debug.Write(exception.Message);
    }

    return outputGeometry;
}

Now That’s a Detour

Here’s a screenshot from Flightradar24. It’s typical of flights between Beirut and Amman. I can see why, given the situation in Syria, flights wouldn’t go that way. But I’d be curious to know whether planes detour around Israeli airspace because they aren’t allowed there, or because of some political decision on the Arab side.

beirut_amman

I find the app fascinating, especially when you see a flight heading right over Helena that’s going from somewhere like SFO to Dubai. Or even the Fedex flights from Anchorage and everywhere else that all still flock to Memphis. I wonder, despite the package processing infrastructure there, why someone hasn’t come up with a more efficient set of routing algorithms to use the least effort to get packages from A to B. Amazon Air anyone?

Lovin’ Me Some Cheap Libraries

In my MT State Library days, we worked with Double Metaphone in conjunction with the GNIS placenames data we had. It was remarkably accurate. I see now that the author of that algorithm has a new version, available as a C#, C++ or Java library for $40…quite reasonable for what it does. One of the problems I’ve had with open source is that it sometimes keeps decent code from being propagated, because the developer has no way to receive even modest compensation. On the other hand, I come across insane prices for code libraries all the time.
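To illustrate what phonetic matching buys you with placenames, here’s a sketch of Soundex – a much simpler, older cousin of Double Metaphone, implemented from the standard American Soundex rules – which maps variant spellings to the same short code:

```python
def soundex(name):
    """American Soundex: first letter plus three digits encoding consonant sounds."""
    codes = {}
    for group, digit in (("bfpv", "1"), ("cgjkqsxz", "2"), ("dt", "3"),
                         ("l", "4"), ("mn", "5"), ("r", "6")):
        for ch in group:
            codes[ch] = digit

    name = name.lower()
    first = name[0].upper()
    digits = []
    prev = codes.get(name[0], "")
    for ch in name[1:]:
        if ch in "hw":        # h and w don't separate same-coded consonants
            continue
        code = codes.get(ch, "")
        if code and code != prev:
            digits.append(code)
        prev = code           # vowels reset prev to ""
    return (first + "".join(digits) + "000")[:4]
```

So a misspelled “Helina” and the correct “Helena” both come out as H450 and can be matched against a gazetteer. Double Metaphone handles far more of English (and borrowed) pronunciation, which is why it worked so well against GNIS.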

What I like is something such as this where it makes sense to purchase because it works and it is so cheap. Recently, I paid $35 for a DLL that does datum shifts (moving lat/lon coordinates from one geodetic model to another). The code was available (after much digging) in Java. But I would have spent 3-4X that amount converting and testing it.

No affiliation with this company: http://www.amorphics.com/
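For the curious, the core of a simple datum shift isn’t magic. Here’s a sketch (my own, not the purchased library’s approach) of a three-parameter geocentric shift: convert geodetic coordinates to Earth-centered XYZ, apply a translation, and convert back. The WGS84 constants are standard; the translation values you’d pass in depend on the datum pair, and real transformations often need the full seven-parameter Helmert treatment:

```python
import math

def geodetic_to_ecef(lat, lon, h, a=6378137.0, f=1 / 298.257223563):
    """Geodetic (degrees, meters) to Earth-centered XYZ; defaults are WGS84."""
    e2 = f * (2 - f)
    lat, lon = math.radians(lat), math.radians(lon)
    n = a / math.sqrt(1 - e2 * math.sin(lat) ** 2)  # prime vertical radius
    return ((n + h) * math.cos(lat) * math.cos(lon),
            (n + h) * math.cos(lat) * math.sin(lon),
            (n * (1 - e2) + h) * math.sin(lat))

def ecef_to_geodetic(x, y, z, a=6378137.0, f=1 / 298.257223563):
    """Inverse conversion, iterating on latitude (not robust at the poles)."""
    e2 = f * (2 - f)
    lon = math.atan2(y, x)
    p = math.hypot(x, y)
    lat = math.atan2(z, p * (1 - e2))  # initial guess
    for _ in range(10):                # converges quickly
        n = a / math.sqrt(1 - e2 * math.sin(lat) ** 2)
        h = p / math.cos(lat) - n
        lat = math.atan2(z, p * (1 - e2 * n / (n + h)))
    return math.degrees(lat), math.degrees(lon), h

def shift_datum(lat, lon, h, dx, dy, dz):
    """Toy 3-parameter shift: translate in ECEF space (dx, dy, dz in meters)."""
    x, y, z = geodetic_to_ecef(lat, lon, h)
    return ecef_to_geodetic(x + dx, y + dy, z + dz)
```

Getting the edge cases, accuracy claims, and the zoo of published transformation parameters right is exactly the tedious part that made the $35 DLL a bargain.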

Using SMO to Create a Spatial Index

With the advent of ArcGIS 10.1, you can now direct-connect in a read-only fashion to spatial data on SQL Server. That has worked well: while the read-only aspect means no editing, it also means no locks on the data – locks being a real problem with File Geodatabases. I’ve found that I can programmatically refresh or rebuild a table in SQL Server and, just by panning or zooming, have ArcMap redraw it. I’d never be able to leave an FGDB open in ArcMap while trying to do the same.

However, I ran into one problem with really large datasets brought into ArcMap: they couldn’t be exported to FGDB or shapefile format without a spatial index on the table in SQL Server. Understandable – and really, performance is better if a spatial index exists. Since it’s an application that creates and populates the tables, it fell to the application to also create the spatial index.

I could, of course, have used a parameterized SQL command to do this, but since I’m already using SMO to create a tabular index, I thought I’d try it for a spatial geometry column. As I soon realized, there’s a catch…one that almost makes it just as well to use parameterized SQL. But if you like being able to specify things like:

 SpatialIndexType = SpatialIndexType.GeometryAutoGrid 

as needed, then there is some case to be made for mixing SMO with SQL…and mix you have to. Here’s the catch: you can’t create a spatial index unless you already know the bounding box (or envelope) coordinates, and the only way I know to get those is via a SQL query to the server first. So here’s a code sample:


var serverConn = new ServerConnection("serverName") { ConnectTimeout = 180 };

// provide appropriate login credentials here

var srv = new Server(serverConn);

const string tableName = "tableName";

Database db = srv.Databases["databaseName"];

try
{
    if (db != null)
    {
        var tb = db.Tables[tableName];

        // Perform a spatial query to get the bounding box first
        var sql = String.Format(@"SELECT
    geometry::EnvelopeAggregate(GEOM).STPointN(1).STX AS MinX,
    geometry::EnvelopeAggregate(GEOM).STPointN(1).STY AS MinY,
    geometry::EnvelopeAggregate(GEOM).STPointN(3).STX AS MaxX,
    geometry::EnvelopeAggregate(GEOM).STPointN(3).STY AS MaxY
FROM {0}", tableName);

        var dataSet = db.ExecuteWithResults(sql);

        if ((dataSet != null) && (dataSet.Tables.Count > 0) && (dataSet.Tables[0].Rows.Count > 0))
        {
            var boundingBoxXMin = (Double)dataSet.Tables[0].Rows[0]["MinX"];
            var boundingBoxYMin = (Double)dataSet.Tables[0].Rows[0]["MinY"];
            var boundingBoxXMax = (Double)dataSet.Tables[0].Rows[0]["MaxX"];
            var boundingBoxYMax = (Double)dataSet.Tables[0].Rows[0]["MaxY"];

            // Spatial index
            var spatialIndex = new Index(tb, "Spatial" + tableName)
            {
                SpatialIndexType = SpatialIndexType.GeometryAutoGrid,
                BoundingBoxXMax = boundingBoxXMax,
                BoundingBoxXMin = boundingBoxXMin,
                BoundingBoxYMax = boundingBoxYMax,
                BoundingBoxYMin = boundingBoxYMin,
                CellsPerObject = 16,
                PadIndex = false,
                CompactLargeObjects = false,
                DisallowPageLocks = false,
                SortInTempdb = false,
                OnlineIndexOperation = false,
                DisallowRowLocks = false
            };

            spatialIndex.IndexedColumns.Add(new IndexedColumn(spatialIndex, "GEOM"));

            spatialIndex.Create();

            return true;
        }
    }
}
catch (Exception err)
{
    Debug.WriteLine(err.Message);
}

return false;

Tripped up by relying on too little data

I was basically copying algorithms over from one project to another, but found I wasn’t getting results. The problem was that a number of significant changes had been made to the data, so I was looking at some daunting debugging. But then it turned out to be a case of fixing one line of code:


//var coordPattern = new Regex(@"[0-9]+.[0-9]+,[0-9]+.[0-9]+");

var coordPattern = new Regex(@"[-0-9.]+,[-0-9.]+");

The old regex illustrates the pitfalls of tailoring code to one particular dataset – in this case, spatial coordinates in a UTM projection. Then I switched to Web Mercator. In that projection, the coordinates can include negative values, such as -10539358.2537, 3394430.3346999986 (space added for ease of reading). I also found that some values don’t include any decimal places. So the new pattern is looser, but still specific enough given what it will be fed.
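A quick way to see the difference (sketched in Python here, though the app’s code is C#; note too that the dots in the old pattern were unescaped, so they matched any character – I’ve escaped them below):

```python
import re

# Old pattern: assumes positive values with decimal places (fine for UTM)
old_pattern = re.compile(r"[0-9]+\.[0-9]+,[0-9]+\.[0-9]+")
# New, looser pattern: tolerates negative values and missing decimals
new_pattern = re.compile(r"[-0-9.]+,[-0-9.]+")

utm = "435212.5,5158230.1"
web_mercator = "-10539358.2537,3394430.3346999986"
no_decimals = "-10539358,3394430"

print(bool(old_pattern.fullmatch(utm)))           # True
print(bool(old_pattern.fullmatch(web_mercator)))  # False -- the minus sign breaks it
print(bool(new_pattern.fullmatch(web_mercator)))  # True
print(bool(new_pattern.fullmatch(no_decimals)))   # True
```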

What’s in a map projection?

If you take a look at the map below of the contiguous U.S. you might ask, “I see that Maine has the easternmost point in the lower 48, but shouldn’t it also have the northernmost point?”

Projections can sometimes be misleading. Well, they’re all misleading in some aspect or another. Take, for example, the Mercator projection, which is used in mapping services like those provided by Bing or Google. (There are sound technical reasons for this BTW). It greatly distorts areas the farther north or south you go. However, in this case, it does make it easy to see that the northernmost point of Maine is south of quite a bit of the U.S.-Canadian border:

So, how far north does Maine go? Approximately 47.46°, which puts it south of the large stretch of border that runs along the 49th parallel. Incidentally, Sumas, WA – because of a surveying error – extends just slightly past the 49th parallel (49.002389, -122.261111).
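These claims are easy to check numerically. Here’s a small sketch of the spherical Web Mercator math (function names are mine):

```python
import math

EARTH_RADIUS = 6378137.0  # meters (spherical Web Mercator)

def mercator_y(lat_deg):
    """Projected northing for a latitude in degrees."""
    phi = math.radians(lat_deg)
    return EARTH_RADIUS * math.log(math.tan(math.pi / 4 + phi / 2))

def mercator_scale(lat_deg):
    """Local linear scale factor: 1.0 at the equator, inflating toward the poles."""
    return 1 / math.cos(math.radians(lat_deg))
```

mercator_scale(60) comes out to 2.0 – features at 60°N are drawn twice as large linearly (four times by area) as the same features at the equator – and mercator_y(47.46) sits well below mercator_y(49.0), which is why Maine’s northern tip so obviously falls south of the 49th-parallel border on a Mercator map.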