We are thrilled to welcome Nadia Barbosa to Mapbox! Nadia is joining us on the Support team, answering troubleshooting questions and working on our Help documentation to give our users the best possible experience using Mapbox.
Nadia joins us from an environmental nonprofit in Washington, D.C., where she provided technical support and mapping expertise to teams across the organization. She is an alumna of the University of Maryland and American University.
We’re delighted to welcome Erin Quinn, who is joining the Mapbox business team in DC! They’ll be diving in to work with enterprise customers and help bring new companies to the Mapbox platform.
Prior to joining Mapbox, Erin studied history while rowing for the University of Dayton, worked as a technical recruiter, and ran a social enterprise rafting & climbing company for a non-profit in Colorado.
The geographic footprint of Boston has changed drastically over the past several hundred years. Luckily, there are many carefully drawn historical maps that have preserved the various stages of the changing landscape. In this map, I brought those historical coastlines back to life by overlaying them onto an interactive map of present day Boston.
To make this map, I first extracted the coastline geometry from the map images. I used maps from 1788 and 1898 from the Harvard Geospatial Library that are georeferenced and in GeoTIFF format. After uploading them as tilesets in Mapbox Studio, I used the maps as reference tilesets in the dataset editor and traced the coastlines by hand.
To highlight the changing coastlines, I did a figure-ground flip to the traced land and island polygons using Turf.js, so the map shows historical waters flowing over streets and places of today.
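For the simple case, the flip itself can be sketched in plain JavaScript: punch each traced land polygon out of a bounding "water" polygon as an interior ring. This sketch assumes simple, non-overlapping polygons that sit fully inside the bounds; the actual map used Turf.js, whose difference() handles the general case.

```javascript
// Figure-ground flip: turn traced land polygons into holes in a water mask.
// Assumes the land polygons are simple, non-overlapping, and fully inside
// the bounds; Turf.js's difference() handles the general case.
function figureGroundFlip(bounds, landPolygons) {
  // bounds: [west, south, east, north]
  const [w, s, e, n] = bounds;
  const outerRing = [[w, s], [e, s], [e, n], [w, n], [w, s]];
  // Each land polygon's outer ring becomes a hole in the water polygon.
  const holes = landPolygons.map((poly) => poly.geometry.coordinates[0]);
  return {
    type: 'Feature',
    properties: { era: 'historic-water' },
    geometry: { type: 'Polygon', coordinates: [outerRing, ...holes] },
  };
}
```

The resulting polygon-with-holes renders as water everywhere except over the historical land, which is exactly the figure-ground reversal described above.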
Use the dataset in a map
Next, I converted my historical coastlines dataset into a tileset using the dataset editor and added them to a custom style in Mapbox Studio. I then added the map legend and slideshow interactivity using Mapbox GL JS.
As a final touch, I chose to highlight two streets in the second slide to tell a stronger story. To do this I went back to the dataset editor to trace the streets, but this time simply copied the GeoJSON object from the editor and pasted it into my JavaScript code.
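For reference, wiring that pasted GeoJSON into the map takes a source plus a line layer. This is an illustrative sketch, not the map's actual code; the ids and paint values here are placeholders.

```javascript
// Hypothetical sketch: assemble a Mapbox GL JS line-layer definition for
// the pasted street GeoJSON. The ids and paint values are illustrative.
function buildStreetLayer(id, sourceId) {
  return {
    id: id,
    type: 'line',
    source: sourceId,
    paint: { 'line-color': '#e55e5e', 'line-width': 3 },
  };
}

// In the app, once the map has loaded (map is a mapboxgl.Map and
// pastedStreets is the GeoJSON copied from the dataset editor):
// map.addSource('highlight-streets', { type: 'geojson', data: pastedStreets });
// map.addLayer(buildStreetLayer('highlight-streets-line', 'highlight-streets'));
```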
Using PaintCode and the Mapbox iOS SDK, we can create a workflow for custom markers and interface elements in an app without paying a price in performance or busywork.
Building visually rich, geographically-centered applications leads naturally to a focus on small details. Why go through the effort of designing custom UI elements and a strong brand, but then use default map markers and styles?
Recently I’ve been working with Maya on a mobile application with the newly-launched Mapbox Studio dataset editor. Designing this kind of application often starts with the UI being statically designed in an application like Sketch, Inkscape, or Illustrator. In this case, Maya used Sketch to define the look of the application.
Sketch natively supports the SVG format for vector graphics. SVG has quickly replaced raster PNG files for web app design elements due to its efficiency with flat graphics and effortless scaling for high-DPI displays. Unfortunately, it isn’t as easy to use an SVG file in an iOS application as it is on the web, and you can’t control aspects of SVG graphics the same way you can with CSS.
PaintCode then allows you to export these graphics as StyleKits: generated Objective-C or Swift code that, instead of describing shapes using SVG’s declarative paths, draws graphics dynamically using UIBezierPath and other native functionality. In this case, I’ve exported the graphics as GeoSheetStyleKit, and this marker is exposed as the method .imageOfMarker(markerColor:), which receives a fill color and returns a UIImage instance.
func mapView(mapView: MGLMapView, imageForAnnotation annotation: MGLAnnotation) -> MGLAnnotationImage? {
    guard let annotation = annotation as? CustomMarker else { return nil }

    let blueColor = UIColor(red: 0.0706, green: 0.6039, blue: 0.9294, alpha: 1.0)
    let yellowColor = UIColor(red: 1.0000, green: 0.8235, blue: 0.0275, alpha: 1.0)
    let greenColor = UIColor(red: 0.4000, green: 0.7255, blue: 0.4118, alpha: 1.0)
    let redColor = UIColor(red: 1.0000, green: 0.4157, blue: 0.0000, alpha: 1.0)

    let color: UIColor
    let reuseid: String
    switch annotation.feature!.properties.state ?? "" {
    case "unsafe":
        color = redColor
        reuseid = "marker-red"
    case "damaged":
        color = yellowColor
        reuseid = "marker-yellow"
    case "inspected":
        color = greenColor
        reuseid = "marker-green"
    default:
        color = blueColor
        reuseid = "marker-blue"
    }

    var annotationImage = mapView.dequeueReusableAnnotationImageWithIdentifier(reuseid)

    // If this image was cached, then we can reuse it instead of drawing
    // it from scratch.
    if annotationImage == nil {
        // Here's where we use the StyleKit to draw a new image based
        // on our picked color.
        var image = GeoSheetStyleKit.imageOfMarker(markerColor: color)
        image = image.imageWithAlignmentRectInsets(UIEdgeInsets(top: 0, left: 0, bottom: image.size.height / 2, right: 0))
        annotationImage = MGLAnnotationImage(image: image, reuseIdentifier: reuseid)
    }

    return annotationImage
}
This uses the custom marker image example as a starting point and shows off some of the smart default functionality of markers on iOS. For instance, we save marker instances with a reuseIdentifier so that they’re cached rather than constantly redrawn.
Note that this uses one marker resource and dynamically draws it with different colors. It’s more flexible than tinting a raster image, which would tint the entire image, including the gray shadow in this one. Using this method, we can color individual parts of the image and control other style properties as well. This also handles different pixel densities without requiring multiple versions of an image file.
If you’re feeling brave, you can also write drawing code from scratch to do something similar with iOS’s native Core Graphics functionality. The Mapbox iOS SDK’s API is flexible enough to support many new and fun possibilities. Enjoy!
Our friends Moabi are launching Map for Environment, working with OpenStreetMap to monitor the world’s natural resources, and addressing the toughest problems facing our planet.
From space, the earth’s great forests and rivers still seem intact and flowing, providing food and livelihoods to hundreds of millions of people and home to thousands of species of animals. Zoom in, and the threats to these great natural phenomena – logging roads, industrial agriculture, dams – become apparent. Transparency is key to monitoring and protecting these places, yet many of them and the activities threatening them remain unmapped.
Map for Environment follows the model of the Humanitarian OpenStreetMap Team and Missing Maps, which have pioneered open mapping to support humanitarian response. Map for Environment will apply tools like the OpenStreetMap Tasking Manager to environmental topics and build partnerships with conservation organizations to map the slow-moving crisis unfolding in the world’s critical ecosystems.
InciWeb is an interagency government website that publishes real-time information about wildfires. It hosts an RSS feed that is updated multiple times a day with reports from agencies managing active fires.
I wrote a script that grabs the latest RSS data from InciWeb and creates a GeoJSON point for each item. The point ties a wildfire’s geographic coordinates to a list of snippets from recent reports about the fire (provided by incident-specific RSS feeds).
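The transform step boils down to mapping each parsed feed item to a GeoJSON feature. This is a hedged sketch of that shape; the field names (id, title, lat, lng, updates) are assumptions about the parsed item, not InciWeb's actual schema.

```javascript
// Sketch: turn one parsed InciWeb RSS item into a GeoJSON point feature.
// Field names on `item` are assumptions about the parsed feed, not
// InciWeb's actual schema.
function incidentToFeature(item) {
  return {
    type: 'Feature',
    id: item.id,
    properties: {
      name: item.title,
      updates: item.updates, // snippets from the incident-specific feed
    },
    geometry: {
      type: 'Point',
      // GeoJSON coordinates are ordered [longitude, latitude]
      coordinates: [item.lng, item.lat],
    },
  };
}
```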
I was also intrigued by the perimeter shapes that I saw on InciWeb detail pages. I found this data on a REST API provided by the Geospatial Multi-Agency Coordination. Supplementing the points I derived from InciWeb’s feed with fire perimeters from GeoMAC made for a more compelling and informative visualization.
Once the data was retrieved, transformed to GeoJSON, and cross-referenced, I used the Mapbox Datasets API to upload it all into a collection of datasets.
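The Datasets API upserts one feature per request with a PUT to its features endpoint. A minimal sketch of that step; the username, dataset id, and access token are placeholders, and fetch() assumes Node 18+ or a browser.

```javascript
// Sketch of upserting features with the Mapbox Datasets API: each feature
// is PUT to .../features/{id}. Username, dataset id, and token are
// placeholders.
function featureUrl(username, datasetId, featureId, token) {
  return `https://api.mapbox.com/datasets/v1/${username}/${datasetId}` +
         `/features/${featureId}?access_token=${token}`;
}

async function upsertFeature(username, datasetId, feature, token) {
  const res = await fetch(featureUrl(username, datasetId, feature.id, token), {
    method: 'PUT',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(feature),
  });
  if (!res.ok) throw new Error(`Upload failed: ${res.status}`);
  return res.json();
}
```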
Throughout the process of writing and testing my scripts, I used the new Mapbox Studio dataset editor to review and edit updates. The ability to see and search my uploaded data in a map-based editor proved vital while I checked that my code was working as intended.
When I was confident that the scripts were functioning correctly, I scheduled them to run regularly via AWS Lambda.
Adding style and interactions to showcase the data
Mantle, the game engine plugin for designing 3D maps in Unity, just leaked screenshots of its new 3D terrain and maps. Developers are no longer constrained to a limited geographic area; they can now import the entire world into their game.
The environments generated with Mantle are based on real-time map data from every city and place in the world. From 3D terrain and buildings, to land use like parks and beaches, to streets and bridges, every detail in a city can now be loaded into Unity and styled there using Mantle.
The new plugin uses soon-to-be-released Mapbox tilesets designed specifically for gaming. These new tilesets contain compressed elevation data and building heights. Just like the vector tiles that power Mapbox Streets, they are super light for fast loading and updated minutely. As a city changes, levels can be updated from fresh data, letting developers model their game environments on the latest maps.
If you want early access to the new plugin, email Sean Heffernan at heffernan@mapbox.com.
Traffic fatalities in the U.S. jumped 7.2% to 35,092 last year, the sharpest increase in about 50 years. Yesterday the NHTSA’s Fatality Analysis Reporting System (FARS) released final 2015 data on vehicle crashes in the United States. The Department of Transportation (DOT), National Highway Traffic Safety Administration (NHTSA), and the White House also issued an unprecedented call to action asking for help analyzing fatality data to find ways to prevent these tragedies:
Despite decades of safety improvements, far too many people are killed on our nation’s roads every year. Solving this problem will take teamwork, so we’re issuing a call to action and asking researchers, safety experts, data scientists, and the public to analyze the fatality data and help find ways to prevent these tragedies. — U.S. Transportation Secretary Anthony Foxx
Responding to this call, we created the map below to show where fatal crashes have occurred over the last five years. Enter your address and your commute destination to see fatal crashes that occurred along your route between 2011 and 2015. Toggle additional layers, such as crashes involving alcohol, excessive speeding, cyclists, or pedestrians, and compare the differences between years.
Our Directions API and GL plugin show the outline of your route.
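Fetching that route is a single Directions API call. A rough sketch of building the request URL; the profile, coordinates, and token are placeholders.

```javascript
// Sketch: build a Mapbox Directions API request URL for a route between
// waypoints. coords is an array of [longitude, latitude] pairs; profile
// and token are placeholders.
function directionsUrl(profile, coords, token) {
  const path = coords.map(([lng, lat]) => `${lng},${lat}`).join(';');
  return `https://api.mapbox.com/directions/v5/mapbox/${profile}/${path}` +
         `?geometries=geojson&access_token=${token}`;
}
```

The response's route geometry comes back as GeoJSON (because of `geometries=geojson`), ready to draw as a line layer on the map.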
Expectations on data quality
A dataset of this size can be highly valuable for traffic safety campaigns, but it is only as good as its initial source. As with many aggregated government open datasets, the data collection and reporting practices of local police departments are key to maintaining a high-quality dataset. Huge jumps in the numbers should be treated with caution.
Government agencies generate a lot of data. By opening up their datasets they enable a wide range of stakeholders, from nonprofits to companies like Mapbox, to develop innovative applications and services for the public.
The new Mantle plugin for Unity, which adds real world terrain and streets to games, allows full control over lighting. Being able to simulate real world lighting conditions is key for creating immersive environments. I’ve been matching sun angles, getting highly realistic shadows to go with the accurate building footprints within the Unity game engine.
Using a directional light, I was able to match the lighting in this scene to what we see in our satellite layer. On the top you can see the morning sun and shade on Fort Mason, San Francisco, as the DigitalGlobe satellite passed 2 million feet overhead. And on the bottom is Mapbox-powered terrain and streets data imported into Unity via the Mantle plugin:
These buildings are imported into Unity as real geometries and integrate into my project as any other asset does. This allows me to use Unity’s lighting engine to dynamically recreate lighting to match any time of day.
Check out yesterday’s blog post on 3D fantasy environments in Unity which shows a sneak peek of the new 3D terrain and maps that Mantle is adding to Unity.
If you want early access to the new plugin, email Sean Heffernan at heffernan@mapbox.com.
We just rolled out improved support for Chinese, Japanese, and Korean (CJK) labels in Mapbox GL JS and GL Native. With our GL improvements that accommodate a larger glyph atlas and character range, non-Latin characters and place names are now rendered fully and crisply, powered by the speed of our vector tiles.
Our improvements also allow the same design control over text label widths and line breaking that we offer for Latin character sets, using line breaking and character rotation rules that cater to CJK text, as Chinese text does not use spaces for word wrapping and each character tends to have a word-like meaning. You can learn more about the intricacies of rendering glyphs on GL from Konstantin’s blog post about drawing text.
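To make the difference in line-breaking rules concrete, here is a toy illustration (not the actual GL implementation): Latin text may break only at spaces, while a break is allowed after almost any CJK character.

```javascript
// Toy illustration of CJK vs. Latin line breaking (not the actual GL
// implementation): return the indices where a line break is allowed.
// Latin text breaks only at spaces; a break is allowed after almost any
// CJK character.
function breakOpportunities(text) {
  // CJK Unified Ideographs, Hiragana/Katakana, and Hangul syllables
  const cjk = /[\u4e00-\u9fff\u3040-\u30ff\uac00-\ud7af]/;
  const ops = [];
  for (let i = 1; i < text.length; i++) {
    if (cjk.test(text[i - 1]) || text[i] === ' ') ops.push(i);
  }
  return ops;
}
```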
Check out the map below for the top 8 Chinese restaurants in San Francisco, showcasing our improved CJK support. Whether you’re an American tourist in Beijing, or a Chinese tourist in San Francisco, knowing where you are on the map is important, and GL JS labels are here to help. And whether you’re a tourist or a long-time resident of San Francisco, knowing the best Chinese restaurants in the Bay Area is a necessity.
We’re working on further support for multiple languages in GL JS and GL Native, including right-to-left (bidirectional) labels, more precise vertical labels, and text that requires connections between glyphs. Both GL JS and GL Native are open source, so we invite you to join the fun and contribute!
Last week I was honored to attend the third annual White House LGBT Tech & Innovation Briefing. Over 200 LGBTQ tech professionals convened at the White House to discuss how the tech community can tackle issues affecting LGBTQ people. The event was intentionally inclusive; Leanne Pittsford, CEO and Founder of Lesbians Who Tech and Tech Up, invited 50% women, 50% people of color, 20% transgender and gender nonconforming people - all of whom were also diverse in terms of geography, skill sets, and company/organization.
Jamesha Fisher, Security Operations Engineer at GitHub, and me hanging out before U.S. CTO Megan Smith’s keynote.
The power of an intentionally inclusive and diverse group was inspiring. At the Briefing, we broke out into sessions dedicated to issues such as criminal justice reform, health & mental health, and the environment. I participated in the session on tech hiring and inclusion since I am committed to making sure the tech industry not only hires a diverse set of people, but also actively includes everyone after hiring.
The best part is that this White House Briefing was only the first step in tackling these important issues. At the Briefing, we started planning the Tech Up Inclusion + Innovation Week Summit this November 14-20th, in Mapbox’s own backyard of Washington, D.C. I’m excited to welcome the Tech Up community to my hometown and connect everyone to the local DCTech community, including companies, startups, and tech meetups.
Interested in attending or planning this November’s Tech Up Summit? Follow @LesbianTech and @wetechup or find me on Twitter at @AlexUlsh to learn how to contribute!
We recently released our iOS SDK 3.3 with the powerful new functionality of the MGLAnnotationView class, letting developers bring in anything they can create—or have created already—in native iOS views as markers on the map. It’s as easy as assigning a geographic coordinate; the SDK handles the rest.
This feature allows for creative visualizations. Here, we vary each point’s size, hue, and opacity based on how close it is to the map center. There’s also a mode toggle that (with a little assist from OLImageView) lets us use animated GIFs as markers!
You could do similar visualizations using movie files, custom animations, or even a live feed of the device camera.
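The center-distance styling comes down to a small bit of math. A hedged sketch of the computation's shape (the demo's actual curves and constants differ; these values are illustrative):

```javascript
// Sketch of distance-based annotation styling: scale and fade a marker
// based on its normalized distance from the map center (0 = center,
// 1 = edge). The 0.5 and 0.75 falloff factors are illustrative, not the
// demo's actual values.
function annotationStyle(distance) {
  const t = Math.min(Math.max(distance, 0), 1); // clamp to [0, 1]
  return {
    scale: 1.0 - 0.5 * t,   // full size at center, half size at the edge
    alpha: 1.0 - 0.75 * t,  // fade toward the edges
  };
}
```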
Check out the code to the above demo, as well as our annotation view example. Then, give the SDK a test drive for your next game, social app, or data visualization.
While you’re looking, check out our new 3.4.0-alpha.3 prerelease, which improves on performance and squashes a few bugs, among other improvements.
Ushahidi has recently switched to Mapbox Streets and Satellite! Ushahidi is an influential crowdsourcing platform that allows users to gather data from a variety of devices with custom surveys and crowdsourcing tools. Ushahidi relies on quality base maps to accurately geocode reports. OpenStreetMap fills an important role by allowing Ushahidi users to contribute and edit the map data that matters most for their needs. There’s clearly a larger opportunity to link the Ushahidi and OpenStreetMap communities more closely together.
Peka Kota is one amazing example. Supported by Rockefeller Foundation’s 100 Resilient Cities program, Ushahidi worked with a local artist collective called Hysteria to start Peka Kota. The project works with local communities in Semarang, Indonesia to map neighborhoods, community assets, and report on issues that residents care about most. Peka Kota started out with community mapping to create the base map in OpenStreetMap. Community issues can then be reported using the Ushahidi platform.
Peka Kota benefited greatly from a partnership with HOT (Humanitarian OpenStreetMap Team) in Indonesia and the GroundTruth Initiative, who helped develop activities that used community mapping as a way to convene different stakeholders to co-create important geographic data. The project is helping citizens and government transform their city from the ground up.
Members of Kolectif Hysteria work with the local government to identify areas for mapping. (Picture Credit: Kolectif Hysteria)
We’re excited to see Ushahidi work more closely with the OpenStreetMap community and look forward to supporting their new Mapbox maps. You can find the update to Mapbox maps in Ushahidi v3.4.7.
Interested in learning more? Visit Ushahidi.com or contact me directly for more info on OpenStreetMap.
This week, Sam, Molly, Angelina, and I represented Mapbox at the University of Wisconsin’s Cartography Lab Education Series Workshop in Madison. The series provides a regular forum for students, educators, researchers, and professionals in the community to connect, as well as share knowledge about working with mapping and GIS tools.
The topics focused on open source mapping technologies and how to use them to build different types of mapping applications for web and mobile. We also dove into the technical differences between raster and vector tiles, and discussed why the industry is moving towards vector. Finally, we were joined by UW’s own Carl Sack and John Czaplewski, who presented on using Turf.js and PostGIS in mapping applications.
Molly explaining how to make vector maps with Mapbox GL JS.
We’re thrilled to have been invited to be a part of the workshop and love connecting with students early in their careers. It was great seeing how undergrads and grads use our tools, and gave us a chance to learn about some neat mapping projects and techniques. The Cartography Lab and UW Geography Department continue to do an excellent job preparing students for a future of mapping and we are excited to support them along the way.
I just arrived in Nanjing, China to meet with developers at this year’s NingJS conference! If you’re in town for the event, join us on Sunday at 15:40 for my Build a Better App with Mapbox presentation, and learn how to incorporate maps into your mobile and web apps using Mapbox GL JS.
You can also find me and my colleagues Karen and Xinnong all weekend at the Mapbox booth. Stop by to talk about the current job openings for our growing team in Shanghai, play with some demos, and grab some stickers!
We’re in Beijing this week for the AWS Summit Beijing! On Thursday, I’ll give a keynote about how we serve fast maps in China, where the Mapbox API has gotten eight times faster since launching in the AWS Beijing region. I’ll also dive into the ways we’re using AWS to better serve Chinese users traveling abroad, and how this all ties into our global infrastructure.
With the help of Mapbox Geocoding, riders in Berlin can now find and reserve the scooter closest to them. The Mapbox Directions API then guides COUP users directly to the scooter, which they can access using their smartphone.
Anonymized location data from our mobile SDKs helps us continuously improve our maps. We use this data to discover missing streets, add turn restrictions, improve directions, and now to help our satellite team automatically correct misaligned imagery.
When overhead imagery is collected by a satellite or plane, only the sensor location and pointing orientation are known. Precise registration typically requires ground control points (GCPs): real-world, accurately measured, often purpose-built features that can be clearly identified in imagery. The known location of a GCP on the ground anchors the feature we see in the overhead image. But the quality and availability of GCPs varies widely.
Our telemetry data includes billions of anonymized longitude, latitude, and elevation measurements from travel along roads, sidewalks, and trails. Each individual ping is less accurate than a GCP, but in aggregate, the data can be used to align imagery to within a few meters or less.
How it works
Cross-correlation compares every possible planar alignment of two images to find the best-fit overlap. This example shows a Mapbox Satellite tile randomly offset, with the offset correctly identified using a convolutional technique:
This method works well on images of the same rotation and scale and is adaptable to variations in color, brightness, noise, and other characteristics.
To align imagery with telemetry, we create a density raster where each pixel’s value represents the amount of probe data in that pixel. We then identify edges where pixel values change sharply, and convolve this raster with edges derived from imagery.
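The core of the search can be sketched as a brute-force cross-correlation over small binary edge rasters: slide one raster over the other and keep the offset with the highest overlap score. This toy version illustrates the idea only; a production pipeline would use FFT-based convolution for speed.

```javascript
// Toy sketch of finding the best-fit offset between two small binary edge
// rasters by brute-force cross-correlation. Real pipelines use FFT-based
// convolution, but the idea is the same.
function bestOffset(reference, moved, maxShift) {
  const h = reference.length;
  const w = reference[0].length;
  let best = { dx: 0, dy: 0, score: -Infinity };
  for (let dy = -maxShift; dy <= maxShift; dy++) {
    for (let dx = -maxShift; dx <= maxShift; dx++) {
      let score = 0;
      for (let y = 0; y < h; y++) {
        for (let x = 0; x < w; x++) {
          const sy = y + dy;
          const sx = x + dx;
          // Count overlapping edge pixels at this trial offset.
          if (sy >= 0 && sy < h && sx >= 0 && sx < w) {
            score += reference[y][x] * moved[sy][sx];
          }
        }
      }
      if (score > best.score) best = { dx, dy, score };
    }
  }
  return best; // the shift that best aligns `moved` back onto `reference`
}
```

Given a telemetry edge raster as the reference and an imagery edge raster as the moved input, the returned (dx, dy) is the misalignment to correct.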
Because imagery and telemetry data are tiled, we can quickly identify alignment problems across entire cities or countries. In this example, pink shows areas where our testing best predicted randomized offsets:
Stay tuned for updates as we refine this technique. If you collect satellite or aerial imagery and want to talk about using telemetry data for alignment, drop me a line at damon@mapbox.com.
Mapbox Studio provides a powerful workflow for uploading and visualizing custom datasets. But what if the data you need is so recent or niche that a dataset doesn’t exist yet? Or what if the data does exist, but in a format that isn’t easily machine-readable, such as imagery? Using the Mapbox Studio dataset editor’s drawing tools, you can create the data yourself. In the case of suburban settlements in Ulaanbaatar, the dataset editor makes it possible to import drone imagery and then extract housing data from it.
The Ger district of Ulaanbaatar (Photo credit: The Guardian)
Mongolia has faced governmental, infrastructure, and environmental challenges as it grows into an industrialized nation. Even in the capital of Ulaanbaatar, ad hoc settlements comprise much of the city, sometimes without water, power, or basic services. Local accountability is spotty, and urban data is hard to come by.
Drone imagery of suburban Ulaanbaatar (Source: Asia Foundation)
Bringing this drone imagery from the Asia Foundation into the new dataset editor, we can perform visual inspection and analysis otherwise impossible in this neighborhood. In the example below, the property lines of each household are traced by hand. Access to this data could help the city better understand existing settlements and plan land use and infrastructure for the future.
Extracting data in the Mapbox Studio dataset editor
Create your own dataset!
Have imagery data that needs culling, tracing, and aggregating? Try the Mapbox Studio dataset editor and share your work with us on Twitter @Mapbox!