Amazon WorkSpaces is essentially a managed virtual machine (Amazon's term is Desktop-as-a-Service). You can choose Windows or Linux, and they appear to have a free offer at the moment.
AWS (Amazon Web Services) is a great set of tools: very powerful and flexible, but sometimes a bit intimidating. Their help is very good, and they offer tutorials and step-by-step guides which are really useful.
I used WorkSpaces as an option for people on a course who couldn't install QGIS on their own machine. I've had this issue a couple of times, sometimes because they don't have admin rights, sometimes because the computer just doesn't like it.
They have an ‘easy-setup’ available in some regions, and I used Europe (Ireland); Europe (London) doesn’t have this option, unfortunately. I started with a ‘Standard with Windows 10’ bundle, which has 2 vCPU, 4 GiB RAM, an 80 GB root volume and a 50 GB user volume (disk space). I logged into this and installed QGIS (v3.12). You can then create an ‘Image’ of this, which AWS then asks you to put into a bundle. The key bit is that you can then launch this bundle as many times as you want – whether for one user or 20!
After a bit of experimenting, the Standard image (2 vCPU, 4 GiB RAM) didn’t really have enough welly for QGIS, so I upgraded it to ‘Performance’, which has 2 vCPU and 7.5 GiB RAM and worked much more effectively.
The user can download an app (Windows / OSX / Linux) to run their WorkSpace, or run it through the browser, which is what my participant did. It worked well, although juggling the Zoom window and the browser window took a bit of practice, I think.
Cost wise, the Performance option is priced by AWS at $8/month + $0.53/hour. I’ve always found with AWS that I’m not 100% sure exactly what they will charge until I get the bill at the end of the month (and this is all + VAT as well). One participant using the WorkSpace for a one-day course cost me about $20 (£16).
Any comments or questions are welcome. Good luck and let me know how you get on using AWS WorkSpaces!
I recently ran my ‘Introduction to Spatial Data & Using R as a GIS’ course for the NCRM at the University of Southampton. This was the first time I had run it since updating the material from the SP library to the new SF library. The SF (Simple Features) library is a big change in how R handles spatial data.
Back in the ‘old days’, we used a package called SP to manage spatial data in R. First released in 2005, it became a mature package that supported practically all GIS analysis. If you have worked with spatial data in R and used the syntax variable@data to refer to the attribute table of the spatial data, then you have used the SP package. The SP package worked well, but wasn’t 100% compatible with the R data frame, so when joining data (using merge() or match()) you had to be quite careful, and we usually joined the table of data to the variable@data element. For those in the know, it used S4 data types (something I discovered when I generated lots of error messages whilst trying to do some analysis!).
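To show what that careful joining looked like, here is a small self-contained sketch (the column names and figures are made up for illustration):

```r
library(sp)

# build a minimal SpatialPolygonsDataFrame so the example is self-contained
sq <- Polygons(list(Polygon(cbind(c(0, 1, 1, 0), c(0, 0, 1, 1)))), "a")
polys <- SpatialPolygons(list(sq))
lsoa <- SpatialPolygonsDataFrame(polys, data.frame(code = "a", row.names = "a"))

# a plain data frame of (hypothetical) population figures to join on
pop <- data.frame(code = "a", population = 1500)

# the attribute table lives in the @data slot
lsoa@data

# merge() can reorder or drop rows, silently breaking the link between
# attributes and geometries, so the careful idiom was match(), which
# assigns values in the spatial object's existing row order:
lsoa@data$population <- pop$population[match(lsoa@data$code, pop$code)]
```

The match() line is the pattern to remember: it never changes the row order of the spatial object, which merge() cannot always guarantee.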
The SF library is relatively new (released Oct 2016) and uses the OGC (Open Geospatial Consortium) defined standard of Simple Features (which is also an ISO standard). This is a standardised way of recording and structuring spatial data, used by nearly every piece of software that handles spatial data. Using SF also allows us to work with the tidyverse series of packages which have become very popular, driven by growth in data science. Previously, tidyverse expected spatial data to be a data frame, which the SP data formats were not, and often created some interesting error messages!
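Because an sf object really is a data frame with a geometry column, the usual tidyverse verbs work on it directly. A minimal sketch (the file and column names follow the course data used elsewhere in this post):

```r
library(sf)
library(dplyr)

# an sf object is a data frame with a geometry column attached
LSOA <- st_read("england_lsoa_2011.shp")

# ordinary dplyr verbs work without any conversion; note that the
# geometry column is "sticky" and tags along through select()
LSOA %>%
  filter(Age00to04 > 100) %>%
  select(Age00to04)
```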
The Geospatial Training Solutions ‘Introduction to R’ course is very well established, and I have delivered it 14 times to 219 students! However, it was due for a bit of a re-write, so I took the opportunity of moving from SP to SF to restructure some of the material. I also changed from using the base R plot commands to using the tmap library. As a result, it is now much easier to get a map from R. In fact, one of the participants from my recent NCRM course in Southampton said:
“It was so quick to create a map in R, I thought it would be harder.”
Participant on Introduction to Spatial Data & Using R as a GIS, 27th March 2019, University of Southampton
They were blown away by how easy it was to create a map in R. With SF and tmap, you can get a map out in two lines (anything starting with # is a comment):
LSOA <- st_read("england_lsoa_2011.shp") # read the shapefile
qtm(LSOA) # plot the map
You can also get a nice looking finished map with customised colours and classification very easily:
tm_shape(LSOA) +
  tm_polygons("Age00to04", title = "Aged 0 to 4",
              palette = "Greens", style = "jenks") +
  tm_layout(legend.title.size = 0.8)
However, unfortunately not all spatial analysis is yet supported in SF; this will come with time, as the functions develop and more features are added. In the practical I get the participants to do some point-in-polygon analysis, where they overlay crime points (from data.police.uk/data) with LSOA boundaries. I couldn’t get a working point-in-polygon analysis* using this data and the SF library, so I kept my existing SP code to do this. This was also a useful pedagogical (teaching) opportunity to explain about SF and SP, as students are likely to come across both types of code!
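The SP approach follows the over() pattern, which roughly looks like this (the file and column names here are placeholders, not the actual course code):

```r
library(sp)
library(rgdal)  # readOGR() was the usual way to read data into SP objects

# placeholder file names, standing in for the course data
crimes <- readOGR("crimes.shp")
lsoa   <- readOGR("england_lsoa_2011.shp")

# for each crime point, look up the polygon it falls inside;
# over() returns the matching polygon attributes, row per point
matches <- over(crimes, lsoa)

# count the crimes per LSOA ('code' is a hypothetical column name)
crime_counts <- table(matches$code)
```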
*I know theoretically it should be possible to do a point-in-polygon with SF (there are many posts) but I failed to get my data to work with this. I need to have more of an experiment to see if I can get it working – if you would like to have a try with my data, please do!
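For anyone who does want to experiment, the SF equivalent should in theory look something like this (untested with my data, so very much a sketch, with placeholder file names):

```r
library(sf)

crimes <- st_read("crimes.shp")             # placeholder file names
lsoa   <- st_read("england_lsoa_2011.shp")

# both layers need the same CRS before any overlay will behave
crimes <- st_transform(crimes, st_crs(lsoa))

# attach the attributes of the containing polygon to each point
joined <- st_join(crimes, lsoa, join = st_within)

# or count the points per polygon directly
counts <- lengths(st_intersects(lsoa, crimes))
```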
The next course I am running is in Glasgow on 12th – 14th June where we will cover Introduction to Spatial Data & Using R as a GIS, alongside a range of other material over 3 days. Find out more info or sign up.
I spent a great couple of days up in Liverpool, attending the North West Digital Research Methods Festival at the University of Liverpool. It was great to be back in Liverpool and catch-up with colleagues and friends from my post-doc days there in 2013-16. The city has changed quite a bit, and my old office now overlooks a major building site instead of a green park!
The conference looked at Digital Methods from a broad social science point of view, and it was great to spend some time thinking about digital methods from a different perspective. Key to all digital methods is longevity, and there were lots of discussions about how data resources are made available to scholars in the long term, including decisions to simplify a website interface so it will keep working for longer with limited support.
It also made me think about how we process data. Warren Pearce presented on social media data and was critiquing the fact that we often focus on the text content of messages, and ignore the visual elements. This is missing out on a key element of the conversation (think of any social media content you have recently looked at) and the visual elements should be included in the analysis. My initial thought was that this was a technological hangover, with text being much easier to process than visual. However, I learnt that there is also a cultural element with text based information being seen as much more valuable than pictorial information. Warren also highlighted a fascinating visualization of the front pages of the New York Times, highlighting how it had changed from just text to a mixture of text and black & white images, then to text & colour images. Warren’s recent paper on the topic is at https://www.tandfonline.com/doi/full/10.1080/1369118X.2018.1486871
There were a whole range of presentations looking at digital research and digital data from many different perspectives. These ranged from using physical objects to encourage interaction and engagement in a museum environment, to considering the best ways of increasing the accessibility of digital archives such as photo libraries of African Rock Art or historical criminal life courses. Have a look at https://twitter.com/hashtag/nwdrm for Tweets from the conference.
The second day consisted of a series of practical workshops, which included one run by me on GIS. I was pitching GIS as a great digital method and I think I may have converted some people!
I would really recommend that everyone considers attending conferences outside of your usual ‘academic sphere’ – you never know what you are going to see, what ideas might be sparked off, or what future contacts & employers you could be meeting!
Earlier this week I had a very nice couple of sunny days in London attending a training course and a conference. It’s a nice change to attend a course (rather than delivering one!) and is also a great opportunity to add to my CPD log (particularly important for my Chartered Geographer status with RGS-IBG).
Some of my transport around London!
On the Monday I attended a half-day workshop on Linked Data, organised by Dr Claire Ellul at UCL and run by Bart De Lathouwer from the Open Geospatial Consortium. I’d come across the term linked data in various different situations, but hadn’t really done much with it, and this was a great opportunity to learn about it. The key bit about linked data is that it is solely formed from triples, sets of three, in the form “subject, predicate, object” such as “The pool – is – blue” or “student – name – value”. It also is a fundamentally different way of structuring data from a “traditional” relational database and so avoids many of the limitations, but also requires a completely different way of thinking about the data. This is quite a jump from what we are used to, and I think it will take a little while for linked data to properly take off. This is a good resource (http://www.opengeospatial.org/blog/1673) for some information on how OGC are working with Linked Data.
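To make the triple idea concrete, here is a toy illustration in R (this is just the shape of the data; real linked data uses RDF and a proper triple store, not a data frame):

```r
# a toy set of triples held as a data frame, purely to show the
# subject-predicate-object structure
triples <- data.frame(
  subject   = c("ThePool", "Student1", "Student1"),
  predicate = c("is",      "name",     "studies"),
  object    = c("blue",    "Alice",    "Geography"),
  stringsAsFactors = FALSE
)

# 'querying' the toy store is then just filtering on the three columns
subset(triples, subject == "Student1" & predicate == "name")$object
# returns "Alice"
```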
Queen Elizabeth II Centre, home for ESRI Annual Conference
On Tuesday it was ESRI UK’s Annual Conference, based at the Queen Elizabeth II Centre near Parliament. It was a great conference, with a massive range of examples of how ESRI’s various different products could be used. There were some great examples of using Strava data to help Jersey understand cycle route usage across the island; using this data to identify and remove bottlenecks in their infrastructure. We also had a presentation on how City Engine was used by Disney to help them develop the city behind the film Zootropolis (2016), allowing them flexibility to create and tweak a whole city design with limited time and resources.
A good turn out for the conference!
Unsurprisingly, a significant chunk of content was on conversion from ArcMap to ArcGIS Pro, their new flagship product. There is a big focus on users having an identity and using this to access both local and remote resources in ArcGIS Pro (including, no doubt, an element of licensing). There was also a reasonably strong theme of pushing GIS out to non-GIS users and making it easy for newcomers; for example, ArcGIS Pro automatically includes a base map when you start a new project. Possibly not ground-breaking for regular users of GIS, but a big help to someone coming to GIS cold – now they have a map they can add their data to, rather than the big blank space you get when you start ArcMap.
If you would like a chat about getting more from your GIS (ESRI or other packages!), or GIS Training for small groups, please do email email@example.com or give me a call on 01209 808910.
Over three days in January, Nick ran a series of one day GIS training sessions for the ADRC-E at the University of Southampton. The courses covered a whole range of GIS skills including understanding spatial data, finding GIS data, working with QGIS & R, and spatial analysis in GeoDa & R. The course participants came from a wide variety of backgrounds including PhD students; academics; health; economics; business intelligence and national statistics.
As well as plotting data on a map, the courses also covered more advanced spatial analysis, looking at buffers, spatial overlays, spatial decision making and spatial statistics. This allowed participants to get the most from their spatial data and use it in their future work.
GIS is a fantastic tool and something that can be applied in many different settings. Nick’s up-to-date knowledge and experience provides course attendees with the know-how needed to evaluate their own data, to create maps and perform the analysis within their workplace.
Photo credit: ADRC-E
“I enjoyed the focus on practical exercises – very useful! Excellent content for intro course.” course attendee, Introduction to QGIS: Understanding and Presenting Spatial Data, 15th January 2018.
We run courses across the UK, our training page provides details of our upcoming courses. If one-to-one GIS training would be useful for you or members of staff in your organisation, please have a look at our brochure or get in touch to find out more about our tailored courses for all skill levels.
During a warm week in July, I spent three days at UCL in London running GIS courses in conjunction with Clear Mapping Co, the ADRC-E (Administrative Data Research Centre for England) and the CDRC (Consumer Data Research Centre). We ran three one day courses, developing the courses we had run at UCL in February. It was great to come back and increase the number of people who could benefit from using GIS and spatial data in their work.
We had a wide range of participants, from PhD students and researchers, to those working in Government, charities and a wide variety of other applications. We even had someone who was making the leap from working for a large commercial company to going freelance at the end of July – good luck!
Our colouring in exercise was a great success and really got the students thinking about how we choose the colours we use on a choropleth map, as well as how we select the classification boundaries for the data. We gave the students one data set, and the 20 students created 20 different maps. The lesson was to make sure you think about which colours and classifications you choose – don’t just stick with the defaults your GIS program gives you. They are not always the best!
It’s always great teaching GIS to people who haven’t used it before. There is so much potential with spatial data; for more information about the GIS courses we can offer and how GIS could be useful for you, take a look at our ISSUU or get in contact with Nick who will be able to develop a bespoke course suited to your requirements. Email Nick at firstname.lastname@example.org, or call 01326 337072.
Over a fresh, sunny three days in early January, I joined 17 academic writers at Dartington Hall, Totnes for a writing retreat. What is a writing retreat, I hear you ask? Well, for academics working at a university, one of the key ways of conveying findings from their research is by writing papers that are published in academic journals. Writing these papers (often 5000 – 8000 words long) is a very time intensive task and also often key to promotion up through the university structure.
Writing a paper can be a lonely task and is often something that gets pushed down people’s to do lists, because usually there are no specific deadlines, other than the ones you invent yourself (which are easily changed!). At the session we had participants with a range of experience, from PhD students writing their first or second paper, to experienced academics writing their thirtieth paper!
We used the opportunity to support each other by sharing ideas about writing processes, where to start and how to make the best use of the time available. We also had dedicated writing sessions (either 60 or 90 minutes) where we worked in the same room on our individual papers. This was a very new experience for me, and the “peer” pressure of everyone else writing (and not checking emails, Facebook etc.) for a specific period worked very well.
I was writing up an article on how we can ensure research is reproducible, using our recent PopChange project as an example. I hope to be presenting the research at the GISRUK conference in Manchester in March and will be submitting the paper for publication soon after!
My thanks go to Sarah Dyer and Dave Simm of the Higher Education Research Group for the Royal Geographical Society who organised the writing retreat and made sure everything ran to plan.
I’ve just been through the process of contributing to the source code of a package in R (in a very small way) so here’s a short piece on how easy it was, and why anyone can do it! I originally wrote this post in August last year, but waited to post it until the new version of maptools was released. I missed this (we are now at 0.8-39!) and have only just rediscovered this post. It’s all still relevant though!
I have been using the Maptools library extensively in my use of R as a GIS, as well as in my teaching material (hosted at https://github.com/nickbearman/intro-r-spatial-analysis). The default plot order in the legend is to have the darkest colour at the bottom of the legend, and the lightest colour at the top. This was just something I accepted, and to be honest, never really thought about before.
I recently delivered a training course on R to some staff at the ONS (Office for National Statistics, England & Wales) and they said that their best practice guidelines are to have the darkest colour at the top of the legend. They asked me how to do this, which I didn’t know!
After some fiddling about with an R script, I created a version which worked for them. I then thought it might be useful to integrate this into the Maptools library, and emailed the package author, Roger Bivand. He was very helpful, and I added the additional code to the source files for Maptools. These are now available in version 0.8-37 (or later), which has recently been released. Running install.packages("maptools") should get you the new version.
Reversing the colours is a simple matter of changing the legend code in two places, following the example from the helpfile.
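The exact helpfile code hasn't survived in this post, but the general idea is to wrap both the legend labels and the fill colours in rev(). A hedged sketch (the palette, labels and legend call here are illustrative, not the actual maptools code):

```r
# class colours from lightest to darkest, with matching labels
cols   <- c("#edf8e9", "#bae4b3", "#74c476", "#238b45")
labels <- c("0-49", "50-99", "100-149", "150+")

# the original-style legend call puts the lightest colour at the top:
#   legend("topleft", legend = labels, fill = cols)
# reversing both vectors puts the darkest colour at the top instead:
legend_labels <- rev(labels)
legend_fill   <- rev(cols)
#   legend("topleft", legend = legend_labels, fill = legend_fill)
```

The key point is that labels and fill colours must be reversed together, otherwise the legend swatches no longer match their class breaks.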
This week I have run two courses on ‘Introduction to Using R for Spatial Analysis’ which have been very successful. Both courses sold out, with 15 people attending in Liverpool and 20 in London. We had people with a wide range of GIS and R experience, ranging from no experience in either GIS or R, to significant experience in one but little in the other.
We covered the basics of using R through the RStudio interface, which I find makes R easier to understand for newbies! I certainly found it much easier to learn R using RStudio, and still use it every day for my R work (I’ve opened the native R interface maybe twice since I started using it!). We also looked at projections and coordinate systems (which were at the bottom of a GIS problem a colleague had today) and at spatial data representation, particularly how to create a representative, truthful choropleth map; I made use of a blog post about this very issue, which I recently tweeted.
We also had a number of very interesting discussions about the pros and cons of R vs other GIS software, such as ArcGIS or QGIS, as well as other languages, such as Python. Each has their own pros and cons, and in my work I regularly use a mix of these, depending on what I am trying to achieve.
I am also in the process of developing an intermediate course that will focus more on spatial analysis. If you are interested in finding out more about when either the basic or the intermediate courses will be run again, please send me a message (using the contact form on this site) and I will add you to a list to hear about future courses.
All of the material from this course is freely available, and hosted on GitHub. Head over to http://github.com/nickbearman/intro-r-spatial-analysis and you can view the material yourself and work through it at your own pace. You can even use it to contribute to new teaching material, and if you do, please also make your material available through Creative Commons so others can benefit from it as well.
On Friday 23rd January 2015, I ran a one day workshop on an Introduction to Using R for Spatial Analysis. We had 18 participants (thanks for squeezing in, everyone!) from a wide variety of backgrounds in R, from never having used R to using R relatively regularly, but not used it as a GIS. The course ran really well, and I was very happy with it, given that it was the first time I had run this course in this format. If you are interested in attending this course in the future, please send me a message (using the contact form on this site) and I will add you to a list to hear about future courses.
I’ve attached the materials I used to this blog post (see below). My material is available under the Creative Commons Attribution-ShareAlike 4.0 International License (see http://creativecommons.org/licenses/by-sa/4.0/deed.en for details), which means that the material I created for this training session is free for anyone to use, as long as you attribute it to me and make any material you derive from it available under the same license. I would also ask you to let me know when you use my material, as it’s useful for me to know how many people are using it, and what sort of courses they are using it for.