I recently received an email from Google Webmaster Tools stating that it couldn’t access some of the content I had submitted in my sitemaps. When I loaded up the sitemap that was causing the issue, I found a 404 page greeting me. I recently moved my blog from a subdirectory up to the main directory, and I figured that this was probably the reason behind the 404. For WordPress blogs I’ve always used the Google XML Sitemaps plugin to handle generating sitemaps and submitting them to the various search engines. I figured that the easiest thing to do would be to delete the current sitemaps and regenerate them, but when I went searching through the directory containing this site I couldn’t find any sitemaps anywhere. Strange, right? A quick Google search showed other people having this issue too, but no response from the author on what was going on. The sitemaps were clearly being generated, as I could access some of them, and deleting the plugin stopped all of them being accessible altogether. At this point I sort of lost faith in the Google XML Sitemaps plugin and decided to look elsewhere.
Step in Yoast SEO
The Yoast SEO sitemaps feature is a little less customisable than Google XML Sitemaps, but it hasn’t created any 404 pages for me, which is the ultimate goal here. I still can’t seem to find where my sitemaps are actually stored on my server; performing a regex search yields no results. Also, because my blog isn’t in the root directory I’ve had to edit my .htaccess file to get the plugin to actually generate the sitemaps, which is pretty lame if you ask me. If you’re having the same problem, you need to add the following before the “# BEGIN WordPress” line:
# WordPress SEO - XML Sitemap Rewrite Fix
RewriteEngine On
RewriteBase /
RewriteRule ^sitemap_index.xml$ /index.php?sitemap=1 [L]
RewriteRule ^locations.kml$ /index.php?sitemap=wpseo_local_kml [L]
RewriteRule ^geo_sitemap.xml$ /index.php?sitemap=geo [L]
RewriteRule ^([^/]+?)-sitemap([0-9]+)?.xml$ /index.php?sitemap=$1&sitemap_n=$2 [L]
RewriteRule ^([a-z]+)?-?sitemap.xsl$ /index.php?xsl=$1 [L]
# END WordPress SEO - XML Sitemap Rewrite Fix
I’ve always been one for displaying data in different ways, and today I’ve been messing around with the location data Google stores from my mobile phone.
The ability to display this data came from /u/snowstorm99 (original thread). snowstorm99 put together a handy guide on how to make this yourself; however, after running through it myself I found a few issues with it, so I thought I’d write my own guide aimed at the absolute beginner.
Creating Your Own Location Map
First of all you are going to need to install the Python language onto your computer. Version 2.7.8 appears to work well with this so grab the download from here: https://www.python.org/download/releases/2.7.8/
I would strongly suggest installing Python directly to your C: drive (C:\Python27); I originally had my installation elsewhere and it appeared to cause problems.
Secondly you will need to install PIL (Python Imaging Library). Later versions of Python already come packaged with PIL, but I’m not sure if they work with the code we are going to use later on. Feel free to try it out and comment below if you find success. When you run the installer, it should pick up where you have installed Python. I’d suggest leaving everything as default.
Now to download the map that you’re going to be plotting your data on. I’ve uploaded a copy of a UK map here: http://stuff.joshjordan.co.uk/ImageAndData.zip If you’re from outside the UK then you’ll need to find a map of the area you’d like to use. It can’t just be any old map though; you need the image data that goes with it.
Now to download your location data; this is the last download, I swear! Head on over to Google Takeout: https://www.google.com/settings/takeout. Make sure you’re signed in to your Google account and just select ‘Location History’. Once this is downloaded you’re good to go!
Create a new folder in the directory that you installed Python in, for me it was as follows: C:\Python27\Location (I created the Location folder).
Copy all the files you downloaded into that folder. Make sure to extract all of the files from the zip folder into this location as well.
Hold the shift key and right click in the Location folder, click “Open command window here”
Copy in the following: “location_history_json_converter.py LocationHistory.json LocationHistory.kml” and press enter. Depending on how much location data you have, this may take a while; for me it only took around 30 seconds. The speed will also depend on how fast your computer is, so be patient if you’re running on an old system!
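If you’re curious what the converter script is actually doing, it essentially boils down to reading the JSON and re-writing the coordinates in KML’s lon,lat format. Here’s a rough, simplified sketch in Python (the field names assume the standard Takeout export of the time; the real script handles more cases):

```python
import json

def json_to_kml(json_path, kml_path):
    """Convert a Takeout-style LocationHistory.json into a simple KML track."""
    with open(json_path) as f:
        data = json.load(f)

    # Takeout stores coordinates as integers scaled by 1e7 ("E7" fields);
    # adjust the key names if your export differs.
    points = [
        (loc["longitudeE7"] / 1e7, loc["latitudeE7"] / 1e7)
        for loc in data["locations"]
    ]

    with open(kml_path, "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<kml xmlns="http://www.opengis.net/kml/2.2"><Document><Placemark>\n')
        f.write('<LineString><coordinates>\n')
        # KML wants "longitude,latitude" pairs, one per line
        f.write("\n".join("%f,%f" % (lon, lat) for lon, lat in points))
        f.write('\n</coordinates></LineString></Placemark></Document></kml>\n')
```

That’s the whole trick: no maths, just re-arranging the same numbers into a format mapping tools understand.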
You should now see a file called LocationHistory.kml. If not, try re-running the above command.
Now type the following into the command window: “LocationHistoryPlotter.py”. You should see something like the below; this will go on for a bit.
Once the above has completed running, you should find another new file has appeared (try pressing F5 to refresh if it hasn’t). The file should be called “LatitudeData.png”. This file should contain a map with lines drawn all over it. Red lines indicate most recent data, yellow lines indicate older data.
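For anyone wondering how the plotter turns lat/lon pairs into those coloured lines, the core idea is a linear mapping from coordinates to pixels, plus a line colour that fades with age. A simplified sketch (written for modern Python 3 with Pillow rather than the 2.7 setup above; the bounding-box values are placeholders you’d take from the image data that ships with your map):

```python
from PIL import Image, ImageDraw

# Assumed lon/lat of the map image's edges - replace these with the
# values from your map's accompanying image data.
WEST, EAST, SOUTH, NORTH = -8.2, 1.8, 49.9, 59.4

def to_pixel(lon, lat, width, height):
    """Linearly map a lon/lat pair onto image pixel coordinates."""
    x = (lon - WEST) / (EAST - WEST) * width
    y = (NORTH - lat) / (NORTH - SOUTH) * height  # pixel y grows downwards
    return x, y

def plot_track(points, map_path, out_path):
    """points: list of (lon, lat), oldest first. Segments fade yellow -> red."""
    img = Image.open(map_path).convert("RGB")
    draw = ImageDraw.Draw(img)
    w, h = img.size
    n = max(len(points) - 1, 1)
    for i in range(len(points) - 1):
        lon1, lat1 = points[i]
        lon2, lat2 = points[i + 1]
        a = to_pixel(lon1, lat1, w, h)
        b = to_pixel(lon2, lat2, w, h)
        green = int(255 * (1 - i / n))  # newer segments lose green, turning red
        draw.line([a, b], fill=(255, green, 0), width=2)
    img.save(out_path)
```

This is only the shape of the technique, not the actual LocationHistoryPlotter.py code, but it explains why the red lines are your most recent movements.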
Hopefully you’ve been able to follow the guide through and have yourself a visual representation of your location data. If you’ve run into any issues along the way, feel free to leave a comment and I’ll try to help out where I can. The most common problem I find is that your computer may need to be restarted after step 2. This is because Python adds itself to your Windows Environment Path variable, which doesn’t take effect until you restart explorer.exe, and the easiest way for most people to do that is a restart.
If this is too technical for you and you just want to see a map with your location data on it now, then head on over to here: https://maps.google.co.uk/locationhistory. It only displays your last 30 days of data, and doesn’t look quite as nice. But it’s something!
I discovered Geeklist today and it’s amazing. Imagine a combination of StackOverflow, GitHub and Facebook all mashed together to make one geeky awesome site. Signing up to Geeklist allows you to view all sorts of communities based around various geeky developer things; for me, I’m mainly looking at this as a resources site and possibly an idea springboard for projects.
For instance, one of the communities I joined was the jQuery community. It’s full of people talking about their dealings with jQuery, their projects, handy code snippets and links to various jQuery resources. There is a ton of really good stuff in there and you can often find some real gems, such as this really handy jQuery Highcharts cheatsheet. Yes, it is rather specific to Highcharts, but if you know what you’re looking for then you’re going to find lots of good stuff.
I think this is a really good idea for a website; I’ve often found that the social networking sites I am a part of don’t cater for such things. If I went on Facebook and posted that Highcharts link, it would get ignored and people would wonder what on earth I was on about. Google+ sort of started this with its own communities, but I’ve often found that they are either dead or content moves so quickly that you don’t have a chance to read it all.
If you’re a developer or just like reading up on techy things, then I would strongly suggest giving Geeklist a look, and of course feel free to add me on there!
Total Number of #WengerOut tweets per day
Top Countries using the #WengerOut hashtag
Top Cities using the #WengerOut hashtag
Total Number Of #WengerOut Tweets Per Day
I felt the best way to display this data was to use a simple line chart. This allows viewers to quickly understand what the data is showing.
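The aggregation behind that chart is just bucketing each tweet’s timestamp by calendar date. Here’s a minimal sketch of the idea in Python (illustrative only, not the code behind the site; it assumes the standard created_at format the Twitter API returns):

```python
from collections import Counter
from datetime import datetime

def tweets_per_day(tweets):
    """Count tweets per calendar day from their created_at timestamps.

    Assumes the standard Twitter timestamp format,
    e.g. "Sat Mar 15 18:20:01 +0000 2014".
    """
    counts = Counter()
    for tweet in tweets:
        dt = datetime.strptime(tweet["created_at"], "%a %b %d %H:%M:%S %z %Y")
        counts[dt.date().isoformat()] += 1
    return dict(counts)
```

The resulting date-to-count mapping is exactly the shape a line chart library wants: dates along the x-axis, counts up the y-axis.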
Top Countries using the #WengerOut hashtag
Displaying this data was a little more tricky than calculating the number of tweets per day. This is due to Twitter not enforcing each tweet to have a location attached to it (and for good reason), meaning that I can only use the #WengerOut tweets that have a geo location attached. I could have used another line chart to display this data, but I felt that with a possible 196 countries tweeting, the line chart could get very confusing, so I decided to opt for a pie chart (everyone loves pie charts). Again I would have the issue of a possible 196 segments, so I have limited the pie chart to the top 5 countries. I could have done this with the line chart too, but a pie chart makes comparing data a lot easier in this scenario. You can see in the image below one limitation with ChartJS: the smallest segment is not big enough to contain the text overlay.
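Limiting to the top 5 is just a matter of counting the country of each geo-tagged tweet and keeping the biggest buckets. A quick illustrative sketch (again in Python rather than the code behind the site, with an assumed `country` field per tweet):

```python
from collections import Counter

def top_countries(tweets, n=5):
    """Return the n most common countries among geo-tagged tweets.

    Tweets with no country attached (no geo location) are skipped,
    mirroring the filtering described above.
    """
    counts = Counter(t["country"] for t in tweets if t.get("country"))
    return counts.most_common(n)
```

Each (country, count) pair then becomes one segment of the pie.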
Top Cities using the #WengerOut hashtag
Once again I could only use the data with a geo location attached to it. Unlike countries, Twitter does not seem to verify the data stored in the city/town field of the JSON response, which is why you sometimes end up with “Nigeria” in the city segment of the data. I could have created a separate database table with a list of countries in it and performed a search and compare against each result, but that would have added overhead to the computation of the queries and slowed the loading time of the webpage. Further to this, I wanted to point out that this is an issue with the current version of the Twitter API.
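For illustration, the search-and-compare approach described above is simple in code; the cost is doing a country lookup for every row at query time. A tiny sketch with a stand-in country list (a real version would hold all ~196 names):

```python
# Stand-in for the country lookup table described above.
COUNTRIES = {"Nigeria", "England", "France", "Germany", "Spain"}

def clean_cities(tweets):
    """Drop 'city' values that are actually country names (or missing)."""
    return [
        t["city"] for t in tweets
        if t.get("city") and t["city"] not in COUNTRIES
    ]
```

With the list held in memory as a set, each check is cheap; the overhead in practice comes from joining against a database table instead.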
As well as displaying the data in the above ways, I also played around with the Open Street Maps API. I wanted to see if it was possible to plot the tweets on a map using the data I obtained from Twitter. I found the Open Street Maps API a little confusing to use at first and slightly clunky; my next foray into mapping will be using Google Maps, as I’ve heard lots of good things about that. One issue I found is that the API does not like it when you use custom icons for plotting data. In the below image you can see two different icons being used; the red icon is nowhere to be found on my server. It is being taken from the Open Street Maps API and I am not entirely sure why…

Another issue is that you can’t just plot a point on the map: you have to go through the Open Street Maps API to convert a location into something the map can understand. This means that for each tweet I want to plot, I have to send its geolocation to an external server and wait for the response before I can plot it. I currently only plot the top 100 tweets in the database, but even that is incredibly slow; I wouldn’t like to think how long plotting every tweet I’ve collected would take.

Link to WengerOut: http://wengerout.joshjordan.co.uk
Today I discovered Pushbullet and thought I would share as it’s going to change the way I communicate between my phone and PC from now on.
Every time I’ve wanted to send something from my phone to my PC (or vice versa), I’ve had to either upload the file to Dropbox/Google Drive or email myself the information. No longer do I have to share information with myself in this fiddly way! Pushbullet makes getting information on and off your phone a simple task, like it should be. Using push notifications via your Google account, you can easily share links, quotes, pictures, notes and more between your PC and any Android device.
Without this sounding more and more like a sales pitch, I strongly suggest you at least check it out and see for yourself. With a nice-looking API and Tasker support, this application is pretty powerful.
I’ve been working with the Twitter Streaming API once more, not on the Twitter Stats project I was working on a few months back but a new project. This one uses the Twitter Streaming API to gather real-time information about particular hashtags, words and any other data you can think of.
There really isn’t much documentation on how to get started with the Twitter Streaming API, so I ended up spending a lot of time writing trial-and-error code. Once I managed to figure it out and gain a basic understanding of how to consume data and keep the connection open, I found that it was pretty much the same as using the REST API. Soon after, I discovered Phirehose, a really handy interface designed to help connect to the Twitter Streaming API; using this already-written interface allowed me to focus on the next part of the project rather than creating the interface myself. I think that once I have a basic version of the project finished I will revisit this, re-write the interface and tailor it to my needs.
Using the filter-track.php class that comes with Phirehose I’ve been able to consume from the API and pull all sorts of data. I was surprised how quickly the data arrives from Twitter; it appears to be pretty much instant. Now that I have the data being consumed, the next step is to store it in a database and have a front end display it in an interesting format.
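The wire format the Streaming API uses is newline-delimited JSON over one long-lived HTTP connection, with blank keep-alive lines in between; this is the part Phirehose handles for you. A sketch of the consume-and-store loop (in Python here rather than PHP, with the network connection abstracted away as an iterable of lines):

```python
import json

def consume_stream(lines, store):
    """Consume newline-delimited JSON, passing each decoded tweet to a
    store callback (e.g. a database insert).

    `lines` is any iterable of raw lines; in production this would be the
    long-lived HTTP response body. Blank keep-alive lines are skipped.
    Returns the number of tweets stored.
    """
    count = 0
    for line in lines:
        line = line.strip()
        if not line:  # the stream sends blank keep-alive newlines
            continue
        tweet = json.loads(line)
        store(tweet)
        count += 1
    return count
```

Passing a database-insert function as `store` keeps the consuming loop separate from the storage step mentioned above.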
Today I spent the majority of my evening looking up how I can improve the SEO of Tumblr. Long story short, you can’t do a huge amount without going into the HTML.
There are some themes out there which help improve your Tumblr SEO by generating relative meta tags for each post and assigning your relative author link to posts. The only issue I found with these themes was that I couldn’t find one that I liked the look of. I found this handy guide which suggested some improvements. I’ve edited the code now and shall report back in about a week to see if it has improved anything.
One thing that a lot of people suggest is submitting a sitemap to Google. This is something I’ve always done with my websites in the past, but I had to use an external website to generate it for me (the task of doing it yourself is too laborious). It turns out that Tumblr generates its own sitemap: problem solved! You can find it at yourtumblr.tumblr.com/sitemap.xml. This default sitemap should have links to others (sitemap1.xml, sitemap2.xml etc.) which will contain all your posts!
In the meantime I’ve changed theme yet again, hopefully this one will stick…once I change the colours from the default setting…
One week on:
If you can’t tell, I’m using WordPress for my blog again. Switching to Tumblr was a disastrous move for my site’s rankings on Google. Before I switched to Tumblr I had at least half the front page when you searched for ‘Josh Jordan’; now I’m not on the front page at all. So yeah, Tumblr SEO = not good!
Project 9-Volt is the game development project that I’ve been working on for the last few weeks. If you didn’t already know, I love lighting in games, and I wanted to put together some mechanics for a game that would use lighting as its core mechanic. The Project 9-Volt video in this post aims to show off some of the features I’ve implemented into a little project.
The full list of features implemented in Project 9-Volt so far is as follows:
- Dynamic Lighting
- Torch/Flashlight shaking
- Different lighting colours
- Player movement
- A* Pathfinding
- Tiled maps
At the moment this is just a bunch of ideas that are slowly starting to come together to form something bigger. I hope to be able to work on this from time to time and get something solid together.
I’d also like to say a big thank you to Escape Theory from http://escapetheory.tumblr.com/ for undertaking the stressful task of video editing with me in the room.
After three years of studying computer science at Aberystwyth University, my time in the tiny remote town has come to an end. I felt that instead of creating the usual social update on Facebook or Twitter stating how much I would miss the town, I would instead transform my thoughts into a video. Hopefully you will be able to gain a sense of my love for the town.
I really need to get back into the habit of writing updates. It’s going to help a lot when it comes to starting the write up for this project.
Most recently I have been working on the game AI. I have pathfinding implemented in a basic form and I wish to upgrade it to A* in the future. When I first tried to implement A* over Christmas it went horribly wrong; I didn’t really understand what I was doing. I feel that I have a better grasp on how it works now, so I feel it is achievable.
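For reference, A* is fairly compact once it clicks: it’s a best-first search ordered by cost-so-far plus a heuristic estimate of the remaining distance to the goal. A minimal grid version in Python (illustrative only, not the game’s code):

```python
import heapq

def astar(grid, start, goal):
    """A* on a 2D grid of 0 (walkable) / 1 (wall). Returns a list of
    (row, col) cells from start to goal, or None if unreachable."""
    def h(cell):  # Manhattan distance heuristic
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_heap = [(h(start), 0, start)]  # entries are (f, g, cell)
    came_from = {}
    g = {start: 0}
    while open_heap:
        _, cost, current = heapq.heappop(open_heap)
        if current == goal:
            path = [current]  # walk the parent links back to the start
            while current in came_from:
                current = came_from[current]
                path.append(current)
            return path[::-1]
        if cost > g.get(current, float("inf")):
            continue  # stale heap entry superseded by a cheaper route
        r, c = current
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) and grid[nr][nc] == 0:
                new_g = g[current] + 1
                if new_g < g.get((nr, nc), float("inf")):
                    g[(nr, nc)] = new_g
                    came_from[(nr, nc)] = current
                    heapq.heappush(open_heap, (new_g + h((nr, nc)), new_g, (nr, nc)))
    return None
```

The only moving parts are the priority queue and the heuristic; swapping the heuristic for zero degrades it gracefully into Dijkstra’s algorithm.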
I have also been working towards some sort of genetics for the AI. This means that the AI will effectively evolve during the time you play the game: the longer you play, the more evolved the AI will get. The code for this is very rough at the moment, but I plan to expand upon it greatly as I feel it makes for interesting and unique gameplay.
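The evolve-as-you-play idea boils down to a loop of evaluate, select, mutate. A toy sketch of that loop in Python (nothing like the actual game code, just the shape of it, with AI behaviour reduced to a vector of weights scored by a fitness function):

```python
import random

def evolve(fitness, genome_len=4, pop_size=20, generations=50, mut_rate=0.1):
    """Minimal evolutionary loop: keep the fitter half each generation and
    refill the population with mutated copies of the survivors."""
    pop = [[random.uniform(-1, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)      # evaluate & rank
        survivors = pop[: pop_size // 2]          # select the fitter half
        children = [[gene + random.gauss(0, mut_rate) for gene in parent]
                    for parent in survivors]      # mutate copies of survivors
        pop = survivors + children
    return max(pop, key=fitness)
```

In a game, `fitness` would be something like how long the AI survived against the player, and one "generation" could tick over each play session, which is how the AI gets harder the longer you play.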