I have written about the ubiquity of Python before, and I also gave a talk along those lines at this year’s FOSSGIS conference (“Python as ‘glue’ in the GIS software domain: Sun glare analysis of road traffic accidents”). There is also a video of my talk here (but beware, it’s in German).

So, this post by Tal Yarkoni got me interested. Tal’s post is titled The homogenization of scientific computing, or why Python is steadily eating other languages’ lunch. It looks as if Tal has been using a software stack similar to mine, with extensive use of Python and R, some JavaScript and some forays into other languages. In Tal’s case, however, that has changed: he found himself using more and more Python, especially with packages such as NumPy/SciPy, pandas, statsmodels, matplotlib, scikit-learn and, more recently, seaborn and bokeh. (The latter two are promising visualization libraries.)
Especially in my work for the OII, but more recently also increasingly at EBP, I have been using R quite a lot. I like it primarily for its versatility and its large number of useful packages (plyr, Hmisc, relaimpo, car, ggplot2, etc.). However, I must concede that I’m quicker with Python for many tasks, possibly just because I use Python more often. Tal’s post has inspired me to venture further in my Python coding and to look at Python for statistical modelling as well, an area where until now I have typically used R. More on that hopefully soon.
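To give a flavour of what that move might look like, here is a minimal sketch of a linear regression in Python using statsmodels’ R-style formula interface, roughly the counterpart of R’s lm(). The data are made up purely for illustration; nothing here comes from an actual project of mine:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Made-up example data: y depends linearly on x, plus some noise
np.random.seed(42)
df = pd.DataFrame({"x": np.random.normal(size=100)})
df["y"] = 2.0 * df["x"] + 1.0 + np.random.normal(scale=0.5, size=100)

# R-style formula interface: the rough equivalent of lm(y ~ x, data = df)
model = smf.ols("y ~ x", data=df).fit()

# Regression table with coefficients, standard errors, R-squared, etc.
print(model.summary())
```

The formula syntax (handled by the patsy package under the hood) makes the transition from R feel fairly gentle.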
What languages and software are you using for your data analysis and visualization needs?