To understand what a dedicated group this is, you should probably also check out the current weather in Portland, which is — to say the least — spectacular.
Events like this (in foul weather or fair) are really exciting. You’d think there would be more opportunities for journalists and technologists not only to spend time together, but also to trade knowledge and work together. But really, opportunities like these (open, learning-oriented, semi-structured events with lots of project time) are pretty rare, and I’m grateful to be involved this weekend. I’m also psyched to be working with two journalists: Evonne Benedict of King5 TV in Seattle and Rachel Alexander of Whitman College and Union-Bulletin News. Much fun!
… and apparently were confined to the part of the screen containing the ad.
We’ll probably never know if the Grist phenomenon was of this sort, but I think it might be worth developing some sort of detector for botnets of this type, given the possibility that they are affecting more sites than the small number Spider.io’s report implies are hit by Chameleon. It seems to me that botnets of this sort have both an incentive and a disincentive to include non-target sites on their hitlist. The incentive is simply that by including legitimate targets they obfuscate their scheme from advertisers to some extent (though it’s unclear whether most advertisers directly review the distribution patterns of their ads through networks). The disincentive is that targeting legitimate sites carries a risk of detection, though most sites would probably not notice if this were to start happening.
This was of great interest to me (and should really be to anyone who runs a site with significant traffic) because it’s the first public announcement of a botnet capable of running a complete client stack.
I would think that some analytics and advertising platforms like Google would be interested in understanding phenomena of this type better. I’d appreciate any links or info regarding countermeasures or detection of stuff like this.
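To make the idea of a detector a bit more concrete, here’s a minimal sketch of one heuristic you could run over your own analytics logs. Everything here is hypothetical — the field names, the five-event minimum, and the 0.1 coefficient-of-variation cutoff are illustrative guesses, not anything from Spider.io’s report. The intuition is just that a scripted client stack tends to fire events on a much more regular clock than a human does:

```python
from statistics import mean, pstdev

def looks_scripted(event_times, cv_threshold=0.1):
    """Flag a visitor whose inter-event timing is suspiciously uniform.

    event_times: timestamps (in seconds) of one visitor's page events.
    Humans produce irregular gaps between events; a simple bot loop
    often does not. The cv_threshold cutoff is an illustrative guess.
    """
    if len(event_times) < 5:
        return False  # too little data to judge either way
    gaps = [b - a for a, b in zip(event_times, event_times[1:])]
    avg = mean(gaps)
    if avg == 0:
        return True  # events arriving simultaneously: not human
    # Coefficient of variation: spread of the gaps relative to their mean.
    return pstdev(gaps) / avg < cv_threshold

# A bot-like visitor firing an event exactly every 2 seconds:
bot = [i * 2.0 for i in range(10)]
# A human-like visitor with irregular gaps:
human = [0, 1.2, 4.5, 5.1, 9.8, 10.3, 15.0, 16.7, 21.2, 22.0]
print(looks_scripted(bot), looks_scripted(human))  # → True False
```

A real detector would obviously need more signals than timing alone (mouse-trace entropy, referrer patterns, ad-viewport behavior), but even a crude per-visitor anomaly score like this would surface the kind of traffic spike we saw.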
Twitter announced new API policies for version 1.1 of its API today. The announcement was accompanied by a diagram which, IMHO, was a bit hard to understand at first and caused some useless debate and worry on Twitter. Here’s my dumbed-down version of the diagram, or at least my understanding of it. The x axis represents who your application is for: the general public or, for lack of a better word, nerds (developers, business owners, etc.). The y axis represents what your application allows those people to do: either count stuff (tweets, links, etc.) or do stuff (tweet, search, etc.).
Here’s Twitter’s (better) version of this:
This weekend some of the Grist team will be in SF for WordCamp. I’m really excited to get to give a talk about our journey to becoming a WordPress operation. This year (in fact, almost exactly this year, as WCSF11 represented a bit of an introduction to WordPress for us) has been a huge adventure — we learned lots about the WordPress API, moved our content and hosting to a new platform, adopted a new operating model, developed a theme and began to seriously grow our audience.
Grist’s former web hardware arrived at the office today.
So I needed to set up my OS X rig to access AWS, spin up and configure an Ubuntu instance, install Apache, PHP, and MongoDB, and do various other tasks. Good thing I found these two great resources:
First, here’s Robert Sosinki with a great guide on how to get set up with the EC2 command-line tools on Mac OS X. Really clear and well done.
Next, here’s a quick guide from RSM on how to turn that brand new instance into a full LAMP (that’s Linux, Apache, Mongo, PHP) stack … though really you could install whatever packages you need.
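For context, the end-to-end flow those two guides walk you through looks roughly like this. This is a sketch, not a copy of either guide: the AMI ID, key pair, and security group names are placeholders you’d substitute with your own, and the package names assume an Ubuntu release of that era.

```shell
# Launch an Ubuntu instance with the EC2 API tools
# (ami-xxxxxxxx, my-keypair, and my-group are placeholders)
ec2-run-instances ami-xxxxxxxx -t t1.micro -k my-keypair -g my-group

# SSH in once it's running (substitute the instance's public DNS name)
ssh -i ~/.ssh/my-keypair.pem ubuntu@ec2-xx-xx-xx-xx.compute-1.amazonaws.com

# On the instance: install the "LAMP-with-Mongo" stack
sudo apt-get update
sudo apt-get install -y apache2 php5 libapache2-mod-php5 mongodb
sudo service apache2 restart
```

And as the RSM guide notes, you can swap in whatever packages your stack actually needs — `mysql-server` instead of `mongodb`, say, for a traditional LAMP setup.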
What you drink if you are a Boca fan.