hobbies & projects. cars, watches, drones, code, whatever.
Quadcopters, tricopters, drones, FPV, etc. Been playing with all manner of these for a while.
Stuff coming one of these days. For now, here's a video:
How I learned to stop worrying about RSS and love the Twitter.
I get most of my web content through Twitter, but had a full RSS setup going as well. Google Reader's imminent shutdown threw me into an existential panic. How would I stay on top of my highly-organized tree of RSS feeds? A few days' survey showed me just how many reading apps are out there right now, RSS and otherwise. It also gave me a chance to think about whether I still cared if I read every single post. Did I ever really empty my Google Reader inbox? Nope.
I think many of us consume content in three distinct modes.
We look for headlines, snippets, and small nuggets of content that help us keep up on major developments of the day. Real-world news items, tech developments, the latest time-wasting meme, and other random tidbits that we find worth a glance. Scanning is us keeping on top of our stream. We do it nearly continuously, on whatever device we are primarily engaged with at that point in the day. We glance sideways at our phones while in line for coffee, notifications pop up on our laptop screens all day, we skim our email while walking to the shower.
Sometimes, because we still appreciate long-form content, we actually read something for more than a few seconds. This is a more deliberate activity, done in distinct chunks of time. We read interesting thought pieces on a tablet before falling asleep, we procrastinate in our browsers while temporarily stuck on a problem we're working on, we time-shift content to times when we're ready to think.
Because we're knowledge workers, we do need to do our homework. We scour the web for content relevant to our particular areas of interest. We save this content so we can keep track of it later. We're usually pretty anal about the system we use for doing this.
Coming to terms with these three modes has made it easier to get my head around my own content consumption patterns, and helped me figure out a tools workflow that fits them.
Here's a recipe, YMMV.
I ditched everything but Twitter and email. Almost anything I ever followed via RSS is on Twitter as well, often with more useful headline commentary. Everything else I was using RSS for, like keeping up with Google alerts, I can get via email. Twitter Lists, and the fact that you don't have to Follow people/brands/cyborgs to put them in a List, means you can get as segmented as you like.
First, separate Scan and Consider content at the source level. For Twitter, Follow anything you want to make sure you can scan. For email, let it come to your Inbox. You can then keep your Inbox and your main Twitter timeline open as often as you like, Growling at you all day if you want, and make sure you see what's going on.
For sources you know you don't really need to scan but will instead want to read when you have time (Consider), use Twitter Lists and auto-archiving GMail labels. Got some time, looking for something to read up on, that's when you browse those. This is akin to how many of us have used RSS feeds in the past.
To move items you've scanned into a consider mode, use a read-later service. The feed from that service you can then read right along with the rest of your consider content.
Forget the social aspects, use a bookmarking service to keep track of all that web-based content you think you might want later. I have tons of projects going on at all times, I have bookmarks in all of those areas, and many more.
I use Pinboard for this. I used to use Delicious, but, well, you know. I don't use Evernote, mostly because I think it's inefficient for web links and overkill for notes. Don't stress about the tags as long as they make sense to you. (As an aside, note and TODO management is a topic in and of itself that we'll talk about another time; short version is to use Simplenote and a good fast client like Notational Velocity.)
Make a structured schema when you organize your Twitter Lists (or even RSS folders) -- define your lists/folders/whatever and write a few words about what they mean. It may seem like overkill to start, but over time, as you build up lots of content and are constantly curating it, it helps to clarify the categories. This also makes it easier to avoid having sources in multiple categories, which makes managing read/unread state easier with clients that don't handle it very well (like most Twitter clients).
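The schema itself can be as simple as a name-to-definition mapping kept in a text file or script. A minimal sketch (the first two names echo lists mentioned in this post; "niche-drones" is purely illustrative):

```python
# Hypothetical list schema: each Twitter List (or RSS folder) gets a
# one-line definition so it's clear later what belongs where.
LIST_SCHEMA = {
    "peeps": "specific people I want to keep up with day to day",
    "technews-general": "broad tech headline sources, for surf time",
    "niche-drones": "FPV/quadcopter builders and parts shops (illustrative)",
}
```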
Get your apps and bookmarklets humming. Being able to easily time-shift content or save it for research purposes right from your reader is key. I use Chrome and Android. On the desktop, I find the HootSuite Hootlet, Instapaper bookmarklet, Pinboard+, and Pinboard (which feels like the old/awesome Delicious Firefox sidebar) extensions work well. On Android, it's Pindroid, Instapaper, and HootSuite.
Keep track of your shared links. Pinboard has a nice feature for this, where it can automatically tag any link you tweet into your bookmarks (using a non-customizable-but-still-ok "from twitter" tag). It can do the same for Instapaper.
A few common use-case examples...
To keep up with what specific people I'm interested in are up to, I put them in a "peeps" Twitter List, and Follow those I'm really interested in.
To get the latest tech news headlines throughout the day, I follow a number of key sources and have Growls firing all day. If I see something I'm interested in reading later, I'll use Instapaper to save it. I keep these sources in a "technews-general" Twitter List for those times when I want something more to surf.
To give myself a good list of stuff to read in a few different personal-interest areas, I put a bunch of niche sources into various Twitter Lists but don't follow them. I use my reader to check in when I've got time and am interested in checking out some content (with a tablet in bed is a good example). I'll use HootSuite for this.
I get other non-Twitter longish-form content in my inbox, like LinkedIn digest emails, and have them labeled but left in my Inbox. I usually read them in the mornings.
There are a couple of things I'm still not crazy about.
My feed-reading experience is still too much like reading a feed. I love Instapaper's "beautiful reading room", but I want that for my feeds, not just for content I send into it. I would use Flipboard if it had a desktop/web version, or Feedly if I could read Twitter lists with it (without maintaining RSS API URLs).
I'm keeping an eye on the social recommendations services, they're getting better all the time. I like Prismatic a lot for quality, if only I could get it in some sort of non-site-based channel. Flipboard, Zite, and others have decent options here as well, but I find them all too cumbersome still to include in my main workflow. I wish there were more players trying to provide the recommendations stream and not also the reader experience -- they are different things entirely and I'm not sure one app can really knock it out of the park on both.
I've been obsessing of late over deploy process stuff, as I'm working on a new stack that involves Heroku and a few other things that are somewhat new to me. My friend Chris just mentioned off-hand he wished there were a way to auto on-call someone in the hour after they push code. Yeah, that. So for the idea pile...hook into your deploy script:
I need to try this...
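A rough sketch of that idea in Python -- all names here are mine, and the record would be handed off to whatever paging or alerting setup you already run. The pusher's address could come from something like `git log -1 --format=%ae` in the deploy script:

```python
from datetime import datetime, timedelta

def post_deploy_oncall(pusher, duration_minutes=60, now=None):
    """Put whoever just pushed on call for the next hour (by default)."""
    start = now or datetime.now()
    return {
        "oncall": pusher,
        "from": start,
        "until": start + timedelta(minutes=duration_minutes),
    }
```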
I work with a cool and adventurous group of folks. Though all notable, of relevance here are those that form the Clearspring Motor Club. Yes, it in fact consists of current employees, non-employees, and former employees, but alas, the unifying spirit lives on. Motorcycle rides in the VA countryside, karting, and yes, car racing. Last year, the team did several "crapcan" races, to varying levels of success but with a consistent level of enjoyment. I did one of those -- the DC-area 24 Hours of Lemons race known as Capitol Offense. Great name. We just did it again this past weekend, here's how it went.
We run a '94 Volvo 940 wagon. It's gutted, with a full custom roll cage and lots of other critical modifications, like having its rear roof removed. Scoff though you may, it is super reliable, durable, and with significant lightening, tire, brake, and suspension work, became a formidable racer. Last year, the car was decked out in Cobra Kai livery -- you know, the bad dojo, mercy is for the weak, etc. Full costumes, flat black, cobra on the hood, it worked out well.
(more pics of the Cobra Kai theme at the 2011 race here)
But this year we took it back to the car's roots in the north of Europe and embraced another thing we love about Sweden, IKEA! Blue base with yellow stripes, we had similarly nice costumes and even named our IKEA product: the Effinkürber.
Theme organized and car more or less prepped, we headed to Summit Point for the big event.
First day at the track is basically a practice day and the tech inspection -- getting your car checked to ensure that it meets with the low standards of crapcan racing, and bribing judges. We had Swedish meatballs, another brilliant idea of Charlie's. The judges loved them (surprisingly very few showed any reservations about eating gross-looking Ikea meatballs out of a crockpot plugged into the back of a stripped Volvo).
Said meatballs earned us the coveted "bribed" stencil, which we gazed upon with pride. In addition to the four drivers -- Drew, Charlie, Stewart, and I -- we had assistance from Aditya, who for some odd reason decided to crew instead of drive for this, his first race.
With new brake pads, more flat black paint, and one turn of the rear suspension coils removed, we were ready for a day of racing.
P.S. See that awesome allen wrench? That's all Aditya.
The first of a planned two race days was going great. With organized pit stops and long, fuel-timed stints, we made good time. Despite a car severely lacking in outright power and speed, we were up to 22nd place in a field of almost 130. Yes, that field was far too large and resulted in a lot of laps under yellow, but still, it's racing. Drew and Stewart turned in some quick laps as usual. We were all doing pretty well, though, and looking to finish out the day strong.
I was at the wheel for my second stint of the day, heading into the evening and planning to finish out that day's racing. Without a lot of speculation or explanation, let's just say that an abrupt braking move by a car in front, combined with a car passing me at the same time in my blind spot, combined with going for a pocket that was probably too small rather than just braking and getting out of the way, resulted in a fairly extravagant crackup. By me, into a wall. I will simply say that, yes, while careening toward this concrete barrier, I didn't think it would end as well as it did. Actually, what was going through my head was something like "great, I'm going to run right into that, hard (and there aren't even any tires.)" Here's the in-car video. We also have a driver-facing GoPro shot, but it's missing the critical last 10 seconds or so right before the crash -- any video format recovery experts out there, I'm all ears as I'd love to have that.
::WARNING slightly scary car crash footage ahead.
The car was totally destroyed, and had to be scrapped at the track (extremely nice salvage company employees, by the way, at Remac Metals -- who knew). I am thoroughly bruised but really only have two badly sprained ankles and feet and a broken big toe/foot bone -- not bad considering. For this I credit the inherent durability of the Volvo frame design, the expertly-constructed and seemed-overbuilt-until-now roll cage that Charlie and Drew put together, and properly adjusted racing safety equipment in the car.
This shot of the motor is my favorite.
The team is already in the midst of a new car purchase -- another Volvo, though a bigger-engined 960 this time. Should be a great car. After not killing me, we trust Volvos. Also, this time we'll probably have a Hans setup and a more hard-core seat. Oh, and I'll be on the crew.
See you at the track,
I'm psyched about our AddThis release this morning -- the team has been rocking it. Here's what gets me going.
We released new social plugins to help websites increase traffic and engagement. Here's the official summary post.
I won't go too much into it here as it's covered pretty well in the blog post, but all of the new tools are backed by the typically-great AddThis analytics. But we're also now able to tell you about text that users are copying out of pages. Want to see the terms that are resonating with users, and might be good candidates for some intra-site links, or even be part of your SEO plans? We can do that. So on top of measuring ALL sharing on your site, we're measuring all sorts of additional user engagement as well. And yes, you can do this even without using AddThis for your site's sharing.
The most obvious piece of the release is probably that addthis.com looks completely different. It's a wonderfully clean and sophisticated visual presentation, and I'm extremely proud of the design team for pulling this one together. We even have some real-time geo data on the home page, no fakery. Some of the team that worked on this aspect of things will probably be talking a lot more about it on their blogs -- check in with Jim, Foo, and Jeff.
How big is AddThis, and how many customers do we actually have? We've put out some new public numbers:
We were big data before it was cool.
We've been known for a long time as a sharing tool. We're the biggest and the best, so great. But we're a lot more than that, and this is the first time we've been really out there talking about the extent of our platform and its capabilities, under one unified AddThis name. Surf the site, see all the different things we can bring to the table. Pretty proud of the portfolio: big data + social infrastructure.
So we've got a lot going on here today. And this on the heels of another announcement I'm proud of. I'll update with the press coverage.
Oh, and we're hiring.
The mod list:
One of the random internal projects I've worked on at Clearspring is the tool we use for managing development sprints. It's called TracBoard, and it's an interface on top of the open-source Trac ticket management system.
I didn't have any interest in maintaining and using separate systems for defects and detail work as well as for overall task management. So while there are a number of lightweight sprint planning tools out there, and a number of detailed issue-management systems, getting both integrated, with an experience we like, was not in the cards. So I built a relatively simple front-end for Trac data that gives a more whiteboard-style view.
It's not the most robust thing in the world (I built it as a quick internal tool), but hey, it works for us, and if there's interest in it, we'll put some more energy into taking it forward for more general use. You can check it out on GitHub.
What's old is new again. Basically the classic Bonneville style but given a healthy dose of modern reliability, electric start, hell it even has EFI.
The current mods list:
A few things I also have, but not currently on the bike:
More pics are in my Flickr set, as well, here's a glimpse:
Happy modding (and rockering).
This bike started as a 1992 Yamaha Seca II, a well-known 600cc all-arounder. I bought it for about $1100, in decent shape.
Here's roughly the order of operations for the redo. It took a couple of months of late evenings and early weekend mornings before the kids wake up:
Here's what it looks like now:
OExchange (an open spec project I'm helping to drive) got a good bit of coverage in the tech blogosphere this week, starting with its formal announcement on Wednesday. Here's a rundown so far. Congrats to all involved!
“As from a large heap of flowers many garlands and wreaths are made, so by a mortal in this life there is much good work to be done.”
Open URL Sharing Protocol OExchange Gets Support From Google, Microsoft Et Al.
digiday:DAILY - OExchange Protocol Standardizes Web Sharing Tools
New Protocol Attempts to Standardize Sharing
What the OExchange Protocol Means for Site Owners | WebProNews
Google, Microsoft Back OExchange Social Sharing -- InformationWeek
Social Web Blog: Introducing OExchange: An open protocol to simplify sharing
OExchange: a protocol for sharing links, designed for publishers and web services – lainformacion.com
Google, Microsoft & LinkedIn Unite for Open Web Sharing Protocol – OExchange
A Short Tech Introduction to OExchange - things
OpenWeb-Notizen: XAuth, OExchange, Firefox Sync, RDFa - notizBlog
OExchange: Open URL sharing protocol endorsed by Google, Microsoft and more
OExchange Tries to Make It Easier to Share on Web, Becomes Option to Facebook and Twitter
What is OExchange? | Scott Scanlon Website
OExchange Wants to Standardize Shared URLs - Technorati IT
Can OExchange Become The Standard For Sharing?
AddThis Blog » Blog Archive » The Future of Open Sharing: We Call It “The Web”
OExchange creates an open sharing services protocol
I put up a post over on the AddThis blog laying out a view of the "open sharing" landscape. Comments welcome.
OExchange is an open spec that I've been involved with for a while -- it provides a protocol framework for sharing URL-based content across the web. You can get more general background on the site, but here's a quick rundown of its actual technical details. You can also just jump right to the Quick Start Guide if you want to start supporting it.
OExchange deals with Sources, sites that have content to share, Targets, services that can accept this content (like social networks, translation services, whatever), and Users, people that use these things. There are three general pieces to the protocol:
1. Content exchange. How does the content actually get from the source to the target?
In what is known as the Offer transaction, sources send targets content by directing the user to a browser-based endpoint that takes the URL as an argument. For example:
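As a sketch of that call from the source side (the target endpoint here is a placeholder; a real one comes from the target's published discovery info):

```python
from urllib.parse import urlencode

# Hypothetical target offer endpoint.
OFFER_ENDPOINT = "http://www.example.com/offer.php"

def offer_url(shared_url, **extra):
    """Build the browser URL a source sends the user to. The `url`
    parameter is the required piece; `extra` stands in for whatever
    optional content parameters the target accepts."""
    params = {"url": shared_url}
    params.update(extra)
    return OFFER_ENDPOINT + "?" + urlencode(params)

# offer_url("http://another.example/some-page")
# → "http://www.example.com/offer.php?url=http%3A%2F%2Fanother.example%2Fsome-page"
```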
The target site ingests that content in some appropriate way, and messages the user when finished. The source will have sent the browser there in its own tab. This simple case is (intentionally) compatible with a huge majority of services deployed live on the web today, and is the minimum compliant OExchange transaction. There are additional content parameters that the source may pass, and the target may accept, all on an optional basis. These include things like Flash objects and image URLs. The key concept is that there is ALWAYS a URL being shared as the primary entity, and that URL may also self-describe in a variety of extensible ways. Take a look at the Offer specification for all the details on the call.
2. Service discovery. How do you figure out how a target works, or even that one exists? How do you integrate with targets you don't know about beforehand?
A target should specify its details, including the location of its Offer URL endpoint, in an XRD document that resides somewhere on the host. This is just an XML document that uses a set of specific tags, and looks like this:
<?xml version='1.0' encoding='UTF-8'?>
<XRD xmlns="http://docs.oasis-open.org/ns/xri/xrd-1.0">
  <Subject>http://www.example.com</Subject>
  <Link rel="http://www.oexchange.org/spec/0.8/rel/offer"
        href="http://www.example.com/offer" type="text/html"/>
</XRD>
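On the consuming side, pulling the offer endpoint out of a target's XRD is straightforward with any XML parser. A sketch (the rel value is the one defined by the spec; the sample document in use here is minimal and hypothetical):

```python
import xml.etree.ElementTree as ET

# XRD elements live in this namespace; ElementTree uses {ns}tag syntax.
XRD_NS = "{http://docs.oasis-open.org/ns/xri/xrd-1.0}"
OFFER_REL = "http://www.oexchange.org/spec/0.8/rel/offer"

def find_offer_endpoint(xrd_xml):
    """Return the href of the offer Link in an XRD document, or None."""
    root = ET.fromstring(xrd_xml)
    for link in root.iter(XRD_NS + "Link"):
        if link.get("rel") == OFFER_REL:
            return link.get("href")
    return None
```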
How do you indicate, in a page, where the location of the sourcecode for the software in question is located? How about something like this?
<link rel="source-code" href="http://www.example.com/git/whatever" />
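A consumer could then pick the link up with ordinary HTML parsing; here's a minimal sketch using Python's standard library (the class name is mine):

```python
from html.parser import HTMLParser

class SourceLinkFinder(HTMLParser):
    """Collect href values from <link rel="source-code"> elements."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Also fires for self-closing <link .../> tags via handle_startendtag.
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "source-code":
            self.links.append(a.get("href"))
```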
How to Use It
The relation, simply put, allows you to link to a resource that contains the source code related to the current resource. The relation takes on two primary uses:
What do you think -- sensible? Does anything close to this exist already? It seems there is a wide variety of ways to get a relation into common use, ranging from formal RFCs to a shared wiki to just throwing it into the blogosphere. But hey, I just wanted to float the idea.
[Part of my Mazdaspeed 3 project series -- see cars]
A video of this car from AFKfest 2009 in Baltimore.
And a bit of background in the NY Times.
I have a 2008.5 Mazda MAZDASPEED 3 (affordable and speedy) with a growing list of modifications. A full custom Car PC/carputer setup, audio system, a bunch of appearance mods inside and out, and some engine mods next on the list.
This is a summary of what I've done to the thing in general, check sub-pages for details on the electronics and software stuff.
Here's a shot of most of the interior:
Waiting for the warranty to lapse first, here's what's planned:
There's a lot of stuff going on here, including an audio and carputer setup.
Take a look at my ELECTRONICS PAGE.
Interior, exterior, PC, and install pictures are all in MY FLICKR STREAM.
This kind of thing is easy once you find people to talk to about it with you. Specifically, many thanks to all of the good people over at:
Also, though it can get a bit spammy, it's nice to be able to find other installs on CarDomain.
[Part of my Mazdaspeed 3 project series]
In the course of working on my MAZDASPEED Car PC I came up with a few graphic and other assets that helped in all kinds of different skinning and customization tasks. Most of these are edited or chopped up from original open source stuff found online, some with pretty substantial edits, and some are original. Either way, if you're working on a Mazda setup, or any other setup with red/amber interior lighting, feel free to grab anything here. You can also check other pages for stuff specific to DashCommand, Centrafuse, or backgrounds for Alpine head units.
A few silly audio clips that I use as system start/shutdown sounds, made by recording a text-to-speech voice. You know, for kids. "Save as" to grab the clips.
Looking for something red/amber and abstract looking for the background on your car pc desktop? Here are a couple I use (if you can figure out who made these originally, let me know, as I can't find the right attribution other than to know that they were free)... Click for the full-res version and save-as to download.
It took me a while to find logo type images I liked, for Mazda as well as MAZDASPEED. I then spent a while tweaking them for different sizes, color schemes, and formats. I use these, for example, in my Centrafuse skin and on my Alpine head unit skin. Here are a few representative examples:
You can download a bunch of different versions of these, in different colors and sizes:
And because no system is complete without the uber-retro Monaco font:
[Part of my Mazdaspeed 3 project series]
Centrafuse is a so-called "front-end" application for Windows-based Car PC setups. Front-ends are designed to be the primary thing running on a system. Their interfaces are optimized for touchscreen use, and serve as the user interaction point for everything else you do with your PC while driving. Though you can certainly just launch (and switch between) applications as you normally would on a Windows system, using a front-end really does improve the look-and-feel as well as the experience, and makes the Car PC feel more like an OEM system.
Some points about Centrafuse, specifically:
There are several other front-ends available, including the popular Road Runner (now RideRunner) and StreetDeck, as well as a host of lesser-known ones. While in my opinion they all have the same general capabilities, I prefer Centrafuse overall (even though it's commercial and closed-source). It has the right balance in terms of functionality, customization capability, and design -- it doesn't do everything, but it does what I need it to well enough without a ton of setup work, has good-looking skins, and is pretty stable. I have plenty of feature requests but no huge gripes (well, beyond there being no built-in or plugin RSS reader).
In my MAZDASPEED 3 I use the Aura BW Red skin (a very simply modified version of the default Aura) for daytime use, and the Gizmo Red skin for evening use, with Centrafuse set up to switch between them automatically. I especially like Gizmo Red; it looks great at night. I made a few simple tweaks, including changing the logos, to make both of them fit with my car, but nothing major (I used some Mazda-specific graphic assets for the logos and startup screen and such). The skin ends up looking like this (clickthrough for more):
I did have to make one small tweak to Gizmo Red's config file so that the media controls would NOT be visible in fullscreen mode (they were messing with my embedded apps), but that was not a big deal. Also, note that the colors in the image above aren't quite accurate due to a weird screenshotting issue; in reality the color is much more reddish.
Though it probably isn't the best idea in terms of paying attention to the road, one of my favorite things about having a Car PC is the audio visualizations. Centrafuse actually uses Sonique's visualization system, so you can install other visualizations than what Centrafuse comes with. Some of them do take more CPU, but IMHO they are worth it if you like visualizations. There are a bunch here on XMPlay, and I've copied some locally since I'm not sure of the status of that site.
Generally I use the Navigation, embedded DashCommand, and the media playback the most in regular use, with the occasional net-enabled activity (like a twitter client) just for fun or to demo to somebody. I've also found that there are a ton of plugins available that, frankly, just aren't that interesting at the end of the day. I'd rather keep things focused and pull out functionality that I don't actually use all the time or that doesn't make a cool demo. Some highlights of my setup:
I am still looking for a better way to get Twitter, general RSS feeds, IM, and Skype integrated. Any ideas on a smoother integration, let me know. I haven't had a ton of luck with the CFSkype plugin, for example. I'm also keeping an eye open for the soon-to-arrive major upgrade to Centrafuse, though as I've said before there really isn't a ton of stuff I need it to do that it's not already doing. If you give it a try, good luck! Note also that Centrafuse has a forum on their own site, as well as a section on MP3Car.com, so there's plenty of help available from other folks.
[Part of my Mazdaspeed 3 project series]
One cool thing about the Alpine head units, like the IDA-X100 I have, is that you can load custom backgrounds onto their full-color LCD displays. It's a small screen, but when it's one of the few things in the dash it's nice to have it match the color/logo scheme, especially when there's a Car PC involved.
Alpine has a pretty mediocre web tool, i-Personalize, where you can download new backgrounds for your head unit. Once you've got an APN file (some kind of non-standard image and metadata encoding), you put it on the root of a USB mass storage device, plug it into the head unit, then select the custom background option via the head unit's interface. What's not immediately clear, though, is how to create your own APN files based on your own images. I came across this great tool, which does the basic conversion from image files of various formats to the APN container format, and then lets you download it (nice work, Josh). It works for at least this family of head unit, not sure about others.
So again, the process goes like this:
I made a few backgrounds for Alpine units that loosely match the amber Mazda interior, based off of some of the logo and background assets I've used for general skinning and customization of my own car pc setup. Feel free to grab the images, or the APN files themselves.
PNG source image
APN background file (save link as...)
PNG source image
APN background file (save link as...)
Hopefully this helps, good luck with your customization efforts. Let me know how it goes or if you discover anything else useful.
[Part of my Mazdaspeed 3 project series]
I have a large and growing set of electronics -- audio and computer/carputer/Car PC gear -- in my car (it's a Mazda MAZDASPEED3). Here's some information about it.
Here's how everything is laid out:
As far as audio-only goes, I have a pretty-good aftermarket audio system, sounds great but didn't cost a ton:
The system gives me plenty of input options, except for a CD, by design, and I use it primarily with my Car PC on the AUX line (hard-wired from a USB sound card in the dash). The amp and power distribution is in the hatch, more details as part of my install overview.
In the top part of the dash, I took out the factory head unit and replaced it with a double-din-capable dash piece I got from a guy in Japan. I took apart the Liliput touchscreen and built it into the back of the double-din piece, which meant I was able to use the bezel that came with the screen, once cut up a little bit. It ended up coming out fairly well, as you can see in the pics.
I put the new Alpine head-unit in an area in the center console previously housing a lighter, coin tray, and other useless stuff, which required fabrication of a mount for the unit, and a plexiglass trim piece to frame it in. I also put a few turn-on switches here (one for the head unit and one for the PC). All of this involved some serious cutting up of the center console. I used plexiglass specifically to give it the gloss black look to match my piano black factory trim.
I made a panel to fit the opening in which a pair of cup holders sat originally, and replaced it with the Space Navigator, a pair of arcade buttons wired to a Mini-PAC key encoder, and an LCD screen. This bolts into the plastic and ends up sitting nicely in the center console down where it is easily accessible, just to the rear of the shift knob.
In the hatch area, I took out the floor and the spare tire and built a custom sub box and a false floor. The top of the sub box sits higher than the false floor, such that with the equipment mounted on the floor, the trim panel that forms a false floor on top of that (with 3-4 inches between) sits flush with the top of the sub. This gives an overall flat floor in the hatch, with a recessed sub cone and a recessed tray of sorts for all of the equipment. This gives me an area in which to mount the current as well as any future electronics in the trunk, in a nice and stable place with plenty of room for reconfiguration, new equipment, whatever. I intentionally set it up this way rather than cutting trim panels to fit specific equipment, as I am sure things will change with the install over time.
To connect all of this, I had the car completely apart and installed all of the wiring that way. And, through it all, I only ended up scuffing a few spots, nothing noticeable compared to the damage it's possible to do when taking your car apart over and over again.
This thing runs Windows XP. I debated this for a long time, having wanted to use MacOS or some kind of Linux flavor. In the end, though, there just isn't as much software available for the other platforms. It's nice to be able to write code and all, but unless you want to write a ton of stuff and make the coding a majority-time hobby as well, go with Windows (I already have enough coding side projects). The front-ends for Windows are definitely nicer.
The main software in my system is:
Getting the PC running the way I wanted it took more time than I thought it would, and I didn't even do a whole heck of a lot. Notable things:
The main things I may still do are modify the boot/hibernate/etc. screens via some customization tool, and modify the BIOS logo. I haven't done this so far because I still have back-burner plans to power the LCD on via a relay triggered by the FusionBrain on Windows startup, so you'd never see anything prior to the shell regardless.
So again, feel free to download any of the stuff mentioned here and on the other pages.
Main things I learned and need to remember, if they're helpful:
As soon as things started looking normal again, I started realizing how much else I wanted to do. Things on the plate:
Work-in-progress stuff is in my mp3car.com work-thread.
There are a bunch more pics of the exterior, interior, mid-install, and screenshots of the PC in MY FLICKR STREAM.
[Part of my Mazdaspeed 3 project series]
DashCommand is a commercial, real-time OBDII application for Windows. It's designed for virtual instruments on touchscreens and in full-screen mode; basically, for in-car gauge applications. I use it as part of my Car PC for my digital gauges (with my PC hooked into the car via a USB OBDII reader).
DashCommand, itself a cheap app, is part of a product suite that also includes some higher-end packages for in-shop and tuning use. I picked it, and still use it, because:
There are frankly also some less awesome things about it:
All of that said, I still think it's pretty much the best option for building highly custom virtual dashboards for in-car Windows Car PC systems. I had toyed with the idea of building a daemon (likely in Java) to read OBDII parameters and present them on the network, then building a Flash or Flex front-end to display them (since there are a lot of nice Flash gauges out on the web). In the end, though, I wanted something that wouldn't require so much time and energy to get working. So with that, I'd recommend DashCommand to others with the same constraints.
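For the curious, the daemon idea I abandoned could be sketched in just a few lines. Here's a minimal, hypothetical Python version (in place of Java) that serves parameter snapshots as JSON over HTTP for any network front-end to poll. The parameter names and the stub reader are assumptions for illustration; a real version would replace the stub with an actual OBDII session (e.g. over a serial ELM327 adapter).

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stub in place of a real OBDII reader session. The PID names and
# values here are made up purely for illustration.
def read_obd_parameters():
    return {"rpm": 2450, "coolant_temp_c": 88, "boost_psi": 7.2}

class ObdHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve the latest parameter snapshot as JSON to any
        # front-end (Flash gauge, browser, etc.) on the network.
        body = json.dumps(read_obd_parameters()).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the in-car console quiet

def serve(port=8808):
    """Start the daemon on a background thread and return the server."""
    server = HTTPServer(("127.0.0.1", port), ObdHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

A Flash/Flex gauge panel would then just poll `http://<carpc>:8808/` a few times a second and animate from the JSON values.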
My MAZDASPEED 3 has all-amber interior/dash lighting, so my whole Car PC setup matches that. That involved skinning DashCommand along with everything else. I made a DashCommand "skin", which controls the main buttons and options in the non-dashboard part of the interface, as well as two different "dashboards" (the things with the actual gauges). Both are pretty simple once you get the hang of them, though the customization can be overly tedious with the provided editor, as noted above.
The skin asset is just a ZIP archive of individual DashXL files (.dxl), one for each of the main UI elements. I just pulled apart the example skins to see how that worked. Drop the ZIP in your skins directory and then load it up through the app's settings interface.
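If you end up editing .dxl files by hand, re-packing the archive can be scripted. This is a rough sketch; the filenames here are hypothetical, so pull apart one of the example skins to see which files your archive actually needs:

```python
import zipfile
from pathlib import Path

# Hypothetical DashXL filenames -- inspect DashCommand's example
# skins for the real set of UI-element files a skin must contain.
DXL_FILES = ["main_menu.dxl", "settings.dxl", "dashboard_chooser.dxl"]

def package_skin(source_dir, out_path="MySkin.zip"):
    """Zip the individual DashXL files into a single skin archive,
    ready to drop into the DashCommand skins directory."""
    source = Path(source_dir)
    with zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for name in DXL_FILES:
            zf.write(source / name, arcname=name)
    return out_path
```

After that it's the same drop-in-the-skins-directory, load-through-settings routine described above.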
For the Dashboards, here's my "Mazda Factory" dashboard, which I made to look mostly harmonious with the actual factory gauges in my car:
And the DashXL file...
And lastly, this is what I actually drive around with most of the time. It's not the most readable, but I like it aesthetically. I was going for a simple aircraft-style display, with the linear-scrolling speedometer on the left:
And the DashXL file... note that you'll need the Monaco font on your system for this to look like the screenshot above:
Feel free to download and customize; any feedback's welcome, of course. I'm still debugging these a bit, so there may certainly be issues here and there. Let me know in the comments if you do try to use these and how it goes.
Since I use Centrafuse for my front-end, getting my virtual gauges integrated was a simple matter of adding DashCommand as an external application, launching it in fullscreen mode, and setting the app itself to launch in fullscreen, hidden-mouse mode (DashCommand has command-line flags available for this). The only moderately non-trivial aspect of the whole thing was designing my dashboards with an aspect ratio that maps to the dimensions of my screen minus Centrafuse's title bar and nav controls. In other words, though DashCommand can scale the UI to the appropriate rectangle when it's loaded, I set my dashboards to fixed aspect ratios to ensure that my gauge display always looks optimal when running inside Centrafuse's frame.
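The aspect-ratio arithmetic is trivial but worth writing down. The pixel numbers below are assumptions for illustration, not my actual screen or Centrafuse layout; plug in your own measurements:

```python
# Hypothetical numbers: an 800x480 in-dash touchscreen, with
# Centrafuse reserving 60 px for its title bar and 80 px for
# its nav controls at the bottom.
screen_w, screen_h = 800, 480
titlebar_h, navbar_h = 60, 80

# Height actually left over for the embedded DashCommand window.
usable_h = screen_h - titlebar_h - navbar_h  # 340 px

# Design your dashboards to this fixed width:height ratio so the
# gauges fill the frame without letterboxing or distortion.
aspect = screen_w / usable_h
print(round(aspect, 2))
```

Designing the dashboard canvas to that ratio up front is what keeps DashCommand's scaling from letterboxing or stretching the gauges inside the Centrafuse frame.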
There's been no shortage of discussion on Facebook of late -- Facebook as the new Google, Facebook vs MySpace, etc. etc. -- and their overall strategy deserves some discussion, to be sure. But you've read all of that by now. As with any great period of hype and furious activity, there are also by now some misconceptions about what you can actually do with Facebook from a practical perspective. I obviously have no idea what I am talking about, but here are some observations after writing some Facebook code (for Cruxy) for a little while now, and why I think all those Facebook apps in your sidebar aren't what is going to matter.
In general, the power of the Facebook platform is NOT in the Facebook-resident applications, but in the non-Facebook applications that use Facebook services. There has been some discussion on whether Facebook apps are widgets, whether they offer better extensibility than MySpace (here, here, and here are some good examples), and related topics, and there are some good thoughts there. Personally, though, I think the compelling point isn't that Facebook is a new platform into which you can inject dynamic application content (widgets or otherwise). It is that Facebook offers a platform API that you can use to build social-networking functions into your own, separate, application. This is clear, but it is not what everyone is talking about. Even though Facebook is the cat's pajamas right now, and yes, it's a cool destination that a lot of people go to, over time it's hard to imagine a single destination where everyone in the known universe would go to use all kinds of disparate applications integrated only through their set of "friends". Social networks, over time, are inherently too interest-oriented for this to be a viable model. The brilliance in this overall move by Facebook is not as a super-extensible, friend-to-all-developers application container; it is in its ability, through the APIs and not just through hosted canvases, to offer a core social-networking library to web developers. How relationships are managed, navigated, etc., is value Facebook can add. Being a one-stop shop for all of your application needs is interesting at the moment, but it's not going to be Facebook's long-term success.
For anyone that doesn't agree with my general premise that big media and big tech will turn web 2.0 into American Idol style boredom, CBS acquiring last.fm is arguably a good example. I am a long-time user of last.fm, and a huge fan of the AudioScrobbler stuff underneath it. It was great before it was a social network. I was a user when the blogosphere was down on them. The CBS news is great for the last.fm team (congrats), but as a general sign of the consolidation trend...hmm. There's nothing inherently wrong with CBS buying them, but for anyone that believes in the culture of innovation, startups, and the true long tail, this situation in which increasingly the only exit for web startups is acquisition by one of 15 or 20 big players is nothing to be happy about. I loved FeedBurner, I loved Keyhole, I loved last.fm. It's the "big-boxification" of the web, my friends.
Warning: This (long) post contains rampant speculation, unfounded arguments, and poorly structured reasoning.
Web 3.0 seems to have entered the lexicon of the blogosphere, though it remains undefined. Well, a lot of folks define it as the semantic web. This is cool from a CS perspective, but the 2.0 nomenclature wasn't fundamentally a technology designation, so it doesn't really seem right that 3.0 should be. The semantic web should really be web 2.0 in that sense, but anyway. I'm a little more interested in the web's evolution through usage and business models. So in that sense, what's 3.0? I play the part of a psyched-about-the-future geek all day, so I'll make the devil's advocate argument for the sake of contributing to the discussion -- whether I believe it or not is sort of a separate question (I don't). It's good to argue against yourself once in a while, right?
The premise is what I'll call the American Idolization of the web: The next phase of the web experience will look more traditional, with a standard blend of "mainstream" and "alternative" content powered by a federation of a small number of large players, than any of us geeks want to admit.
Taking an industry/social-trend view, rather than a technology-centric view (semantically-aware, user-mashupable, etc), of the web's next phase, we're closer than we might think to the point of being able to use Tim O'Reilly's "this is qualitatively different; let's call it web 3.0" rule. Whatever we call it, the point at which the current 2.0 version of web hits critical mass with the mainstream is a big enough deal that we should call it out, it's already happening, and it may not be as cool as we all thought.
So where have we been?
We'll focus on these as (one guy's version of) practical reality in terms of how the industry and the technology have developed -- we could talk all day about the original intent of TBL and others, whether "web 2.0" was a useful name, whether what we call web 2.0 now was simply what was really intended from the start, and whether there were social networks 15 years ago. Here is a good jumping off point to a relevant set of discussions with O'Reilly and Tim Berners Lee, two smart guys.
Back to the premise. We all love to talk about AJAX, tagging and folksonomies, microformats, user-generated content, personalization, widgets, social networks (or social networking platforms), and the rest. I love Clay Shirky and David Weinberger and their views on structuring information, and the debate over the semantic web, strongly vs loosely defined taxonomies, RDF, etc. I love Marc Canter's Digital Lifestyle Aggregator concept, which presents a view of the class of services that will tie all our crap together and help us make sense of it. I love the distributed computing paradigm that simple widgets and start pages will evolve into. I love this stuff because I'm a geek. But I love to talk about it in the same way that those crazy Spanish cooks love to talk about foams and airs and their influence on world gastronomy. We're the Dean (or Obama?) supporters, and it's great to be there, and great stuff happens, but Bush still won. It's easy to forget about the poor old mass-market user, and what's reasonable or unreasonable to expect that they will do (and pay for), in the midst of this debate.
I'll use my lovely wife as a use case. She has a highly successful corporate career, a huge professional as well as social network (offline), and couldn't give a damn about del.icio.us or the Facebook platform. She's an email/calendaring user, an e-commerce shopper, an online bill-payer, even watches stuff on YouTube -- she's like a lot of "regular people" in her use of the web. She likes geni.com not because it's a social network play for the family or because it has a cool Ajax UI, but because it's an easy way to create a family tree. How much does she really care about "web 2.0"? Not at all, and neither do most people. And by most people I mean most people in a certain set of socio-economic bands, which still represent a minority of the population -- we're not even talking about the rest of the world.
Web 1.0 (and prior) kicked off a massive cultural shift -- it really was, and is, something revolutionary. All of a sudden people began to communicate with one another in totally new ways, began to take advantage of daily-life convenience services never before thought possible (my parents tell me that balancing a checkbook on Sunday evening used to be a major event), gained access to new amounts of information about themselves, their health, their communities, and their governments. Parts of the world began to open up with new definitions of access, and average folks got their feet wet with the concepts and some of the basic tools (e.g. an email client, a browser).
This has been a continuing journey over the last several years -- more and more applications, better and better interfaces, delivered via greater and greater penetration of broadband and PC access. Personalization in e-commerce is just one of many great incremental technical achievements we've seen large groups of "regular" users benefit from. Developers and technologists have been feeling their way around the workings of these things -- learning how to take a service-oriented view of the world, how to build richer user interfaces in a browser, the virtues of content syndication, and how to collaborate with one another more and more effectively. Similarly, users have been experimenting. It will be clear in the final analysis, if it's not already, that the emergence of the blogosphere, of big-S-big-N Social Networks, and of user-generated content in general over the last few years were seminal events, but they were just very introductory baby steps. Web 2.0 is really about slight technical paradigm shifts and online social experimentation, and makes sense in the long run only as a necessary interim step to what comes next. Again, not in terms of what the formal definitions may be, but in terms of mass-market adoption, end user behaviors, and business applications (valley VCs don't count as representatives of modern global business).
The Next Wave
What's next, then, what we might call web 3.0, is when the participants collectively hit their stride with what are now more accurately seen as anomalous events, and when the real money starts coming in. Take a look at the distribution of information across the web today: even though there are millions and millions of nodes on the web, it's a small number that get most of the traffic, hold most of the data, and provide most of the widely used applications. Though there are millions of blogs you can read to get news and opinion, users don't read millions of blogs; they read a few (not including mine) or, more likely, they read a summary view that is aggregated by someone else, someone that looks a lot more like mainstream media (my hero Dave Winer has some other thoughts on MSM in web 3.0 here). They listen to NPR, maybe watch FOX. They don't surf all the Technorati results; they go to a few aggregate locations. They don't use feed readers or start pages to assemble their own custom dashboard of applications and content from across the web. They watch video online, but they still sit down to watch American Idol as it's broadcast and spend more time with a set-top box than with a PC hooked up to the TV. Again, keep in mind we're speaking about the general population here, not us technophiles.
Web 3.0 is where we'll see the convergence of MSM outlets, mainstream portal providers, and big enterprises sucking in all the cool stuff that came on the scene in web 2.0 -- that's where the vast majority of users will experience media, get and share opinion, accomplish the tasks of life and work, and be entertained. There will always be a huge current of leading-edge users that embrace the full spectrum of services available to them, in real time, and remix them to their needs, but in the grand scheme of things this is a tiny minority of users. It isn't until the user numbers on these things increase that these concepts get really interesting. People consume Digg stories indirectly because the interns at CNN read Digg and then feed them up for the main coverage, people participate on the message boards of major TV networks to talk about their favorite character's hairstyle, and enterprise users start to see more wikis and blogs deployed internally.
And with respect to social networks, we are talking about big numbers already. But we're also talking about hugely transient models. We've seen a few networks now come and go. We've seen that there is a high degree of tolerance for, and interest in, people having individual presences on the web, that they can interact with and connect to others and to their interests. We've also seen that while that's interesting, there's not a whole lot that's new in the grand scheme of things -- it's just a tech-powered societal update to hanging out at the mall, going to a tupperware party, whatever. It's not the same level of transformation that took place when the web first began to see widespread use in the 90s. The real transformation there will be about the long-tail networks. Some of the longest-lived communities on the web today still exist as collections of people that email one another, or maybe even hang out on some crappy old bulletin board implementation.
So wrapping up this devil's advocate view of what mass-media Web 3.0 will be, some trends we'll see:
So What Now?
Nothing -- remember we're just playing devil's advocate here, and I'll probably post the opposite side of these arguments at some point soon ;-). If you read all this you get a gold star.
There's plenty on the web about web 3.0, the semantic web, and other people's opinions on where it's all going. Here are a few posts I personally find interesting:
There's lots of interesting activity in the platform-play world these days, and all the recent Facebook stuff is a good excuse to talk about it. Platform strategies have been written about extensively over the years, but it seems like a new gen of web 2.0-style players have gotten the platform bug. While most web 2.0 strategies include an API component, by platform we mean the extension of a core API/dev offering to include the larger ecosystem of applications, tools, developers, and partners around it. Here are a few interesting examples (my opinion) of some of the recent platform strategies coming out of what we might loosely call Web 2.0:
Parting thoughts: two great "classic" pieces on platform strategy for your reading pleasure:
DISCLAIMER: There has been quite a bit said already about the challenges that convergence (of voice/data, wireless/wired, vendors, user-experiences, device types, etc.) brings to the telco industry, and how various players are, or are not, adapting to these challenges. The smart folks at Unstrung, Gartner, GigaOm, Telco 2.0 and others have thought, and written, long and deeply about these issues. I speak with some purely personal observations.
I'm an Interweb guy that worked for a big-4 US carrier for a few years, and lived to tell about it. I was one of the "Internet people" there, amongst a majority of "telco people". Back on the other side, and with a keen appreciation for how closely these two domains are tracking toward one another, I am struck again by the vastness of the schism between them. More significantly, though every carrier gives lip service to convergence, IMS, and the need for change, and everyone sees the writing on the wall, the opportunity for a name player in either space to take small but effective steps to reconcile telco-land and TechCrunch-land remains on the table.
At the end of the day, if this is all going to get better, and if we're really going to see the next wave of mobile apps, we need to open the dialog. Why is there such a dearth of carrier discussion at all of the web 2.0 conferences (or, better, let's have some unconferences)? Why is it only a handful of bloggers that hit these topics regularly? (here's a great one) Why does the majority of the discussion on the future of mobile services from the blogosphere focus so much on the advertising model and on spectrum positions? There are any number of other topics we should be discussing at the community level, in the same way we discuss the virtues of this or that way to distribute music online, or who got how much VC money.
Others have started the dialog, here's to keeping it rolling!
I've been thinking a lot lately about this whole situation with the valley, the boom, TechCrunch, and my experience being in a startup somewhat outside of that insider world (though being quite fulfilled, thank you). And, lo, Mike Arrington finally breaks down and does it himself. This is really priceless. OK, a couple of quick thoughts:
This one bit of insider drama is much welcomed by those of us outside the fishbowl. Good show to Mike for at least saying how he feels, now let's see what he can do by channeling it in a positive direction -- he has the power, for now anyway.
subscribe via RSS