Posted to the Haskell mailing list:
I'd like to announce Turbinado, a very young and raw MVC web framework for Haskell. While the framework doesn't exactly copy Ruby on Rails, it certainly rhymes... It's very early days for Turbinado, but the framework is moving along nicely. There are still issues to be ironed out and architectural details to be decided, so help/contribution would be very much appreciated.

Turbinado can be found at: http://www.turbinado.org

The source can be found at: http://github.com/alsonkemp/turbinado/tree/master (see the /App directory for the code for www.turbinado.org)

Turbinado:

* Provides a fast web server (based on HSP; see http://turbinado.org/Home/Performance);
* Provides a straightforward organization for your website (courtesy of Rails);
* Uses simple HTML-like templating (courtesy of HSX);
* Is easily extensible (courtesy of an Environment built out of _Map String Dynamic_, not the most type-safe of beasties; help!);
* Has configurable routing (see Config/Routes.hs).

Turbinado is currently lacking:

* Documentation...
* An easy install...
* A database ORM based on HDBC (visibly incomplete and ugly in Turbinado/Database/ORM);
* Many more HTML helpers;
* Controllers for partials (lightweight "controls" a la ASP.NET);
* Strong error reporting and handling;
* Lots of functionality and plugins;
* ...the favorite feature that you want to develop for Turbinado...
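To give a flavor of the _Map String Dynamic_ Environment mentioned above, here's a minimal sketch using `Data.Dynamic`. The function names are illustrative only, not Turbinado's actual API:

```haskell
import Data.Dynamic (Dynamic, toDyn, fromDynamic)
import Data.Typeable (Typeable)
import qualified Data.Map as M

-- A string-keyed bag of dynamically typed values.
type Environment = M.Map String Dynamic

-- Store any Typeable value under a string key.
setEnv :: Typeable a => String -> a -> Environment -> Environment
setEnv k v = M.insert k (toDyn v)

-- Retrieval fails (returns Nothing) if the requested type doesn't
-- match what was stored -- a runtime check, which is why this isn't
-- the most type-safe of beasties.
getEnv :: Typeable a => String -> Environment -> Maybe a
getEnv k env = M.lookup k env >>= fromDynamic
```

The type errors that Haskell would normally catch at compile time become `Nothing`s at runtime here, which is the trade-off the announcement is asking for help with.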
See here for a much-discussed article on how RMS has said he hates cloud computing. I’m usually pretty on board with what RMS says (after I filter out the uber-geek bias), but I’m on the fence about this issue.
My counter-argument to RMS’s argument is: damn, but online services are just too handy to leave aside in order to maintain software-development philosophical purity. Would you give up online e-mail, social networks, etc.?
Besides, I love having fewer applications installed on my computer. I install and focus on the applications I really, really need and the other needs are served out there in the cloud.
RMS’s comment strikes me as purely academic with little consideration for What Really Works. I hope that online e-mail doesn’t go away, but I’d love to know how to make it more GNUy.
BusinessWeek has more column inches on the pressing subject of the decline of newspapers… I wrote a bit about this last week, concluding that we should probably be more worried about what to do with all those big-brained reporters than about what to do with newspapers. Jon Fine focuses a bit on the topic of what to do with the reporters, but (probably courtesy of not looking very hard) I haven’t generally seen much discussion about what to do with the reporters, who are the big asset in the newspaper biz. And if we have had loads of discussion about how to fix the industry, then why are we still writing articles about the fact that the industry is dying?
I’m comfortable with Firefox. Stable, loads of developer tools, etc., but Google’s Chrome looks awful nice and is screen space efficient. Fortunately, some lovely fellow has made a Firefox add-on to make Firefox look like Chrome:
This add-on hides the Firefox toolbar (as opposed to Chromifox, which only provides colors and images).
“For example, the Times‘ software architect Derek Gottfrid showed me his Times Machine, an app that lets you search and read old copies of the paper. I’d pay for that.”
I’m sure that the Times Machine is super cool, but I’m a bit puzzled by these articles. I’m an old man (when measured in Internet years) and I have no idea why/how articles on how the newspaper industry can thrive/survive are relevant to me. Scoble has to fill column inches, but he’s kinda trod this ground before.
Big Juicy Brains
Seems that what we really care about is how we keep all of those big-brained investigative journalists employed gathering news… And a Times Machine app doesn’t seem to be the solution to that problem. Blogging doesn’t feel right, either. Loads more fun certainly, but blogging usually serves up derivative works, not the original, investigative pieces of the big newspapers.
I’m not strong on the history of the Associated Press (or any of the other news production aggregators), but the Newspaper vs. Blog situation certainly points in the AP’s direction. Newspapers mix advertising and content origination (reporting) together in one big package. They also pay the AP for larger pieces of content that they’re happy to share with all of the other newspapers: international news; local stories which have high national relevance; random science stories; etc. Basically, anything that the paper wants to report on, but doesn’t want to hire the people to do the reporting. The Newspaper v. Blog situation suggests to me a complete decentralization of news production and news distribution, as opposed to the current situation in which newspapers mix the two.
So now we’re in the glorious future in which the AP produces news, legions of bloggers distribute the news and you subscribe to the feeds of 4-5 of your favorite bloggers so that you get both their news selections and their news interpretations. Lovely. So how does the AP make money? Naturally, the bloggers are all making money on advertising. So the AP should get a cut of the advertising revenues, but the AP can’t trust me to accurately report my advertising revenues.
The AP could create a partnership with a large advertising network or two and adopt a You-Publish-My-Content, You-Publish-My-Ads business model. Adopt a revenue share (60% to the publisher, 40% to the AP?) and let the blogosphere go nuts with the content. (They’d probably need to hire a brigade or two of lawyers to make examples of a few people who abuse the system.) I’ll get the bloggers I love feeding me digested bits of AP content with the particular spin I love. I’m happy to be advertised to, and the blogger will probably be happy to share revenue with the AP in order to get access to primo content.
Lots of wrinkles (e.g. how does copyright affect this idea?) and implementation details (e.g. dude, you should totally use AJAX) are left as an exercise to the reader, but it certainly feels like the newspaper industry is going to be decentralized. The sooner the big papers (read: NYTimes, LATimes, WSJ, etc.) get out in front of the decentralization, the better.
Throw Away Line For Those Writing Hand-Wringing Articles About The Death Of The Newspaper Industry
Won’t someone write an article expressing deep concern about the health of the printing press industry?
Since we run a membership-fee-based service, we’ve focused on providing security for our inside-the-walled-garden URLs. It’s been important to protect the content generated by our members for our members from being scooped out by unscrupulous operators. That said, we’re probably getting to the point at which we should pretty up our URLs, if only to enhance the member experience and the ease with which members can find their content.
In searching around for guidance on how to go about switching to pretty URLs, I’ve found a few resources helpful.
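One common piece of pretty-URL work is turning a page title into a URL slug. A minimal sketch in Haskell (illustrative only, not tied to any particular framework or to our actual implementation):

```haskell
import Data.Char (isAlphaNum, toLower)
import Data.List (intercalate)

-- Lowercase alphanumerics, turn everything else into separators,
-- and join the remaining words with hyphens.
slugify :: String -> String
slugify = intercalate "-" . words . map clean
  where clean c = if isAlphaNum c then toLower c else ' '
```

So a member-generated title like "My Kitchen Remodel: A Review!" becomes the URL-friendly "my-kitchen-remodel-a-review".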
You’ve probably heard of Six Sigma, the quality management philosophy/practice started at Motorola. It’s a great way to identify problem areas in product design, development, and production, and it sets a goal of 99.9997% efficiency across the product life cycle.
We have a fairly traditional SDLC process for our web site development and I’ve been thinking about what the “right” process is for our development efforts. While poking around, I’ve seen a few blog posts about applying six sigma to web marketing or software development.
Got me thinking: should 6 sigma be applied to web software development? Maybe 4 sigma would be better?
Really, do I want NetFlix to provide me with 99.9997% quality in their website? Well, they’d better handle credit cards with 99.9997% accuracy, but I would prefer that they provide very good quality on their website and focus on rolling out new and improved features. I wouldn’t be too irked if their recommendation engine got better and, in return, I saw an occasional visual flaw or 404. Considering that most web flaws are quickly spotted and easily fixed, it makes some sense to “release early, release often”: don’t overinvest in quality when fixing the issue costs little in terms of time, money or reputation, especially when the benefit of additional features to the user is great.
What about Microsoft Windows? Unlike a web application, I have to install this bit of software and, unlike a web application, it’s not easily updated, so a much higher quality bar makes sense there.
So I’ll vote for Four Sigma quality levels on non-critical areas of consumer websites. 99.379% quality in features is pretty good and I won’t lose sleep over the 0.621% of features that have some issues… But we’ll commit to fix them ASAP.
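The arithmetic behind those sigma percentages is just a yield-to-defect-rate conversion. A quick sketch in Haskell (back-of-the-envelope only):

```haskell
-- Convert a yield percentage into defects per million opportunities
-- (DPMO), the unit Six Sigma practitioners usually quote.
dpmo :: Double -> Double
dpmo yieldPct = (100 - yieldPct) / 100 * 1e6

-- Six Sigma's 99.9997% yield works out to about 3 defects per million;
-- Four Sigma's 99.379% yield works out to about 6,210 per million.
```

So the Four Sigma bar I’m voting for tolerates roughly two thousand times as many flaws as Six Sigma, which sounds about right for non-critical website features.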
Since I’ve used Ruby on Rails quite a bit in the past and since I’m now using ASP.NET, I often find myself doing comparisons between the two frameworks. Recently, I found myself comparing Rails’ controllers/views and ASP.NET’s controls. The following is an example of where ASP.NET’s declarative instantiation/configuration of controls worked well. I’m not sure how I could do the same in Rails.
Ideally, I’d like to generate that code by placing two controls in my markup:
<UserControls:GoogleAdManager Slot="slot1" runat="server" />
<... lots of HTML ... >
<UserControls:GoogleAdManager Slot="slot2" runat="server" />
Then, on PageLoad, each control adds its slot code.
On PreRender, each control adds code to close off the previous script tag and to fetch the ads.
Finally, each control renders its code into the div.
Gotta dip down into C# a bit for the ASP.NET, but the pay-off is a simple declarative configuration of Google’s Ad Manager. I didn’t implement the functionality in Rails, but I suspect that it would be rather more complicated… ASP.NET wins this round… but I still miss Rails something fierce…
Note: this may have been what Rails’ components did. But they got canned.
Great post about the benefits of using AJAX to shrink a conversion funnel here.
Recently, we were working on a workflow within our site and spent a bunch of time taking a 6-page funnel down to a pretty AJAXed 3-page funnel. Got it all tested, put it in production, and… no effect on fallout in the funnel… But, wait, everyone knows that “making a workflow short and snappy will decrease fallout”, right?
The explanation we came up with for the lack of improvement was that we’d neglected to consider our users’ motivations & incentives when thinking about our funnel.
The funnel we were working on was a Submit A Review type funnel. We’re unusual in that we collect a lot of data from members in that funnel. We get big, comprehensive reviews from members. It takes 2-3 minutes to fill out our review form. Which is awesome for our other members because, when they’re trying to figure out with whom to spend $75,000 adding a room to their house, they get serious data on the possible contractors.
That said, it’s kinda hard on the member who’s submitting the data, and the member knows that it’s going to be hard, so they only enter the funnel if they’re serious about completing it. The hard work we did on simplifying the funnel made things easier on the highly-motivated members who were going to submit reviews anyway. Great for the member; not so great for us (ROI ~= 0%).
Why are these members so highly motivated? Our guess is because their incentive to post a review is altruism. They had a great experience with a service provider or a horrible experience with a service provider and they want to let the other members know. Posting a review gets them a bit of personal satisfaction.
Naturally, the number of members who are driven by altruism is pretty low. So if we want to get review submissions up, we need to add other incentives. The incentive hierarchy probably looks something like: altruism … reminder/call-to-action … community/reputation … chance-to-win-a-prize … $$$. As we use other incentives to drive review submissions, the shorter, snappier funnel will help keep fallout low. So our work wasn’t for nought… Just maybe not for much right now…
Helpr, Inc. is joining up with Angie’s List to bring the exciting and rapid development that we’ve conducted within the Facebook ecosystem to a large, very-well-subscribed site. Why?
- Angie’s List is a very large and growing web property, and it’s looking for a great team to join it and help push its web presence forward.
- Angie’s List already has a great development team in house, so we can focus our efforts entirely on driving the web presence forward.
- Everyone’s got app fatigue and justifiably so.
- The social local search sector on Facebook just isn’t growing. Check out stats on Loladex and GigPark for evidence. Excellent apps all lost in the app cacophony.
So hurry up! Join Angie’s List and give us feedback on how you’d like to see the site improved!