The stuff of code-cracking legend: the San Jose semaphore

As a Boy Scout I was proud of my ability to signal Morse code faster than I could type. But nothing prepared me for the excitement I felt reading about this example of code-cracking geekery.

This stuff so excites the code itch in me.

It's an article I first saw on the Wired blog, called "Sleuths Break Adobe's San Jose Puzzle, Find Pynchon Inside".
The associated PDF runs to 18 pages and makes a great read. Check it out and learn how two brilliant individuals cracked what the spinning orange disks, conceived by artist Ben Rubin, were transmitting from atop the Adobe building in San Jose.

An eye for an IE: bioscreencast.com, Internet Explorer, and the JavaScript jungle

Well, bioscreencast.com now supports Internet Explorer. You can read all about this and the other beta 0.2 enhancements on our blog.

Since one of the stated aims of this blog is my desire to learn JavaScript, this post talks about JavaScript standards and the Google Web Toolkit.

On Bioscreencast, Suresh, our lone web ranger, used the amazing YUI library, his web design skills, and tech wizardry to build the site's first beta to play well on Firefox and Safari. Both of these browsers hew closer to the ECMAScript standard than Internet Explorer (IE) does. Consequently, our site initially worked only on Firefox and Safari.

Like most JavaScript-centric UIs, the site had to face up to the real problem of JavaScript development: browser personalities. JavaScript is famous for behaving somewhat differently depending on whether you are using Internet Explorer or something like Firefox or Safari. The ECMA standard was a move to get everyone to agree on what "JavaScript" is. Despite the existence of this standard, it's interesting that even two standards-compliant browsers don't necessarily treat your website code exactly the same way, which makes navigating the JavaScript jungle a crazy proposition.
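A classic instance of these browser personalities is attaching an event handler: older IE versions want their proprietary attachEvent with an "on"-prefixed event name, while Firefox and Safari use the standard addEventListener. Here's a minimal shim of the kind people end up writing (a generic sketch, not Bioscreencast's actual code):

```javascript
// Cross-browser event attachment: pick whichever API the browser offers.
function addEvent(el, type, handler) {
  if (el.addEventListener) {
    // Standards browsers (Firefox, Safari): DOM Level 2 API
    el.addEventListener(type, handler, false);
  } else if (el.attachEvent) {
    // IE 5-8: proprietary API; note the "on" prefix on the event name
    el.attachEvent('on' + type, handler);
  } else {
    // Last resort: old-school DOM Level 0 property assignment
    el['on' + type] = handler;
  }
}
```

Multiply this by every DOM quirk a rich UI touches and you can see why "a few minor tweaks" for IE support is the optimistic case.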

This diversity provided the justification for things like the Google Web Toolkit. How GWT works is simple: you code in Java, the toolkit compiles your program ahead of time into several browser-specific JavaScript "permutations", and when the browser loads the page a small bootstrap script picks the right one. So IE gets IE-centric JavaScript, Firefox gets JavaScript that suits its palate, and so on. For people who want the "AJAXy" sexiness that goes with JavaScript, you can stick to Java and still harness JavaScript's dynamic characteristics.
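A toy sketch of the permutation idea (the file names and the user-agent sniffing here are my own invention, not GWT's actual bootstrap code):

```javascript
// Pick a browser-specific compiled script based on the user-agent
// string, so each browser downloads only its own JavaScript dialect.
function pickPermutation(userAgent) {
  var ua = userAgent.toLowerCase();
  if (ua.indexOf('msie') !== -1)    return 'app.ie.js';
  if (ua.indexOf('firefox') !== -1) return 'app.gecko.js';
  if (ua.indexOf('safari') !== -1)  return 'app.safari.js';
  return 'app.generic.js';          // fallback permutation
}
```

The payoff: none of the browser-sniffing branches from the shim above ever ship to the user; each browser gets code pre-specialized for it.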

So, to summarize: coding in JavaScript means learning to deal with its many dialects, or throwing all that out and adopting the Java-based Google Web Toolkit.

In any case, it turns out that with a few very minor tweaks to the JavaScript, bioscreencast.com now plays very well with all browsers, especially IE 6 and IE 7.

Based on the feedback we have received and on our analytics, it is amazing how many people are still using Internet Explorer 5 and 6, and it's good to know that our website now welcomes most browsers.

Put your brain to use: Galaxy Zoo

I caught this on Nature's Nascent blog. Like reCAPTCHA, which I blogged about before, this project uses the human brain, in this case to classify galaxies. You know the types: spiral, elliptical, merging, and so on.

The way it works is fun: you sign up, go through the tutorial, and take the test. If you get 8 out of 15 correct, you can start classifying galaxies. No worries if you don't pass the test; you can just take it again until you earn your stars.
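The gate itself is simple enough to sketch (the 8-out-of-15 cutoff is from the signup flow as described above; the function and variable names are my own invention):

```javascript
// Score a set of trial classifications against the answer key;
// you get to classify real galaxies once you hit 8 of 15.
function passesTutorialTest(answers, correctAnswers) {
  var right = 0;
  for (var i = 0; i < correctAnswers.length; i++) {
    if (answers[i] === correctAnswers[i]) right++;
  }
  return right >= 8; // Galaxy Zoo's stated cutoff
}
```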


A few things about the project capture my interest.

As far as the code itch goes, it's amazing how much better the human brain is at recognizing patterns like the ones in the spiral galaxy above and telling it apart from an elliptical one. I know image-processing algorithms are getting better by the day (I had my beginnings in structural biology, doing 3D reconstruction of viruses from projection images and some single-particle reconstruction), but it seems the human brain still takes the prize.

The second amazing thing: in just two days, Galaxy Zoo classified a million images through community participation, to the point that their servers are struggling to meet the load.
A big hurrah for public participation and open science indeed.

Powered by ScribeFire.

Image link from the Sloan Digital Sky Survey

Out of the cradle and into a beta: Bioscreencast.com

I am very excited to announce the coming to fruition of a project that five of us have been working on for the last few months. It's a site built around screencasts, called Bioscreencast.com. You can read more about it on our Bioscreencast blog post and at my Omics World blog.

The entire site was coded into life by one person, our head web geek and javascript junkie, Suresh.

As a wannabe coder, I came away amazed at the sheer power of the many open source libraries out there, the robustness of MySQL databases, the sheer elegance of CSS, the Swiss-army knife that is ffmpeg, the clumsiness of PHP; the list goes on. What made the whole process doubly enjoyable was that all five of us are relative web newbies, and learning how these things work along the way was a lot of fun.

Watching Suresh work his web magic has made me want to learn more of the six technologies I want to master, and I have also added a few more to the list (more on this in future posts).

Just thought I'd give my plug for Bioscreencast.com. I hope the life scientists out there like the site, and that all of you will keep your feedback coming.

Links:

The Bioscreencast website

The Bioscreencast Blog

Our co-conspirator Deepak's intro post


reCAPTCHA: help the digital library project one word at a time

Adam Weiss, on the Boston Museum of Science podcast, recently interviewed Luis von Ahn, one of the people behind CAPTCHAs, those squiggly lines and puzzles you solve every time you sign up for a website or post a comment on a blog to prove that you are a human.
They are designed so that computers cannot easily decipher the message, whereas the human brain can.

Yesterday at the Berkman Thursday blog group meeting, Adam spoke about how Luis von Ahn and others have a new take on CAPTCHAs called reCAPTCHA. Apparently, the time spent solving these puzzles amounts to about 150,000 hours daily. So von Ahn and his group at Carnegie Mellon figured a good way to put all this time to use was to help the digital library project. What reCAPTCHA does is take the words that optical character recognition (the computer algorithms that convert an image to text) fails to recognize in the digital library project and use them to authenticate users.

So now you are presented with two words: one that reCAPTCHA knows the answer for, and another drawn from the pool of unrecognized words. Thanks to your being human, you solve both words correctly and contribute one more word to the digital library project's book-digitization effort.
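The logic, as I understand it, looks something like this (my own simplification, not reCAPTCHA's real implementation):

```javascript
// Two-word reCAPTCHA check: the "control" word has a known answer and
// verifies the user; the unknown word's answer is banked as a vote on
// how the OCR-defeating word should be transcribed.
function checkRecaptcha(controlWord, controlAnswer, unknownAnswer, votes) {
  if (controlAnswer !== controlWord) {
    return false; // failed the known word, so not verified as human
  }
  votes.push(unknownAnswer); // record this human's reading of the unknown word
  return true;
}
```

Collect enough agreeing votes on an unknown word and it graduates into digitized text.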

Refs:
Adam's podcast interview
CAPTCHA on Wikipedia
About reCAPTCHA
Adam Weiss helps you get podcasting

The above image is a link to the reCAPTCHA website


Get podcasting with Adam Weiss’s help

Yesterday at the Berkman Thursday blog group meeting, Adam Weiss from Boston's Museum of Science gave a talk about "all the cool and amazing things he does" and his experiences with podcasting, blogging, and other things web and digital.

Adam hosts and produces the Boston Museum of Science podcast as well as his own Boston Behind the Scenes podcast. A veteran podcaster (his science podcast is almost 100 episodes old), he also serves as a podcast consultant. Adam brought along all his podcasting gear and played us some samples from his website.

The session was a big eye-opener. I was amazed at how simple his equipment was, and impressed at the very professional results it produced. I was also gladdened by his evangelical zeal and desire to make podcasting more accessible. I recommend his podcast consulting site to anyone looking to get started with podcasting.

The image above is a link from Adam Weiss's equipment guide to podcasting and features the iRiver iFP-799 equipped with a $15 Giant Squid Audio Lab mini gold-plated omni mic, which is what he uses for most of his interviews.

Photosynth, Seadragon, and single-particle imaging

As a graduate student I worked on a project where I used single-particle imaging techniques to determine the structure of a small viral protein. The protein particle fortunately has some symmetry, and using single-particle reconstruction techniques I could obtain a three-dimensional model of the particle from two-dimensional projection images taken on an electron microscope.

After Deepak got me hooked on the TED talks, I caught one by Blaise Agüera y Arcas on Microsoft's new application, Photosynth.

In the talk, Blaise describes how they were able to put together a very high resolution, almost three-dimensional composite of Notre Dame Cathedral, assembled from tagged images on Flickr.

Their software was able to accurately find the register for thousands of images from this tagged set and assemble them into the final composite. Check out the video above to get an appreciation of the application's complexity. While I am hardly an expert in image processing, its algorithmic complexity boggles my mind. Particularly impressive are the sections of the video where he shows Photosynth registering images within the assembled composite despite people, hands, and other obstacles obscuring the view of the cathedral.

I also caught some of the discussion of the technology on Microsoft's Channel 9. I sure would like to know the concepts they used to put together such an amazing app, and I wonder whether any of them could help improve the image-reconstruction techniques used in single-particle bio-imaging.

Ignite Boston: loved the format

Just got back from the first Ignite Boston event, organized by O'Reilly. Met a lot of really cool people and heard a few good talks.

It was just strange that quite a few of the talks ended up as blatant sales pitches for new companies, or as sales pitches only thinly disguised as something else. Regardless...

The catchiest demo was Buzzword, a fully WYSIWYG, Flash-based word-processing app that runs inside the browser, from the Boston-based startup Virtual Ubiquity. Rick Treitman, the CEO, demoed the app, which is still pre-beta. The user experience it promised was truly breathtaking. Buzzword makes Google Docs look like your grandpa's word processor.

Matt Welsh gave a cool talk on wiring Cambridge with 100 sensors connected by an 802.11 wireless grid. The NSF- and Microsoft-funded CitySense project would open all these sensors to the public, make them programmable, and make their data available to everyone. Along these lines, Brian Jepson gave a good talk about blanketing Rhode Island with WiFi and the cool things you could do with the Make magazine electronics kit and the RI-WINs network.

The talk by Matt Douglas was about marketing and how to publicize a new web or computer startup. Woven into his talk was a plug for Punchbowl.com, which touts itself as a replacement for Evite.

Though I am hardly a gamer, the talk by Jason McIntosh from Volity was interesting for its approach to bringing open source components into the online gaming world.

Among the general-interest, non-sales-pitch talks, Rod Begbie gave a fun talk about good presentations, and Chris Brogan's talk on using social networking to actually network effectively was very nicely done.

Greg Raiz from Raizlabs and the photo-sharing site PicMe also gave a very nice talk about his website's approach to data organization. His theory was that perfect organization is not always what we want (or should want).

The event was held in a crowded room where it was almost impossible to hear what was happening on stage if you were in the back.

On the whole the event was a lot of fun, and I cannot wait for Ignite Boston 2.

JavaFX examples?

This probably should just be a Tumblr post, but I heard all the buzz about JavaFX Script on the Java Posse podcast.

All my googling skills failed to turn up a good page of examples showing how awesome it can be. Meanwhile, I just came across Digg Charts (yes, I am very much behind in my browsing), and the fact that those elegant metrics were all executed in Flash makes me wonder whether I should substitute, or add, Flex in my list of code groups to get around to learning.

Meanwhile, I only wish someone could send me a link to metrics graphs done in Java that look as classy as the Digg charts.

A screencast-centric Java and Eclipse blog

Trolling through the Google search results for JavaFX, the new "Flash competitor" from Sun, I came across this cool screencast-centric code blog.

It's called thescreencast.com. The blog uses the Wink tool to create audio-less Flash screencasts that display at very good resolution in the browser. The topics covered are Java- and Eclipse-centric.

Though I am a NetBeans convert, I like the blog's overall style and the content it features. I have been playing around with Wink and really like its feature set. Maybe codeitch will feature some Wink-based screencasts while I attempt to converge on a platform for my screencast-centric code content. The only problem is that since I don't use my own hosting provider, it's hard to embed screencasts the way thescreencast.com does. But there just might be a way to have my cake and eat it too.