Archive for January, 2008

In my day job I work on systems for remote monitoring and control of devices, and on interfaces to and analysis of device data. When starting a project there’s a great deal of flexibility in what language to use, what front-end tools and architecture, and so on, which gets you into the mindset where every new technology is potentially useful in the M2M world. So when I heard that LSL scripting in Second Life not only lets you make HTTP requests and receive useful data, but that SL can also show streaming video, what else could I do but use SL as a control interface for something in real life?

So what’s going on here? There are several pieces at work:

  • A script running in Second Life which can make HTTP requests to a web server
  • A live video stream into Second Life from a webcam pointed at the Nabaztag
  • A web server receiving requests and communicating with the Nabaztag to activate it

Scripts in SL are written in Linden Scripting Language, or LSL. They are attached to objects (e.g. doors, spheres, clothing) and are state-based; that is, the bulk of the scripting involves reacting to changes in the object’s state and/or changing the state yourself (it’s very FSM-ish). The code to remotely control the Nabaztag was pretty simple, so I’ll include it here:

key requestid;
string SOURCE_URL = "http://XX.XX.XX.XX:7080/control.php?command=";
string CMD_ON = "on";
string CMD_OFF = "off";
integer isOn;

// Local debug output reflecting the light's new state
change_visual(integer on)
{
    llOwnerSay("Now showing light currently on? " + (string)on);
}

// Build the control URL and make the HTTP request to the web server
control_light(integer on)
{
    string url = SOURCE_URL;
    if (on) {
        url = url + CMD_ON;
    }
    else {
        url = url + CMD_OFF;
    }
    requestid = llHTTPRequest(url, [HTTP_METHOD, "GET"], "");
}

default
{
    state_entry()
    {
        llOwnerSay("Bunny toggle script restarted, turning light off.");
        control_light(FALSE);
        isOn = FALSE;
    }

    touch_start(integer total_number)
    {
        llOwnerSay("light currently on? " + (string)isOn);
        integer newVal = isOn ^ 1;
        llOwnerSay("Setting light to " + (string)(newVal));
        change_visual(newVal);
        llOwnerSay("Calling web service for control");
        control_light(newVal);
        llOwnerSay("Request made, id = " + requestid);
        isOn = newVal;
    }

    http_response(key request_id, integer status, list metadata, string body)
    {
        if (request_id != requestid)
        {
            return; // a response to some other request; ignore it
        }
        if (status == 200)
        {
            llOwnerSay("Platform Response: " + body);
        }
        else
        {
            llOwnerSay("HTTP error, status " + (string)status);
        }
    }
}

Pretty readable, IMHO. First come some variable declarations, then a couple of utility functions: one provides local output when the light changes status (i.e. the bunny goes on or off), and the other makes the HTTP request itself. Then we define our first (and only) state, the ‘default’ state. We enter it automatically when the script is loaded, and from there we respond to two events — either we are activated (‘touched’, in SL parlance), which results in a web call, or we receive the result of a previous web call in the http_response handler (shown here just for debugging purposes, but you could certainly act on it). All in all, pretty short and not too complex, especially considering that what we’re really doing is taking user (avatar) input and controlling a remote device via a web service.

The live video stream wasn’t too hard either. For streaming video you generally need two applications: a broadcaster, which takes the video from the webcam and encodes it, and a streaming server, which sends the encoded stream out to all the requesters. I’d never streamed video before, but with the aid of a tool called Wirecast, which includes both a broadcaster and a small streaming server, it wasn’t too hard to get set up streaming video in QuickTime format (required by Second Life, as discussed here).

Then, in Second Life, open the Media tab of the About Land window for your land (you must own land in SL to stream video to it). There you provide the URL for the video stream and pick a texture to be your ‘Media Texture’. Any object displaying this texture will show the video stream wherever the texture appears — for this reason, it’s probably a good idea to pick a texture you haven’t already used all over your land. You can then play the video from within SL!

As for the web server, a friend of mine wrote that so I won’t go into much detail there, but suffice it to say it was a pretty standard web app implementation, accepting on/off commands through a RESTful URL.
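For illustration, here’s a minimal Python sketch of that kind of endpoint — not the actual server, which I don’t have. Everything device-specific is a placeholder: the Nabaztag API URL, the serial number, the token, and the action codes are invented values, and `build_device_request` is a hypothetical helper.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

NABAZTAG_API = "http://api.nabaztag.com/vl/FR/api.jsp"  # placeholder endpoint
SN, TOKEN = "XXXXXXXXXXXX", "0000000000"                # placeholder credentials

def build_device_request(command):
    """Map an 'on'/'off' command to a device API URL (None if unrecognized)."""
    if command not in ("on", "off"):
        return None
    # Hypothetical mapping of on/off to device action codes
    action = "13" if command == "on" else "14"
    return "%s?sn=%s&token=%s&action=%s" % (NABAZTAG_API, SN, TOKEN, action)

class ControlHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Parse e.g. /control.php?command=on
        command = parse_qs(urlparse(self.path).query).get("command", [""])[0]
        url = build_device_request(command)
        if url is None:
            self.send_response(400)
            self.end_headers()
            self.wfile.write(b"unknown command")
            return
        # The real server would call the device API here,
        # e.g. urllib.request.urlopen(url)
        self.send_response(200)
        self.end_headers()
        self.wfile.write(("forwarded: " + command).encode())

# To run: HTTPServer(("", 7080), ControlHandler).serve_forever()
```

The nice property of this shape is that the SL script needs to know nothing about the rabbit itself — it just hits one URL with `command=on` or `command=off`.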

All in all, it was a lot of fun to play around with not only LSL but also live video streaming, and it’s always nice to get some work in on the client side, where your changes (and bugs) are immediately visible.

A higher-level description of what I got out of the experience is posted on my company’s blog.


Texas governor’s ballot

As if poorly-implemented electronic voting machines weren’t enough, Mother Jones has a good interview with William Poundstone about the problems inherent in the ‘vote-for-one-candidate’ style of voting, aka plurality voting, which has been the implementation of our democracy for as long as it’s existed. It’s not totally broken, of course — it works great when you have only two candidates, or multiple candidates whose supporters don’t overlap. But anyone who remembers the last few presidential elections can see how a single third-party candidate can effectively ‘steal’ votes away from a frontrunner (like Nader in 2000). The ‘best’ system, according to Poundstone, is basically the ‘Hot-or-Not’ system: rating each candidate from 1 to 10. Almost as good is the approval system, where you have a box to check for each candidate saying whether you would ‘approve’ of them as president. Both of these seem better to me than the instant-runoff system I had heard was being considered, in which you rank your choices; in each round the candidate with the fewest votes is eliminated and their votes are given to each of those voters’ next-best choice, until some candidate has a majority.
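To make the instant-runoff procedure concrete, here’s a short Python sketch of it; the ballots are an invented example echoing the Nader scenario, not real vote totals.

```python
def instant_runoff(ballots):
    """Each ballot is a list of candidates in preference order.
    Repeatedly eliminate the candidate with the fewest first-choice
    votes, transferring those ballots to their next surviving choice,
    until someone has a majority (ties broken arbitrarily)."""
    candidates = {c for ballot in ballots for c in ballot}
    while True:
        tallies = {c: 0 for c in candidates}
        for ballot in ballots:
            for choice in ballot:
                if choice in candidates:  # first surviving choice
                    tallies[choice] += 1
                    break
        total = sum(tallies.values())
        leader = max(tallies, key=tallies.get)
        if tallies[leader] * 2 > total or len(candidates) == 1:
            return leader
        candidates.discard(min(tallies, key=tallies.get))

# A Nader-2000-style example: Nader voters prefer Gore second.
# Plurality would elect Bush (48 first-choice votes); instant runoff
# eliminates Nader and transfers those ballots, electing Gore.
ballots = [["Bush"]] * 48 + [["Gore", "Bush"]] * 46 + [["Nader", "Gore"]] * 6
```

The same ballots under plurality and under instant runoff produce different winners, which is exactly the spoiler effect the interview is talking about.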

Photo by Cosmic Jans 
