I have just made my first two mxj objects for Max, using Twitter4j, a Java library for the Twitter API. You can filter the live Twitter stream by keywords, users or locations. I made this for a sonification project I am currently working on. Besides the live stream, you can also search Twitter data, view timelines, direct message friends and do some other cool stuff. This is my first time using Java (apart from some things made in Processing), so most of the code is a slightly modified version of the Twitter4j examples. The mxj objects were made in Eclipse; you can find a short tutorial on how to set up Eclipse for developing mxj objects here, and check WritingMaxExternalsInJava.pdf, located in the Max application folder under java-doc. The idea of combining Twitter stream location data with a world map came from this project. If you want to use this code, you have to get a Twitter consumer key/secret and access token/secret and put them in the Java code where the ConfigurationBuilder is set up, highlighted in the code below. Maybe I will build this into the code in the future to streamline the process, but for now you can find information on how to obtain these keys on the Twitter developer forum.
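For reference, here is a minimal sketch of what the stream setup looks like, closely following the Twitter4j examples. The credential strings are placeholders you must replace with your own keys, and the tracked keyword is just an illustration; the actual mxj object sends the Tweets out of an outlet instead of printing them:

```java
import twitter4j.FilterQuery;
import twitter4j.Status;
import twitter4j.StatusAdapter;
import twitter4j.TwitterStream;
import twitter4j.TwitterStreamFactory;
import twitter4j.conf.ConfigurationBuilder;

public class StreamSketch {
    public static void main(String[] args) {
        // Fill in your own consumer key/secret and access token/secret here
        ConfigurationBuilder cb = new ConfigurationBuilder();
        cb.setOAuthConsumerKey("YOUR_CONSUMER_KEY")
          .setOAuthConsumerSecret("YOUR_CONSUMER_SECRET")
          .setOAuthAccessToken("YOUR_ACCESS_TOKEN")
          .setOAuthAccessTokenSecret("YOUR_ACCESS_TOKEN_SECRET");

        TwitterStream stream = new TwitterStreamFactory(cb.build()).getInstance();
        stream.addListener(new StatusAdapter() {
            @Override
            public void onStatus(Status status) {
                // In the mxj object, this is where each Tweet leaves an outlet
                System.out.println(status.getUser().getScreenName() + ": " + status.getText());
            }
        });
        // Filter the live stream by keyword; filter() also accepts locations and user ids
        stream.filter(new FilterQuery().track("max/msp"));
    }
}
```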
Here is a short video:
[Update 26-09-2013]:
Added sentiment analysis with LingPipe, a toolkit for processing text using computational linguistics. Tweets are analyzed and categorized, and the dot on the world map representing the location of the Tweet gets a corresponding color (positive = blue / neutral = green / negative = red). The numbers of positive, neutral and negative Tweets are counted, so you can see the overall sentiment on topics or areas. (This only works on Tweets written in English.)
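The coloring and counting step itself is simple; here is a self-contained sketch in plain Java. The category labels and RGB values are illustrative stand-ins, not the exact ones from my patch:

```java
import java.util.HashMap;
import java.util.Map;

public class SentimentColor {
    // Map a sentiment category label to an RGB color for the map dot
    static int[] colorFor(String category) {
        switch (category) {
            case "positive": return new int[] {0, 0, 255};   // blue
            case "negative": return new int[] {255, 0, 0};   // red
            default:         return new int[] {0, 255, 0};   // neutral -> green
        }
    }

    public static void main(String[] args) {
        // Count pos/neu/neg Tweets to get an overall sentiment
        Map<String, Integer> counts = new HashMap<>();
        String[] labels = {"positive", "neutral", "positive", "negative"};
        for (String label : labels) {
            counts.merge(label, 1, Integer::sum);
        }
        System.out.println(counts.get("positive")); // 2
        int[] c = colorFor("positive");
        System.out.println(c[0] + "," + c[1] + "," + c[2]); // 0,0,255
    }
}
```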
Added a daylight layer: a world map showing day/night across the globe and cloud coverage. You can change the opacity of the normal (greyscale) map and the opacity of the daylight map.
Added search by location name rather than by bounding box coordinates. Location names are converted to bounding box coordinates using http://isithackday.com/geoplanet-explorer/. For this to work I created an extra mxj object that imports a website's source into Max.
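As a rough illustration of what this conversion chain does, here is a self-contained sketch: the comma-separated bounding-box format (south, west, north, east) and the equirectangular pixel mapping for placing dots on the world map are my assumptions for illustration, not the exact code from the mxj objects:

```java
public class GeoToMap {
    // Parse "south, west, north, east" into doubles; this input format is an
    // assumption -- the actual geoplanet-explorer output may look different
    static double[] parseBBox(String s) {
        String[] parts = s.split(",");
        double[] box = new double[4];
        for (int i = 0; i < 4; i++) {
            box[i] = Double.parseDouble(parts[i].trim());
        }
        return box;
    }

    // Equirectangular projection: lat/lon -> pixel on a w x h world map
    static int[] toPixel(double lat, double lon, int w, int h) {
        int x = (int) Math.round((lon + 180.0) / 360.0 * w);
        int y = (int) Math.round((90.0 - lat) / 180.0 * h);
        return new int[] {x, y};
    }

    public static void main(String[] args) {
        // Hypothetical bounding box roughly around Amsterdam
        double[] box = parseBBox("52.27, 4.72, 52.43, 5.02");
        System.out.println(box[0] + " .. " + box[2]); // 52.27 .. 52.43
        int[] p = toPixel(52.37, 4.9, 720, 360);
        System.out.println(p[0] + "," + p[1]); // 370,75
    }
}
```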
The Media Technology MSc programme is hosting an exhibition called Transition. There will be seven different installations, each with its own theme. Together with Guido Huijser and Rick Henneveld, I composed a soundwalk on the theme of Separation. The soundwalk is made specifically for the Raamsteeg 2 building in Leiden. The title of our work is also Separation.
Opening Lecture: Thursday February 7: 16.00
Opening Exhibition: Thursday February 7: 17.00
Exhibition: February 8, 9, 10 and 14, 15, 16: 14.00 – 20.00
A device I built together with Guido Huijser for the course Sound, Space & Interaction at Media Technology. The device uses an Arduino Duemilanove, a Parallax 28440 RFID reader/writer, Parallax RFID tags, a Korg Nano Kontrol and a MacBook running Max 6.
The idea is based on the old kijkdoos, a Dutch sort of shoebox diorama.
The user can arrange his/her own sound collage by placing RFID tag cards onto the glass plate; each card corresponds to a specific field recording. The user can adjust each sound's volume, stereo placement, playback rate and playback direction. Eight sounds can be played simultaneously. The user can also record his/her own soundscape, which is then uploaded to a server; by entering an email address, the user receives a download link to their sound file.
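The per-sound controls described above can be sketched as a small voice object; the parameter ranges and the negative-rate convention for reverse playback are my assumptions for illustration, not taken from the actual Max patch:

```java
public class Voice {
    // Per-sound parameters the user can adjust via the Korg Nano Kontrol;
    // the ranges here are assumptions, not the patch's exact values
    double volume = 1.0; // 0..1
    double pan = 0.0;    // -1 = left, +1 = right
    double rate = 1.0;   // playback rate; the sign flips playback direction

    static double clamp(double v, double lo, double hi) {
        return Math.max(lo, Math.min(hi, v));
    }

    void setVolume(double v) { volume = clamp(v, 0.0, 1.0); }
    void setPan(double p)    { pan = clamp(p, -1.0, 1.0); }
    void reverse()           { rate = -rate; }

    public static void main(String[] args) {
        Voice v = new Voice();
        v.setVolume(1.5);   // out-of-range values are clamped
        v.reverse();        // now playing backwards
        System.out.println(v.volume + " " + v.rate); // 1.0 -1.0
    }
}
```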
We recorded most of the sounds with Soundman OKM II Studio Klassik Solo binaural microphones, and some with the built-in stereo microphones of the Zoom H4 portable recorder. The device has two headphone outputs so people can enjoy creating soundscapes together.
Here is a little sound sample: