Feb 25

This post explains how to post multipart form-data to a URI using Apache Wink. For the sake of this tutorial, I will demonstrate how to upload an image to Facebook using the Graph API. Note: access_token handling is not covered here.

The code snippet below presents how to create the request entity for multipart/form-data posting:

File file = new File("/path/file.jpg");
String fileName = file.getName();
byte[] bytes = Files.readAllBytes(file.toPath()); // read the image contents into memory (java.nio.file.Files)

BufferedOutMultiPart requestEntity = new BufferedOutMultiPart();
OutPart outPart = new OutPart();
outPart.setContentType("application/octet-stream; name=" + fileName);
outPart.setBody(bytes);
outPart.addHeader("Content-Transfer-Encoding", "binary");
outPart.addHeader("Content-Disposition", "form-data; name=\"" + fileName + "\"; filename=\"" + fileName + "\"");
requestEntity.addPart(outPart);

Using the class org.apache.wink.common.model.multipart.BufferedOutMultiPart as the request entity, one can include as many parts as needed. Each part is abstracted as an instance of org.apache.wink.common.model.multipart.OutPart, which can represent any kind of information. The code snippet below demonstrates how to add a text part:

String stringFieldName = "stringField";
String stringFieldValue = "stringValue";

outPart = new OutPart();
outPart.setContentType("text/plain; charset=us-ascii");
outPart.setBody(URLEncoder.encode(stringFieldValue, "UTF-8"));
outPart.addHeader("Content-Transfer-Encoding", "7bit");
outPart.addHeader("Content-Disposition", "form-data; name=\"" + stringFieldName + "\"");
requestEntity.addPart(outPart);
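
As an aside, the access_token mentioned in the introduction could travel the same way, as just another text part. This is only a sketch, assuming the Graph API accepts the token as a form field; yourAccessToken is a hypothetical variable holding a valid token:

OutPart tokenPart = new OutPart();
tokenPart.setContentType("text/plain; charset=us-ascii");
tokenPart.setBody(yourAccessToken); // hypothetical variable holding a valid Graph API access token
tokenPart.addHeader("Content-Disposition", "form-data; name=\"access_token\"");
requestEntity.addPart(tokenPart);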

The code snippet below posts the multipart data to the server (uploading the image to Facebook):

String string = new RestClient().resource("https://graph.facebook.com/me/photos").
contentType(MediaType.MULTIPART_FORM_DATA_TYPE).
post(String.class, requestEntity);

The following output (headers + body) will be generated by Apache Wink and sent to the server:

Accept: */*
Content-Type: multipart/form-data
User-Agent: Wink Client v1.1.2

--simple boundary
Content-Disposition: form-data; name="file.jpg"; filename="file.jpg"
Content-Transfer-Encoding: binary
Content-Type: application/octet-stream; name=file.jpg

...
--simple boundary
Content-Disposition: form-data; name="stringField"
Content-Transfer-Encoding: 7bit
Content-Type: text/plain; charset=us-ascii

stringValue
--simple boundary--

Pretty simple, huh? Let’s understand how it works…

The class org.apache.wink.common.internal.providers.multipart.OutMultiPartProvider is the provider responsible for writing the parts to the server. This class leverages the writePart and writeBody methods defined in the org.apache.wink.common.model.multipart.OutPart class. Besides that, it performs a very important step: including the boundary information in the request header, as you can see in the Content-Type header of the output above… no, wait a second, it is not there! The expected Content-Type header is:

Content-Type: multipart/form-data; boundary=simple boundary

The default boundary value Apache Wink uses is ‘simple boundary‘. When the provider executes (look at its writeTo method), it has two options: (1) use an already provided boundary value or (2) use the default one. When using the default one, it updates the request header to include the boundary information (producing the expected header displayed in the box above). However, by the time this header is set, the headers have already been sent to the server (i.e., the connection was already established and the headers written). So, as currently implemented, the default boundary value will never work!

On the other hand, when you specify the boundary manually as a request header, you guarantee that the proper header is sent to the server and that the multipart provider uses the provided boundary information. To do so, post the request using the code below instead:

String boundary = "simple boundary";
String string = restClient.resource(url).
header("Content-Type", "multipart/form-data; boundary=" + boundary).
post(String.class, requestEntity);

The multipart provider will use the boundary parameter specified in the Content-Type header. Instead of using ‘simple boundary’, please follow the recommendation of RFC 1867: “a boundary is selected that does not occur in any of the data”.
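
For example, here is a minimal sketch of generating such a boundary; the UUID-based value is just one assumption of a string unlikely to appear in the uploaded data:

String boundary = "wink-" + java.util.UUID.randomUUID(); // very unlikely to occur in the payload
String string = restClient.resource(url).
header("Content-Type", "multipart/form-data; boundary=" + boundary).
post(String.class, requestEntity);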

Conclusion: Apache Wink’s current multipart implementation is incomplete. Either providers should be allowed to change the request headers before the connection between client and server is established, or the boundary definition should be made mandatory, with an exception thrown when it is not present.

The following issue was raised to Wink’s JIRA: https://issues.apache.org/jira/browse/WINK-338

Jun 20

I have recently started working with the ScrumPM plugin for Redmine. Redmine is a flexible project management web application, while ScrumPM is a plugin for managing Scrum projects. It is a very interesting plugin, providing capabilities like a user stories backlog, sprint and task management, and burndown charts. This post is based on ScrumPM version 0.1.4.

Internally, tasks are mapped to Redmine’s issues and sprints are mapped to Redmine’s versions. However, user stories aren’t mapped to any known entity. ScrumPM stores this information in a database table called user_stories.

In the project I am working on nowadays, I have the following project hierarchy:

  • Main project
    • Subproject 1
    • Subproject 2
    • Subproject 3
      • Subproject 3.1
      • Subproject 3.2

ScrumPM provides an independent user stories backlog for each project. By default, the first user story created in project Subproject 3.2 will have id 1. In project Subproject 2, user story ids will start from 1 as well. There is yet another problem: if you have two stories (ids 1 and 2) and remove story 1, your next story will have id 2 and you will end up with duplicate user story ids.

A user story id must be unique across all related projects. So, my goal here was to provide a unique user story id generation process for all sub-projects of Main project. Since I don’t know anything about Ruby yet and didn’t have enough time to learn it, I went with a SQL approach.

To achieve my goal, my first step was understanding ScrumPM’s source code. The following code snippet contains the plugin code that generates a new user story id (file controllers/user_stories_controller.rb, lines 43-46):

@user_story = UserStory.new(params[:user_story])
@user_story.project_id = @project.id
last_us = UserStory.find(:first, :conditions => ["project_id = ?", @project.id], :order => "us_number DESC")
@user_story.us_number = last_us.nil? ? 1 : last_us.us_number + 1

As you can see, the UserStory.find call (line 45) searches for the project’s last created user story. If there are no stories, id 1 is used (line 46); otherwise, the next integer is used as the id.

My second step was looking for the master project (Main project). Redmine represents the project hierarchy in the database using a nested set model (you can find more information about this model in this MySQL article). The following SQL query searches for the master project (parameter ? should be project.id):

SELECT parent.id FROM projects AS node, projects AS parent WHERE node.lft BETWEEN parent.lft AND parent.rgt AND node.id = ? AND parent.parent_id IS NULL;

The third step was listing all related projects (Main project‘s sub-projects). The following SQL query does it (parameter ? should be Main project‘s id):

SELECT node.id FROM projects AS node, projects AS parent WHERE node.lft BETWEEN parent.lft AND parent.rgt AND parent.id = ?

Having all sub-projects listed, the next natural step was listing all user stories for the related projects and getting the next user story id. The following code snippet contains the complete solution:

@user_story.us_number = ActiveRecord::Base.connection.select_one(
  'SELECT IFNULL(MAX(us_number), 0) + 1 AS count FROM user_stories WHERE project_id IN (' +
  'SELECT node.id FROM projects AS node, projects AS parent ' +
  'WHERE node.lft BETWEEN parent.lft AND parent.rgt AND parent.id = (' +
  'SELECT parent.id FROM projects AS node, projects AS parent ' +
  'WHERE node.lft BETWEEN parent.lft AND parent.rgt AND node.id = ' + @project.id.to_s +
  ' AND parent.parent_id IS NULL) ORDER BY node.lft)')["count"]

As you can see above, my solution gets the next user story id taking into account all existing sub-project backlogs. Besides, if you remove any story, you will not end up in a duplicate id situation.

Feb 07

Continuing from the first post, I am now going to discuss TrueType fonts manipulation and explain the transparency/opacity mechanism used by SDL. Besides, I am going to present audio and event management. You should download sdl_part2 before beginning.

TrueType fonts manipulation

The SDL_ttf library provides TrueType fonts support. Proper initialization is required before usage:

SDLTTF.init();

The first step is font loading. SDL_ttf supports two font formats: TTF and FON. The included “font.ttf” file is loaded by:

SDLTrueTypeFont font = SDLTTF.openFont("font.ttf", 16);

As you can see above, the openFont method has a second parameter: the font size. You have to choose the size when loading the font, and that size cannot be changed later. If you need a different size, you will have to load the same font again, creating a new instance with the different size.
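
For instance, a minimal sketch (the 32-point size is an arbitrary example) loading the same font file twice at different sizes:

SDLTrueTypeFont smallFont = SDLTTF.openFont("font.ttf", 16); // one instance per size
SDLTrueTypeFont bigFont = SDLTTF.openFont("font.ttf", 32);   // same file, separate instance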

Now we are going to create and draw a string on the screen:

SDLColor textColor = new SDLColor(0, 0, 255);
SDLSurface textSurface = font.renderTextSolid("SDL_ttf: Hello world!", textColor);
textSurface.blitSurface( screen, new SDLRect(0, 350) );
screen.flip();

The first line defines a blue color. The second line renders the “Hello world” string into a surface. I don’t know if you remember my last post, but I will repeat it again: everything in SDL is represented as a surface. The last two lines blit the text surface onto the screen surface.
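
If the solid rendering looks too rough for your taste, SDL_ttf also offers blended (anti-aliased) rendering. A hedged sketch, assuming your sdljava build exposes renderTextBlended (the binding for TTF_RenderText_Blended):

SDLSurface blendedSurface = font.renderTextBlended("SDL_ttf: Hello world!", textColor); // assumed binding for TTF_RenderText_Blended
blendedSurface.blitSurface( screen, new SDLRect(0, 380) );
screen.flip();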

Transparency manipulation

If you run our application right now, you will notice that our sprite (the loaded image) contains a very strange purple color. Before I start talking about transparency, I will first show you why it is necessary. Not just necessary, but mandatory! The code snippet below will draw another instance of our previously loaded sprite:

spriteSurface.blitSurface( screen, new SDLRect(310, 250) );
screen.flip();

Running the application, you will see the new instance of our sprite inside the red rotated rectangle (drawn previously). Now, I don’t expect the lower surface (the rectangle) to be entirely covered by the sprite surface. I don’t even want to see that purple color anymore! However, that purple is very important: it is our key color! It tells SDL which color should be used for the transparency effect. Now, instead of blitting the sprite surface as we did above, let’s try this way:

long colorKey = SDLVideo.mapRGB( spriteSurface.getFormat(), 255, 0, 255 );
spriteSurface.setColorKey(SDLVideo.SDL_SRCCOLORKEY, colorKey);
spriteSurface.blitSurface( screen, new SDLRect(310, 250) );
screen.flip();

We need transparency because our sprites will usually contain unused areas. For instance: if you need to draw an open door, the passage will have to be transparent. Otherwise you won’t be able to see through that door!

There is a lot more to discuss about transparency and blitting techniques. However, we are going to have that conversation when I start talking about my isometric engine (next post).

Audio manipulation

The SDL_mixer library provides audio support. The following code snippet initializes the library before usage:

SDLMain.init(SDLMain.SDL_INIT_AUDIO);

After initialization, we need to set up the mixer and open the audio device:

SDLMixer.openAudio(44100, SDLMixer.AUDIO_S16, 1, 1024);
SDLMixer.volume(-1, 100);

The first line opens the audio device: 44,100 Hz, 16-bit samples, one output channel (mono; use 2 for stereo), and 1024 bytes per output sample. The second line sets the volume for all mixing channels (-1 means every channel); you can also set a distinct volume for each channel. The next code snippet loads and plays an audio effect:

MixChunk effect = SDLMixer.loadWAV("building.wav");
SDLMixer.playChannel(-1, effect, 0);

The first line loads the audio effect file. SDL_mixer supports the following audio effect file formats: WAVE, AIFF, RIFF, OGG, and VOC. The second line then plays our audio effect on the first free channel (-1 as the first parameter). The last parameter indicates how many times the audio effect should be repeated (0 means none).
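
As a hedged variation (the channel number and repeat count below are arbitrary), the same call can target a specific mixing channel and repeat the effect:

SDLMixer.playChannel(2, effect, 3); // play on channel 2, repeating the effect 3 additional times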

Music manipulation

Besides audio effects, SDL_mixer provides separate functions for music playback. Music and audio effects are played independently. You only need to initialize SDL_mixer once (already done by now). However, you have to set the music volume:

SDLMixer.volumeMusic(40);

The next step is loading and playing the music. SDL_mixer supports the following music file formats: WAVE, MOD, MIDI, OGG, and MP3. The first line of the next code snippet loads a music file, while the second line plays the music forever (-1 as the second parameter indicates an infinite loop).

MixMusic music = SDLMixer.loadMUS("Cecil_Gant-I_Wonder.mp3");
SDLMixer.playMusic(music, -1);

Event manipulation

SDL provides support for mouse, keyboard, and joystick event management. Now, instead of making our application sleep for a while before exiting, we will wait for the ESCAPE key to close the application:

for (;;) {
  SDLEvent event = SDLEvent.waitEvent();
  if (event instanceof SDLKeyboardEvent) {
    SDLKeyboardEvent keyEvent = (SDLKeyboardEvent) event;
    if ( keyEvent.getSym() == SDLKey.SDLK_ESCAPE )
      break;
  }
}

The event initialization procedure is included in the SDL video one: a single call initializes both subsystems. The code snippet above contains an endless event management loop. In this initial example, we are waiting for a keyboard event, specifically the ESC key. The waitEvent call blocks until a new event is available, and the instanceof check verifies the event type. After the break, the main thread ends and our application is closed.

Now, let’s do something far more interesting: write the pressed key code on the screen (in the top-left red box). To achieve this, we will add an else branch to our if statement, like this:

for (;;) {
  SDLEvent event = SDLEvent.waitEvent();
  if (event instanceof SDLKeyboardEvent) {
    SDLKeyboardEvent keyEvent = (SDLKeyboardEvent) event;
    if ( keyEvent.getSym() == SDLKey.SDLK_ESCAPE ) {
      break;
    } else {
      redSurface.blitSurface(screen, redRect);

      String keyString = "Key: "+ keyEvent.getSym();
      SDLColor keyColor = new SDLColor(0, 0, 0);

      SDLSurface keySurface = font.renderTextSolid(keyString, keyColor);
      keySurface.blitSurface(screen);
      screen.flip();
    }
  }
}

There are several other SDL events you can play with. I advise you to try out, at least, SDLMouseButtonEvent and SDLMouseMotionEvent on your own.
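
As a starting point, here is a hedged sketch of handling mouse clicks inside the same loop; it assumes SDLMouseButtonEvent exposes getX() and getY(), mirroring the fields of SDL's mouse button event:

if (event instanceof SDLMouseButtonEvent) {
  SDLMouseButtonEvent mouseEvent = (SDLMouseButtonEvent) event;
  // assumed accessors for the click coordinates
  System.out.println("Mouse button at (" + mouseEvent.getX() + ", " + mouseEvent.getY() + ")");
}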

So far

So far I have given you a broad overview of SDL. You are now able to develop very interesting applications with it.

Next post

In the next post I will talk about my isometric engine. It offers several layers of abstraction on top of SDL, letting you manipulate Java objects (like buildings and people) in a very high-level manner. I will present the engine’s architecture and discuss SDL’s weaknesses and how I addressed them.

Feb 04

Simple DirectMedia Layer (SDL) is a cross-platform multimedia library designed to provide low level access to audio, keyboard, mouse, joystick, 3D hardware via OpenGL, and 2D video framebuffer.

The library is very powerful; however, it contains only low-level functions. Several libraries were created to extend SDL’s features. The main extensions are:

  • SDL_gfx: graphics primitive toolkit and rotozoom
  • SDL_image: an image file loading library
  • SDL_ttf: a TrueType fonts toolkit
  • SDL_mixer: a multichannel sample and music mixer

There are several other libraries providing abstractions and extensions for SDL, but I believe the libraries above are the basis for any development using SDL.

Since SDL is written in C, we need a JNI interface for Java usage. Ivan Ganza created sdljava, which is a binding to the SDL API. It provides the ability to write games and other applications from the Java programming language. It is designed to be fast, efficient and easy to use.

But why Java?

SDL is a library, not an engine. There are no abstractions or special features like event handling, thread support, or collision detection… SDL only offers low-level access to multimedia resources. In fact, SDL is not thread safe at all!

Java3D application development is complex and not very intuitive. With sdljava, we can design our games using Java objects, develop our game business logic in Java, and leverage the Java API features.

Cross-platform game development is possible by combining Java and SDL.

SDL download

You can manually download all the required files or download my package here: sdl_part1.

SDL initialization

Create a Java project using your favorite IDE (I always recommend Eclipse). You have to set up the classpath and the native library path for your project. My package includes an Eclipse project already configured. Please refer to the ‘.classpath’ file for guidance.

After the initial set up, we can create our first class and initialize the SDL library.

SDLMain.init(SDLMain.SDL_INIT_VIDEO);
SDLSurface screen = SDLVideo.setVideoMode(800, 600, 16, SDLVideo.SDL_DOUBLEBUF | SDLVideo.SDL_FULLSCREEN);

The first line initializes the SDL library (it should be run only once, before using any SDL function). The second line creates a surface that represents the screen (width: 800; height: 600; 16-bit; double-buffered, full-screen mode). In SDL, everything is mapped to a surface, including images, geometries, texts, etc. If you run only this piece of code, you probably won’t even see the screen surface: we need something to keep the main thread alive. You can use your imagination for now: an infinite loop or sleeping.
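
For example, a minimal sketch that simply sleeps for a few seconds before the application exits (the five-second value is arbitrary):

try {
  Thread.sleep(5000); // keep the main thread (and the window) alive for 5 seconds
} catch (InterruptedException e) {
  Thread.currentThread().interrupt();
}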

Surface manipulation

As said before, everything in SDL is mapped to a surface. Now, let us create a new surface, using the same properties as the main screen.

SDLPixelFormat format = screen.getFormat();
SDLSurface redSurface = SDLVideo.createRGBSurface( screen.getFlags(), 100, 100, format.getBitsPerPixel(), format.getRMask(), format.getGMask(), format.getBMask(), format.getAMask() );
long redColor = SDLVideo.mapRGB( redSurface.getFormat(), 255, 0, 0 );
redSurface.fillRect(redColor);

The SDL library provides only primitive drawing functions. The last two lines fill the new surface with a red color. So far we have the screen surface and another, red-filled surface. The next step is combining these two surfaces.

SDLRect redRect = new SDLRect(0, 0);
redSurface.blitSurface(screen, redRect);

The first line specifies a location: in our case, the (x, y) position of our new surface within the screen surface. The second line performs the blit operation. A blit operation is the combination of two bitmaps (surfaces). OK, now let’s run the application…

You were supposed to see the red surface within the screen surface, right? Wrong! The question now is: where is my red surface? The answer is in the next code snippet.

screen.flip();

We forgot to flip the screen! The flip SDL function has two behaviors: (1) redraw the entire surface, or (2) swap the buffers when double buffering is enabled. The final result is the same: the screen is updated. Let’s try running our application again… Now you can see the red surface at the top-left of the screen (0, 0).

Advanced surface manipulation

The SDL_gfx library (represented by the SDLGfx class) provides advanced surface manipulation. The following code snippet draws a green filled circle and a blue border-only rectangle on the screen surface. The colors are represented as RGBA (red, green, blue, alpha).

SDLGfx.filledCircleRGBA(screen, 200, 150, 50, 0, 255, 0, 255);
SDLGfx.rectangleRGBA(screen, 0, 200, 100, 300, 0, 0, 255, 255);
screen.flip();

Another advanced feature provided by SDL_gfx is image zooming and rotation. The next code snippet uses our first red surface as the basis for modification. We are going to rotate it 45° and scale it to twice its size. These modifications can be applied to any SDL surface and will generate a new surface.

SDLSurface changedRedSurface = SDLGfx.rotozoomSurface(redSurface, 45, 2.0, true);
SDLRect changedRedRect = new SDLRect(300, 200);
changedRedSurface.blitSurface(screen, changedRedRect);
screen.flip();

Image manipulation

The SDL library provides basic bitmap image loading functions. The following code snippet shows how to load a bitmap image. The loading process generates a surface.

SDLSurface spriteSurface = SDLVideo.loadBMP("centro_atletismo.bmp");
SDLRect spriteRect = new SDLRect(300, 0);
spriteSurface.blitSurface(screen, spriteRect);
screen.flip();

After running the application, you’ll probably want to ask me: what is the story behind that purple color? And why is the image dotted? I know the answers, but now is not the time to explain them. We will discuss transparency in my next post.

Instead of using the basic bitmap image loading function, you can use the SDL_image library. This library provides support for several image formats, including PNG and GIF. The only difference from standard SDL is the first line of the previous code snippet. I’m using the same bitmap file, but you can test with your own files. The following code snippet presents the image loading process using the SDL_image library.

SDLSurface spriteSurface = SDLImage.load("centro_atletismo.bmp");
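
For instance, loading a PNG works the same way (the file name below is just a hypothetical example):

SDLSurface pngSurface = SDLImage.load("sprite.png"); // hypothetical PNG file; SDL_image detects the format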

Extra information

You can find a lot more information in the SDL documentation.

Next post

In the next post I will present TrueType fonts manipulation and explain the transparency/opacity mechanism used by SDL. Besides, I am going to present audio and event management.
