There has been an incredible amount of human effort put into building amazing things in Minecraft. Some players build huge structures on their own, while others band together in communities to create something truly impressive. One of the most dedicated communities I know of is the group that has been building Broville v11 over the past few years. Last week their efforts finally paid off with the public release of Broville v11. You can download it here.
In this post I will show how the plugin system in Chunky is being implemented.
Plugin support is surprisingly easy to add in a Java application thanks to the Java reflection API. Java reflection allows runtime type introspection and dynamic code loading. One of the more exotic use cases for Java reflection is generating executable code at runtime. A compiler hosted on and targeting the JVM could even use reflection for compile-time code execution. You don't need anything nearly so exotic to implement a simple plugin system.
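To make this concrete, here is a minimal sketch of reflective class loading in plain Java. The class and method names below are placeholders for illustration; a real plugin system like Chunky's would first discover plugin jars on disk with a URLClassLoader before loading classes from them.

```java
import java.lang.reflect.Method;

public class PluginLoader {
    // Load a class by name, instantiate it with its no-arg constructor,
    // and invoke a zero-argument method on the new instance. A plugin
    // system would read the class name from a plugin manifest instead.
    public static Object loadAndRun(String className, String methodName)
            throws Exception {
        Class<?> cls = Class.forName(className);
        Object instance = cls.getDeclaredConstructor().newInstance();
        Method m = cls.getMethod(methodName);
        return m.invoke(instance);
    }

    public static void main(String[] args) throws Exception {
        // Demonstrate with a standard library class instead of a real plugin jar.
        Object result = loadAndRun("java.lang.StringBuilder", "toString");
        System.out.println("Invoked reflectively, got a " + result.getClass().getName());
    }
}
```

The key point is that none of the class names need to be known at compile time, which is exactly what a plugin system requires.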
I am home again from California, and my plants survived! This post is a collection of updates on some of my projects that I have posted about here on my blog before.
I have started coding on Chunky again, and I tried livestreaming some of my Chunky coding on Twitch. I streamed last evening from 20:00 to 24:00 CEST, and I’m planning to stream around the same time the rest of the week. Feel free to ask questions in the chat if you join my stream!
In Chunky I am working on the UI code, trying to finish up the new JavaFX UI. Here is a screenshot from last evening while I was working on a new Color Picker dialog:
My current Chunky development priorities are:
- Finish the new JavaFX UI.
- Add support for Minecraft 1.10 blocks.
- Add support for custom block models.
- Plugin system.
Plant Watering System
My automatic plant watering system was a success. My plants were completely alone in my apartment for about 5 weeks, and the watering system kept them alive. Here is the moisture graph from one of my plants:
I made a timelapse video of my plants using a Raspberry Pi camera. The video shows one image per day, taken at the same time of day, and the images cover about one and a half months of real time. My fiancée came home and moved the camera a bit at the end of the video.
As mentioned, I was in California for work again this summer. Here are some pictures:
I recently bought the HTC Vive, and since converting my living room into a virtual reality cave I have been trying out lots of VR experiences.
The best thing about the Vive is its very accurate motion tracking for the head-mounted display and the motion controllers. The most impressive game I have tried so far is a bow-and-arrow minigame in Valve's The Lab. In it you are tasked with defending a gate from oncoming stick-figure attackers using a bow and arrow. You hold the bow in one hand and draw arrows with the other. Shooting with the virtual bow feels very intuitive, and I became very immersed, feeling like I was almost there in the virtual game world.
Too little content
Sadly, most of the games available for the Vive so far have very little content. Two of the best paid games available now are Space Pirate Trainer and Hover Junkers, but neither has much to offer in terms of gameplay. In Space Pirate Trainer you stand on a platform shooting flying robots that attack in waves; there is no enemy variety, and you play on the same stationary platform with no other levels available, so even though the shooting action is cool it gets boring quickly. Hover Junkers does not have much more content. It similarly has a mode where you shoot flying robots attacking in waves, with slightly more enemy variety, and there is also a mode where you fly around a small arena shooting at other players in a multiplayer deathmatch.
I was very excited about trying racing and flying games in VR. It seemed like the perfect fit for the platform – ideally the VR features would just make it more immersive. So far there seem to be very few racing games with VR support. Project Cars is one game, and DiRT Rally another (for the Oculus Rift). I tried Project Cars and it made me slightly motion sick after a few races. It also has a problem keeping a high framerate, which I’ve noticed is very important for reducing motion sickness and keeping immersion. I think VR might work much better with slower paced driving games like Euro Truck Simulator 2, since it probably won’t make you motion sick as quickly.
Graphics, framerate, latency
Having high fidelity graphics does not seem at all important for VR immersion. The stereoscopic graphics make everything seem real, even if the textures have very low resolution or even if the 3D models have few polygons. High and consistent framerate and low latency matters much more for VR immersion. As soon as the framerate dips you instantly notice it, and worst of all it can make you feel motion sick. Latency is equally important – while looking around in VR it is very easy to notice the delay between your head movement and the world rotating around you. The Vive seems to handle motion tracking latency just about right, making the latency unnoticeable – at least to me – when the game I’m playing is running smoothly. It is similarly important to have fast tracking and low latency visualization of the motion controllers. When everything is tracked with low latency I can really believe that I’m swinging a lightsaber in front of me.
Speaking of lightsabers, there is a free VR demo on Steam called Trials on Tatooine. The game is very short – about 5 minutes – and has lots of framerate and latency problems. It seems like ILM tried to make it look very pretty before trying to make it run decently. During some short periods it ran okay, and the lightsaber felt very responsive and cool, but even though there is very little stuff happening on screen it still manages to tank the framerate and have terrible latency most of the time. Trials on Tatooine is an excellent example of how poor the experience gets when you sacrifice framerate and latency for graphics.
Tilt Brush is an interesting VR tool for painting in 3D, though painting 3D lines does not seem very useful to me. I’d rather use something more like ZBrush to sculpt in VR. VR could be a huge win for authoring 3D content. The industry standard 3D modeling tools are, as far as I know, all limited to using 2D screens for drawing shapes – I can imagine that many new 3D modeling tools will be made specifically to exploit VR.
Vive’s Head Mounted Display
The Vive has a 2160×1200 pixel display, with each eye seeing roughly 1080×1200 pixels. When I use the display there is noticeable blurring when I look up or down, outside the center of the eyepieces. It's also surprising how few pixels make up distant objects. The first few times I used the HMD I felt like I had become short-sighted because I couldn't make out distant objects clearly; there simply are not enough pixels for faraway things.
There is also an effect that people describe as the “screen door effect”: a slight mesh pattern in your vision caused by the pixel grid in the display. I got used to that very quickly, and it really doesn't detract that much from the experience. A more noticeable artifact of the pixel grid is a pronounced false chromatic aberration caused by the pixel geometry of the display. The effect is not noticeable in the center of vision, but quite noticeable near the edges of my vision, where the lens blurring enhances it.
All in all, things that are reasonably close and near my viewing direction are rendered very nicely. I have not tried the Oculus Rift so I can’t say how the displays compare.
Developing for VR
I’ve become interested in coding something for the HTC Vive now that I’ve tried some of the existing games. I’m not entirely sure what I’d make yet, but one idea was to make some kind of VR exploration tool for Minecraft. It would be very cool to be able to fly around a Minecraft world in VR.
VR is very cool, but it's overpriced and there are too few games for it right now.
I am away from home for three months. I wanted a way to ensure that two of my plants could survive for that long without a human watering them.
I built a small system to automate the plant watering process. Here is a picture of the system in action:
Some nice features of my watering system are:
- Watering is done on a fixed schedule, but I can log in remotely to adjust the schedule.
- Soil moisture is logged and moisture graphs are periodically generated.
- A camera takes a picture of the watering system twice an hour.
- Surveillance pictures and moisture graphs are uploaded to a website so I can easily check the status of the system.
Here are examples of the moisture graphs that my system generates:
The moisture graphs are generated using the R programming language and the excellent ggplot2 library. Here is a snippet of the code that generates one of the graphs above:

tod <- function(x) as.numeric(format(x, "%H")) + as.numeric(format(x, "%M"))/60
ggplot(moist, aes(x=time, y=m1)) +
  geom_point(aes(color=tod(time))) +
  scale_color_gradient(low='blue', high='orange') +
  labs(title='Plant 1', y='moisture')
One of the challenges of designing the watering system was that I had not done any hobby electronics projects before, so I had to learn a bit of electronics basics and refresh my memory from my undergrad electronics class.
Arduino and Raspberry
I used an Arduino Uno to prototype the water pump control. I had a lot of fun using the Arduino, and I ended up doing several other projects with Arduinos as well, such as reverse engineering an IR remote control for my home audio system.
The final watering system uses an Arduino to control a submersible water pump, and to measure soil moisture using two moisture sensing devices. The Arduino uses a USB serial port to listen for commands from a Raspberry Pi which has scripts that determine the watering schedule. This setup keeps the code on the Arduino really simple, and allows me to tweak a lot of the logic in the system by logging in to the Raspberry Pi via a remote shell. I could even reprogram the Arduino remotely if I wanted to.
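To illustrate the Pi-to-Arduino split described above, here is a hypothetical sketch of the command protocol. The command names (PUMP, MOIST1) and value ranges are my own inventions for illustration, not the actual protocol the watering system uses.

```java
public class WateringCommands {
    // Build a pump command telling the Arduino to run the pump for a
    // given number of seconds. The sanity check guards against a buggy
    // schedule script flooding the plants.
    public static String pumpCommand(int seconds) {
        if (seconds <= 0 || seconds > 120) {
            throw new IllegalArgumentException("unreasonable pump duration");
        }
        return "PUMP " + seconds + "\n";
    }

    // Parse a hypothetical moisture reading such as "MOIST1 512"
    // reported back by the Arduino over the serial link.
    public static int parseMoisture(String line) {
        String[] parts = line.trim().split(" ");
        return Integer.parseInt(parts[1]);
    }

    public static void main(String[] args) {
        String cmd = pumpCommand(30);
        // On the Pi, the command string would be written to the Arduino's
        // serial device, e.g. /dev/ttyACM0, by the scheduling script.
        System.out.print(cmd);
        System.out.println(parseMoisture("MOIST1 512"));
    }
}
```

Keeping the protocol this simple is what makes the Arduino side trivial: it only has to parse short text commands, while all scheduling logic stays on the remotely accessible Pi.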
The Raspberry Pi has a camera for taking the surveillance pictures. I’m planning to use the images from the camera to compile a time lapse video later on.
Here is a closeup of the box that sits on top of the water reservoir. It contains the Arduino and a prototype board with the power switching circuit for the pump:
I have only one water pump that waters two separate plants. To make this work I have a regular garden hose divider to split the lab tubing and water both plants at the same time. I tweaked the length of the tubes that lead to the two plants to adjust the amount of water going to each plant. To make adjustments without drowning my plants I used plastic bags to collect the water while I was tweaking the system. Collecting the water in plastic bags let me measure exactly how much water each plant was receiving.
The current watering system is not flexible and relies on the two plants consuming roughly the same amount of water. If I decide to improve the watering system in the future I will probably install solenoid valves so that I can water each plant independently of the other. This would make the system more reliable for long-term use and make it possible to use a separate control loop for each plant.
Flooding was a big concern when I designed the system, so I placed each plant in a big plastic container large enough to collect all water in the reservoir in case a catastrophic failure should happen.
The towel under the water reservoir is there to catch condensation from the outside of the reservoir. I did not notice any condensation problems while the system was in trial mode before I left. Condensation could be a problem if the room temperature changes faster than the reservoir temperature. One other option I considered was to place the reservoir in another container like the one the big plant is standing in.
As mentioned, I would like to have individual plant watering using solenoid valves. This could also make the system modular – I could add extra plants to the water loop and have them individually watered just by adding an extra piece of tubing and a valve for each plant.
Another way of improving the system is to add moisture feedback to the control loop, and use e.g. a PID regulator to control the water pump. Oscillating moisture levels are probably fine, and even seem more natural for the plants, but I'd like a regulator that keeps the three-day average moisture constant over the long term.
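The PID idea above can be sketched in a few lines. This is a minimal textbook regulator, not tuned for any real plant; the gains and setpoint are made-up values, and in practice the output would be mapped to pump on-time.

```java
public class PidController {
    private final double kp, ki, kd;  // proportional, integral, derivative gains
    private double integral = 0;
    private double previousError = 0;

    public PidController(double kp, double ki, double kd) {
        this.kp = kp;
        this.ki = ki;
        this.kd = kd;
    }

    // One control step: returns the pump effort given the moisture
    // setpoint, the latest sensor reading, and the time since the
    // previous step (dt, in whatever unit the gains are tuned for).
    public double update(double setpoint, double measured, double dt) {
        double error = setpoint - measured;
        integral += error * dt;
        double derivative = (error - previousError) / dt;
        previousError = error;
        return kp * error + ki * integral + kd * derivative;
    }
}
```

To keep the three-day average stable rather than the instantaneous reading, one could feed a running average of the moisture samples into update() instead of raw sensor values.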
In my previous post I mentioned the idea of using JavaFX for the Chunky GUI. Chunky currently uses the Swing toolkit for its GUI, and JavaFX is aiming to replace Swing eventually. I was interested in using JavaFX because it seems to have better cross-platform consistency. I have now done some exploratory development using JavaFX 8, and I want to share my thoughts so far.
What JavaFX does right
Consistency between platforms is the reason I wanted to try JavaFX. I made a clone of part of the Chunky GUI in JavaFX and tried running it on Windows 10, OSX, and Linux, and it looked the same on each OS. That's awesome: it's exactly what I wanted. Here is a screenshot of the GUI running on Windows 10:
Another great thing about JavaFX is that it's possible to separate layout from behavior. In Swing, the only way to build a GUI was to write code that constructs each element manually. Layout editors did exist for Swing, but they all behaved differently and generated ugly code that got mixed in with the application code. Using a Swing layout editor was sometimes a useful first step when building a GUI, but I always ended up coding everything by hand, because once I had changed the generated code the layout editor could no longer edit the layout. I really dislike coding Swing layouts by hand, but that is mostly what I had to do.
JavaFX solves the layout editing problem by using XML files (FXML) to define the layout. The layout is edited with a program called Scene Builder. Separating the layout from the code in this way means that you can keep using the layout editor even after you edit the code. It is still possible to create GUI elements in code if you need to, but it no longer has to be the main way to construct a GUI. Graphical editing is so much better for building GUIs! Here is a screenshot of the Chunky GUI in Scene Builder:
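As an illustration, a tiny hypothetical FXML layout might look like this; the controller class name, fx:id, and handler name are placeholders, not taken from Chunky's actual UI:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<?import javafx.scene.control.*?>
<?import javafx.scene.layout.*?>
<VBox xmlns:fx="http://javafx.com/fxml"
      fx:controller="chunky.ui.RenderController" spacing="10">
  <Label text="Target SPP:"/>
  <TextField fx:id="targetSpp"/>
  <Button text="Start render" onAction="#startRender"/>
</VBox>
```

Scene Builder edits this file directly, while the named controller class holds the behavior, which is exactly the separation described above.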
Although it’s nice to remove the manual coding of GUI layouts, it is of course necessary to wire up the GUI components to code that can make them do useful things. Coding with JavaFX is easier than with Swing thanks to a much nicer API. For example:
- JavaFX uses functional interfaces for all callbacks, which makes it easy to set up callbacks with lambdas in Java 8. In contrast, Swing callbacks are grouped into listener interfaces with multiple callbacks per listener.
- Listening for modifier keys is supported in JavaFX with proper focus handling. In Swing you have to handle modifier keys for the whole application.
- It is very easy to bind variables in the application to JavaFX GUI elements by wrapping the value in a JavaFX property.
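As a small sketch of the last point, the javafx.beans property classes work without starting the JavaFX toolkit, so binding logic can be demonstrated (and unit tested) headlessly. The property name here is a made-up example, not from Chunky:

```java
import javafx.beans.property.SimpleDoubleProperty;

public class PropertyBindingDemo {
    public static void main(String[] args) {
        // Wrap an application value in a JavaFX property.
        SimpleDoubleProperty exposure = new SimpleDoubleProperty(1.0);
        // In a real controller this listener would be replaced by
        // bidirectional binding to a slider or text field.
        exposure.addListener((obs, oldVal, newVal) ->
                System.out.println("exposure changed to " + newVal));
        exposure.set(2.5);
    }
}
```

Because the GUI control observes the property rather than polling it, the application code can stay unaware of the UI entirely.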
A few other points about JavaFX that I like:
- Customizable tooltips that can be built individually and placed anywhere.
- Much improved color pickers.
- Simpler file chooser API.
Things JavaFX needs to improve
My overall impression of JavaFX so far is quite positive, but I do have some gripes about JavaFX which I hope get fixed in the future. Below are three things that I hope will be changed/improved.
API additions between major Java releases
Some new API methods have been added to JavaFX 8 after the Java 8 release. For example, you can call ChoiceBox.setOnAction() in Java 8 update 60, but it won't compile or run on older Java 8 versions. Adding things like this in update releases means developers like me can end up unknowingly relying on a specific Java 8 release. I hate this; it's a stupid idea. One of the cornerstones of Java is backward compatibility, and it should not be jeopardized for small API improvements. APIs must be stable so that you can rely on them. There is nothing more annoying than compiling a Java 8 program and then trying to run it on another machine with Java 8 installed, only to find out that it doesn't work!
No image scaling without interpolation
JavaFX provides a drawImage method that can scale images, but it always uses a blurring interpolation mode, and there is no way to change it. This causes undesirable blurring when an image is enlarged. In Swing this worked just the way I wanted, but in JavaFX I have to implement my own scaling code to work around this annoying behavior.
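The workaround is simple enough to sketch. This version works on plain int-array ARGB pixel buffers rather than the JavaFX image API, but the nearest-neighbor sampling is the same idea:

```java
public class NearestNeighborScale {
    // Scale an image stored as ARGB pixels in row-major order, using
    // nearest-neighbor sampling so enlarged pixels stay crisp instead
    // of being blurred by interpolation.
    public static int[] scale(int[] src, int w, int h, int dw, int dh) {
        int[] dst = new int[dw * dh];
        for (int y = 0; y < dh; y++) {
            int sy = y * h / dh; // nearest source row
            for (int x = 0; x < dw; x++) {
                int sx = x * w / dw; // nearest source column
                dst[y * dw + x] = src[sy * w + sx];
            }
        }
        return dst;
    }

    public static void main(String[] args) {
        int[] src = {1, 2, 3, 4}; // a 2x2 image
        // Enlarge to 4x4: each source pixel becomes a 2x2 block.
        int[] big = scale(src, 2, 2, 4, 4);
        System.out.println(java.util.Arrays.toString(big));
    }
}
```

In a JavaFX application the source pixels would come from a PixelReader and the result would be written back through a PixelWriter into a WritableImage.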
Slow hover feedback

Feedback is very important in a user interface, and it needs to be delivered as fast as possible. In JavaFX, when I hover over a button it takes a short while before the hover highlight appears. This may seem like a nitpicky complaint, but it is important to me that GUIs feel responsive, and right now JavaFX on Windows does not feel responsive enough. On the other hand, I have noticed a huge performance regression in Swing-based GUIs on Windows 10 recently, making them lag ridiculously. At least JavaFX is a little better than Swing on Windows 10 in that respect.