Jan 05, 2009

After doing quite a bit of C++ recently, I thought I would post my method for getting the current system time in milliseconds in C++ for both Mac OS X and Windows. The Mac version might translate to other Unix platforms, but you’ll have to check the docs or man pages.

Mac OS X

This actually returns a struct that has microsecond precision.
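The original code listing did not survive here. Given the note above about other Unix platforms, this was presumably based on `gettimeofday`, which fills a `timeval` struct with microsecond precision. A minimal sketch (the helper name `currentTimeMillis` is mine, not from the original post):

```cpp
#include <sys/time.h>
#include <cstdint>

// Sketch only: current wall-clock time in milliseconds since the Unix epoch.
// gettimeofday fills a timeval with seconds and microseconds.
int64_t currentTimeMillis() {
    timeval tv;
    gettimeofday(&tv, nullptr);
    return static_cast<int64_t>(tv.tv_sec) * 1000 + tv.tv_usec / 1000;
}
```

The `tv_usec` field is what gives the microsecond precision mentioned above; dividing by 1000 truncates it down to milliseconds.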

Windows

This code gives the milliseconds within the last minute. If you want milliseconds since epoch or some other fixed point in time it will require a bit more math on the SYSTEMTIME struct.
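Again, the listing itself is missing. Assuming it used `GetSystemTime` (the helper name here is mine), a sketch that yields milliseconds within the last minute by combining two `SYSTEMTIME` fields:

```cpp
#include <windows.h>

// Sketch only: milliseconds elapsed within the current minute,
// built from the wSecond and wMilliseconds fields of SYSTEMTIME.
int millisWithinMinute() {
    SYSTEMTIME st;
    GetSystemTime(&st); // current UTC time, broken into calendar fields
    return st.wSecond * 1000 + st.wMilliseconds;
}
```

For milliseconds since the epoch, the extra math mentioned above would typically mean converting the `SYSTEMTIME` with `SystemTimeToFileTime` and treating the resulting `FILETIME` as a 64-bit count of 100-nanosecond intervals.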

Oct 19, 2008

Russ Teabeault and I were just talking about our recent experiences with Objective-C and developing applications for either the iPhone or OS X. In general, we both agree that Objective-C is necessary, but painful. The language is quasi-dynamic, not very modern and poorly adopted.

Let me clarify that a bit. Objective-C is dynamic but lacks nice features that most modern dynamic languages provide, such as closures (although these aren’t strictly tied to dynamic languages by any means). It also uses a fairly non-standard message-passing syntax, probably because it had to select something that would not conflict with C and C++. It still uses header files and lacks good namespacing, both of which are obvious signs of antiquity. Plus, we both shared stories of having an impossible time finding good open source libraries, tools and frameworks. The last bit of our conversation was about the extreme lackluster quality of the Xcode IDE, which is more like a text editor than a modern development environment.

I mentioned that prior to 10.4, Apple had provided integration of most of the Cocoa libraries with Java and had even written some OS applications in Java. Not surprisingly, we both thought the shift from Java to Objective-C was a step in the wrong direction. It seemed as though the correct direction would be a better, faster and more integrated VM that would run many languages, including Java, much like the CLR offers developers on Windows. Instead, Apple seems to be chugging along the CPU-native path and providing bridges between Objective-C and various scripting languages. This means that each new language they want to support on the operating system requires work to bridge effectively, and that work can only be done well by Apple. If they had gone down the VM path, new languages would be supported as soon as the community built support for them on the VM. Therefore, if I wanted to write my next application for OS X in Scala, I could.

This brought us to a discussion of the merits of Android for mobile development. This is where we had different opinions. I think Android will be great and that the development community and support behind it will be much larger than the iPhone’s. He thought that Apple might release a Java VM for the iPhone if Android picks up steam and believes that the iPhone is still the best platform for mobile.

The reason I like the idea of Android (although the G1 might be a crappy implementation without the ease of use and cool features of the iPhone) is that it is fundamentally VM-based. This means the platform is based on Java bytecode rather than a CPU instruction set. It also means the platform has modern concepts like garbage collection, memory management, class loading and all the other good parts of the VM. But it also means it is capable of running any Java bytecode regardless of the language it was written in. Therefore, I could write my Android application in Groovy and, with the help of a few extra JARs, get it running just as well as plain old Java.

(Update 10/22/08: Groovy doesn’t quite work yet, because of a number of additional classes and JARs it uses. However, the principle is sound because Android is VM-based. JRuby might be a better solution than Groovy, depending.)

The problem with the iPhone is that it is tied directly to the CPU instruction set and the Objective-C language. This means you take a step 20 years backwards and now have all the overhead of header files, pre-compilers, directives, manual memory management, no namespacing, and numerous other headaches that VMs have done exceedingly well at fixing.

What I wonder is why Microsoft is the only operating system company to figure out that wiring a VM, capable of everything we expect out of a modern VM plus application isolation, into the lowest levels of the OS is a great idea. In fact, this idea is fantastic. It means applications targeted to the VM improve as the VM improves, without any code changes.

The Java VM still has a long way to go to catch up with the CLR as a full platform. JSR 121 was finalized a few years back and provides good application isolation within a single Java VM instance. This means you can start up the Java VM and then run multiple applications inside it without any conflicts or concerns about one application impacting the others (such as an application calling System.exit or running out of memory). The Java VM still needs extremely tight integration with the operating system and needs to be started when the OS boots. It also needs new ways to start and manage applications, which JSRs 277 and 294 should help with. I think, though, that Google has done all of this work for Android. It runs multiple applications in isolation well, it provides a mechanism for applications to start and stop, and the Java VM is completely integrated into the operating system of the mobile device.

I think this is the way of the future. I’m just trying to figure out where Apple is going and why they are heading that way.