Interesting article on web 2.0 from Nicholas Carr, in which he talks about gratis-but-amateur-and-often-crappy vs. costs-money-but-professional. Following links from IT Doesn't Matter (which he also wrote, and which I gather caused a big furor), I found an explanation of why IBM and Dell are the most successful hardware companies. The claim in brief: Dell sells the cheapest possible adequate stuff, IBM sells the most advanced premium-priced stuff, and everyone else is a kind of half-assed compromise.
Half-baked thoughts:
I guess "low-cost provider" and "high-cost provider" are already well-known business strategies if you're an MBA-holder, so maybe this isn't news. I don't really know what the received wisdom might be.
This was on TV when I turned it on tonight. It involves Sylvester Stallone singing country music, for example while sitting on a picnic blanket with Dolly Parton strumming acoustic guitar.
IMDB trivia item:
Original screenwriter Phil Alden Robinson was so offended by Sylvester Stallone's extensive re-working of his original screenplay that he briefly considered having his name removed from the film's credits. He was later convinced that having his name on a film of this "caliber" would look good on his resume.
We are currently hiring a software developer at Red Hat to join a small project team I'm on. The first attribute we are looking for is that you are an extremely good developer, where "good" means both "smart" and "professional."
Slightly more detailed requirements of the current project are:
When/if we have a more thorough and official job description I will post that also, but in the meantime don't be shy about sending a resume to hp@redhat.com; put "JOB" in the subject and I will forward it along. If you have sample code that you're legally able to share with us, examples of your work would be extremely helpful as well. Hope to hear from you.
Did Epiphany seriously switch away from tag-based bookmarks right when the whole web 2.0 crowd is having a collective hype-gasm about tagging and folksonomy? Epiphany was there first! Obviously the missing piece is to make tags appear in a larger font according to popularity, thus harnessing more collective intelligence.
So web 1.0, people. Get with it.
Thomas, the whole exercise you're going through (figuring out what to start) isn't supposed to be necessary - the norm would be that the init script starts the daemon automatically and GNOME autostarts the tray icon. But I believe the RPM doesn't autostart anything yet because it's all beta/experimental/dorks-only and it breaks on certain kernel modules or wireless cards.
It's not a complicated architecture: the NM daemon controls the kernel side of things, and the UI for it is a GNOME tray icon or applet. You start the two halves separately. At least as of the previous NetworkManager release, this is what you had to launch:
$ su -
# service NetworkManager start
# exit
$ /usr/libexec/nm-applet &
$ disown
In an on-by-default NM setup the init.d script would be enabled in runlevels 3 and 5 and nm-applet would be part of the GNOME session, so there'd be nothing to do manually.
Your log error looks like either dbus isn't reloading the security config post-install of NM, or NM daemon simply isn't running.
That said, NM doesn't work for me either, but it's the fault of the kernel drivers. The airo module with whatever version of the firmware I have gets into a wedged state where it just prints debug spam over and over and doesn't work. Whatever race condition or sequence of syscalls causes the wedging doesn't happen often when just doing "ifup" and "ifdown" rather than NetworkManager's scanning etc. I have triggered it from the command line before though usually in a "trying to find a network" case where I'm running iwlist and bringing things up and down much as NM would do.
Dan kindly documented kernel problems discovered in the course of hacking on NetworkManager.
I think Chris is right about this: Java's convenience has been hampered by the license. JPackage helps on some level, but I'm not convinced putting Java into the UNIX binary packaging mold is exactly what you want, and JPackage can't redistribute all the Java core runtimes that people use. As gcj/Classpath matures it will obviously help. There's also the whole Maven thing; in C programmer terms you could think of it as "kind of like automake, but it automatically downloads dependencies." Anyway, the de facto reality in the Java world is that you just bundle all dependencies with your app. Equivalent of static linking.
On the plus side though, the amount of open source code written in Java is huge, largely server-side stuff. It tends to be much more sane and reusable than your average C/C++ library, because the "GLib" equivalent layer (collections, threads, IO, etc.) is a given and thus libraries can interoperate with each other and with applications. Another factor I'd say is that writing robust Java code is a lot easier than writing robust C code and more people are able to write acceptable libraries.
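As a made-up illustration of that point (the class and method names here are invented, not from any real library): because the core types ship with the platform, code in one library can hand its results straight to a completely unrelated library with no conversion layer in between, something C libraries with their own container conventions rarely manage.

import java.util.Arrays;
import java.util.List;

public class SharedTypes {
    // pretend this lives in one library...
    static List<String> tokenize(String s) {
        return Arrays.asList(s.split("\\s+"));
    }

    // ...and this in another; both just speak java.util.List
    static String longest(List<String> words) {
        String best = "";
        for (String w : words)
            if (w.length() > best.length()) best = w;
        return best;
    }

    public static void main(String[] args) {
        System.out.println(longest(tokenize("java libraries share core types")));
    }
}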
Open source is kind of "winning" in the Java world; from what I can tell, the new EJB3 spec has more or less nothing to do with previous versions of EJB. The persistence part looks like a rebranding of the open source Hibernate project, using Java 5 annotations in place of Hibernate's XML configuration. Here's a tutorial on it, and it's pretty cool. I have no idea about the Java community or what happened, but it certainly seems like a coup for open source.
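To give a flavor of what that looks like - a rough sketch with an invented class and fields, not taken from the spec or the tutorial - the mapping that Hibernate used to keep in a separate XML file gets written as annotations on an ordinary class:

import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;

// Hypothetical persistent class; the mapping that used to live in an
// .hbm.xml file is expressed with EJB3-style annotations instead.
@Entity
public class Order {
    @Id @GeneratedValue
    private Long id;              // primary key, value generated for you

    private String customerName;  // maps to a column by default, no XML needed

    public Long getId() { return id; }
    public String getCustomerName() { return customerName; }
    public void setCustomerName(String customerName) { this.customerName = customerName; }
}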
The resulting situation is very strange, where everything below the JDK is open source (Linux, etc.) and everything above it is also open source (huge stack of Java code out there). Even the IDEs are open source. All this open source stuff effectively defines the real, complete platform that people use in practice. But it's very difficult to ship that platform working "out of the box" because there's this closed JDK in the middle. Classpath will save us...
The oddest thing about Java has to be their obsession with calling everything a "bean." As Colin points out, because they started calling every object a "bean," they had to rename objects to the special term "POJO" (plain old java object). As far as I can tell, in the modern Java world you can pretty much do s/bean/object/ on the docs with no loss of information.
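To make the terminology concrete (an invented example, nothing more): a "bean" in modern usage is just a class with a no-argument constructor and getter/setter pairs, which is exactly what "POJO" describes too.

// An ordinary class; call it a bean or a POJO, it's the same thing.
public class Person {
    private String name;

    public Person() {}   // no-arg constructor

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}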
I have a new JavaScript theory which is that they were LISP-envious. They were all "you know, Lisp had it right to design the language around the One True data structure - they just had the wrong one. So our new language is not list processing, it's HASH TABLE PROCESSING. HOLY SHIT. SCOPES ARE HASHES. OBJECTS ARE HASHES. TYPES ARE HASHES. LISTS ARE HASHES. EVERYTHING YOU TYPE AUTOMATICALLY BECOMES A HASH ENTRY. WOOOOOHOOOOOOOOOOO"