After telling you about my completely reworked VIM setup and the awesome VIM plugins I'm using in it, this blog post shares some snippets and settings I use to make all that stuff work smoothly. [Read full post…]
As promised, I'm explaining my new VIM setup in some more detail, starting with the collection of plugins I currently use to enhance my VIM experience. I already have some in mind which I need to try out, but I would be happy to know which additional plugins you can recommend in relation to PHP programming (and beyond). Please leave a comment! [Read full post…]
I finally managed to rework my VIM setup completely, so it's time for an update. First the sad news: VIP (VIM Integration for PHP) is dead. To put this into perspective: I still provide my current VIM setup via Github to serve as inspiration for you. As compensation, I also have some good news: I rewrote PDV - phpDocumentor for VIM completely, and it now nicely integrates with current plugin managers. Its code is now way cleaner and more maintainable, and it has some fancy new features, like templating support through my Vmustache implementation and support for turning the generated doc blocks into UltiSnips snippets for even more ease in editing them (waving to Textmate users). [Read full post…]
I love LaTeX for any kind of text writing (actually typesetting), simply because it creates such nice-looking and consistent layouts. And, of course, because I can write it in my favorite text editor. We use LaTeX especially for presentation slides at Qafoo, since the beamer package provides such a convenient environment. Combined with the listings package, presenting source code snippets with nice syntax highlighting has never been easier. However, there was one problem we had not solved until a few days ago: highlighting certain source code lines of a listing on different slides. [Read full post…]
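The post itself explains the full solution; as a rough idea of the kind of technique involved, here is a minimal, hedged sketch of one common way to combine beamer overlays with listings: using listings' `escapechar` option to embed beamer's `\alert<n>` command, so a line is highlighted only on a given overlay. The code sample and escape character are my own illustration, not necessarily what the post uses.

```latex
\documentclass{beamer}
\usepackage{listings}
\begin{document}
\begin{frame}[fragile]{Highlighting a line on overlay 2}
  % escapechar=! lets us drop back into LaTeX inside the listing,
  % so beamer's overlay-aware \alert<2> can color one line on slide 2
  \begin{lstlisting}[language=C,escapechar=!]
int sum(int a, int b) {
    !\alert<2>{\texttt{return a + b;}}!
}
  \end{lstlisting}
\end{frame}
\end{document}
```

On the first overlay the line renders normally; on the second, `\alert` colors it. The `\texttt` wrapper keeps the escaped text in a monospaced font so it blends with the rest of the listing.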
I recently had the problem that I wanted to retrieve the smallest items from a stream of data. When talking about a stream here, I refer to a data set that I do not want to load into memory completely, since it has quite a few elements. The best way to process such data is a stream approach, where you always work on a single item at a time, iteratively, without loading the full data set. In my special case, I had a database with 140,000 records. The processing of these records could not happen in the DB, since I needed to create vectors from text and perform calculations on them. Basically, I needed to check each vector's distance to a reference vector and keep only the k closest ones. So, what is a good approach to solve such a task? I decided to implement a custom data structure based on a max heap to solve the problem. In this article, I present the solution and compare it to two other approaches in a small benchmark. [Read full post…]
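The core idea of a bounded max heap for the k smallest items can be sketched briefly. This is my own illustrative Python version, not the implementation from the post (which targets the author's PHP code base): keep a heap of at most k elements whose root is the largest of the kept items, and replace that root whenever a smaller item arrives. Python's `heapq` is a min-heap, so values are negated to simulate a max heap.

```python
import heapq

def k_smallest(stream, k):
    """Return the k smallest items from an iterable, using O(k) memory.

    A bounded max-heap is simulated by pushing negated values onto
    heapq's min-heap: heap[0] is then the negation of the largest
    kept item, so one comparison decides whether to replace it.
    """
    heap = []  # holds negated values => behaves like a max-heap
    for item in stream:
        if len(heap) < k:
            heapq.heappush(heap, -item)
        elif -item > heap[0]:  # item is smaller than the current maximum
            heapq.heapreplace(heap, -item)  # pop root and push in one step
    return sorted(-x for x in heap)

# Works on any iterator, e.g. a generator over database rows,
# so the full data set never has to be in memory at once.
print(k_smallest(iter([5, 1, 9, 3, 7, 2]), 3))  # -> [1, 2, 3]
```

Each of the n items costs at most O(log k) heap work, giving O(n log k) time overall, which is the property that makes this approach attractive for large streams.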
Since last November I have been the happy owner of an Android smartphone. Actually, I already have my second one: after the HTC Hero, which was already amazing, I now have the Motorola Milestone (aka Droid). This blog post lists and reviews the apps that I think every Android phone should have installed. [Read full post…]
So, it's exactly one year ago that I started using Twitter. I remember that I was always of the opinion that Twitter was one of the most stupid hypes around and that it's so useless to know when other people have a cup of coffee or go to the toilet. Using Twitter actively for exactly one year now has proved this attitude wrong. In this article, I provide some stats on my year with Twitter and try to explain the personal value that using Twitter has for me. [Read full post…]
Derick was always bitching at me when doing releases about the huge amount of time needed to process the Webdav component's sub-directory. We always suspected that the Subversion performance issues here resulted from the Webdav test suite, which consists of an awful lot of small test files and some sub-directories. I finally found the time to refactor the tests, and the performance improvement is astonishing. [Read full post…]
The annual family party (aka International PHP Conference) took place a week ago. As usual, I enjoyed it very much, although I was a bit sick this year. The new location (the "Rheingoldhalle" in Mainz) is really great, but I had the feeling it was a little bit too large. Professional equipment was available, and they had 4 large rooms, as well as some smaller ones, for talks. Even the food was very good, which was not the case for the past 2 years in Mörfelden (Frankfurt). [Read full post…]
Last night the Mozilla developers released Firefox 3.0 into the wild. Congratulations on the new major version! [Read full post…]