~ pjvenda / blog

28 December 2008

More next year

I'm still lousy at this. But I'm taking a break from my stressful (?) blogging routine until sometime next month, in 2009.

Here's a short wish list of mine for next year:

* I would like to improve my photography technique. I would like to fit a Canon EOS 5D mkII into my budget, as well as a new tripod, at least two decent lenses and Bibble 5 (which is proving to be a difficult birth). I will upload stuff to flickr.

* Some really exciting stuff is happening at the beginning of next year, which will be a worthy blog post when done. Until then, it's a surprise (even for me, to a certain level).

* I am also looking forward to booking my leisure flight on a Cessna or a Piper somewhere in the UK - courtesy of my lovely wife. That will be amazing and bring me closer to aviation in a techie kind of way. A balloon ride would be excellent as well :)

* I will do a bit more travelling around, but that isn't well determined yet. I know I'm going to Prague, but that's it so far.

* I *need* to do some go-karting again, as well as, perhaps, another track day experience.

* I should blog more often. I know that some of my friends read this sorry excuse of a blog, so it is in my best interest that they be informed of whatever happens on this side of the Channel that deserves being blogged about. But that's not always an easy task. Also, my techie reviews and rants are sure to be useful for someone...

* I will update my website more often and, more importantly, with new interesting material. This includes interfacing with blogger and flickr.

So long. Back next year.
Until then: Cheers, PJ.

16 November 2008

A visit to the Harley-Davidson dealer

Hello everyone,

Yes, I succumbed to the insistent calling of this (un)holy place just 5 miles away from where I live.

Decisions, decisions, decisions, originally uploaded by pjvenda.

And I felt like a little kid shopping for Christmas with an unlimited budget! Not that I had a budget or even the intention of buying anything, but the mere thought of possibly doing so in a couple of years' time really messed with me in a most unexpected way. I never thought that visiting a store could create such a warm and fuzzy feeling.

It felt strangely familiar and strangely similar to the Hard Rock Cafe franchise - the music and decoration, the staff's black "uniforms", etc.

There were so many bikes for sale - some seemed pretty rare, by the looks of them and from the warning signs that said "please don't touch or sit". All those Sportsters, Softails, Dynas, VRSCs... All those obscenely expensive extras: the chromed Screamin' Eagle exhausts, the matte black Screamin' Eagle exhausts, the classic open face helmets, the leather jackets, everything else-leather...

Of course I took the opportunity to snap a few shots. Not that the light was great. I mean, it was when I left home, but as the 10 minutes went by between my street and Wootton, the clouds came in...

But of all things, this was it:

When I walk in there again to order my own Sportster, that will be something to remember - I'm sure of it!!

Cheers, PJ.

28 July 2008

ACPI standards used for specificity

Even though the title might seem silly at first, the simple fact that it is there, as it is, means that there is some explanation behind it. So true.

The open standard ACPI was released in December 1996 by a joint collaboration of the main industry vendors. Its purpose was to provide a common interface for describing hardware bits and pieces that might vary from computer to computer - much like a hardware description language (HDL).

All motherboards today have a bit of ACPI code in their BIOSes that provides the list of sub-devices they have available for use. These sub-devices range from internal timers to fan controls or temperature sensors. This is necessary because all motherboards are different in one way or another. Even the most common and basic functional components may need to be configured in a slightly different way by different manufacturers. So this ACPI code is rather raw and important for the lowest levels of the operating system. [we could fork this conversation into the subject of "operating systems: hardware or software?" but we shan't. this post has a different purpose, so let's get on with it].

This, of course, is described according to the ACPI open standard, independently of which operating system will be executed. Different operating systems have varying degrees of ACPI compliance and robustness. Some have good ACPI interpreter implementations and do things well; they may be able to resist ACPI coding errors and keep working. Others are dyslexic when it comes to reading ACPI right, and break - either because they read it wrong or because they can't handle errors.

So ACPI writers decided to start differentiating their code with conditional bits that refer to specific operating systems. Bits like:

if (os == os_identifier_string) {
    do this;
} else {
    do something else;
}

OK, so we start to see some specificity in ACPI code regarding different operating systems. But that's fine, as it is being done to avoid or to fix existing or potential problems. On the other hand, Microsoft's operating systems get much more attention on this matter, for two reasons:
  • Windows OSs are regarded as the world's (only) operating systems. Manufacturers and vendors tend to ignore all other operating systems on end users' hardware;
  • Windows has never been any good at handling ACPI code;
Here is where things start to go wrong regarding ACPI...

My first laptop, an Acer TravelMate 4001 WLMi, had a few quirks in its ACPI code that resulted in some issues [I can't remember what exactly]. Because hacking ACPI code is a nightmare, there was an easier way to handle them. You see, fixes for those quirks were already in place, in the ACPI code itself, but inside one of those conditional zones for Windows 2000 and XP - not for other OSs. Fortunately, it is possible to tell the Linux kernel to identify itself as "Microsoft Windows XP" when decoding/interpreting the ACPI tables, so I was able to fetch those fixes and get on with things. Remember: no need to hack or change the code, because the fixes were already there.
Hacking ACPI code is not a walk in the park. It's not easy to find, not easy to decode, not easy to read and not easy to write.
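As a side note, that workaround can be applied straight from the boot loader. Treat this GRUB (legacy) entry as a sketch from memory rather than gospel - the exact parameter spelling for your kernel version is in its Documentation/kernel-parameters.txt:

```
# menu.lst entry - acpi_os_name changes the string the kernel answers
# to the ACPI _OS query, so the Windows-only fixes get picked up
title  Linux (ACPI pretending to be Windows XP)
kernel /boot/vmlinuz root=/dev/sda1 acpi_os_name="Microsoft Windows XP"
initrd /boot/initrd.img
```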

In effect, ACPI fixes are being coded specifically for Windows. Fixes that are likely to be relevant and safe for all other operating systems, but because of manufacturers' disregard for everything non-windows, "the others" are being left out.

The dirtiest minds might imagine that this is, in fact, a very subtle (and smart) way to undermine support for non-Windows operating systems. All that's necessary is to discreetly influence [i.e. pay] manufacturers to include differentiated code that breaks things on Linux, BSDs, Solaris, etc. Companies could use an open standard to break compatibility and enforce monopolies. Cowardly stuff, eh?

Well... it appears to be happening. Exactly that. See the (adapted) quoted text below along with my comments in between. [The full story can be found here:]

I've heard a lot of people ask, "Who the hell is Foxconn", if you use a PC, there's a good chance you have one of their boards, even if it's branded as MSI or some other brand, if you use a Nintendo Wii, XBOX 360, or Playstation 3, Foxconn made that motherboard, this isn't some little dodgy hardware maker with no name that we can afford to be quiet about.
I have disassembled my BIOS to have a look around, and while I won't post all the results here, I'll tell you what I did find.

They have several different tables, a group for Windows XP and Vista, a group for 2000, a group for NT, Me, 95, 98, etc. that just errors out, and one for LINUX.

The one for Linux points to a badly written table that does not correspond to the board's ACPI implementation, causing weird kernel errors, strange system freezing, no suspend or hibernate, and other problems, using my modifications below, I've gotten it down to just crashing on the next reboot after having suspended, the horrible thing about disassembling any program is that you have no commenting, so it's hard to tell which does what, but I'll be damned if I'm going to buy a copy of Vista just to get the crashing caused by Foxconn's BIOS to stop, I am not going to be terrorized.


Linux and FreeBSD do not work with this motherboard due to it's ACPI configuration, using a disassembler program, I have found that it detects Linux specifically and points it to bad DSDT tables, thereby corrupting it's hardware support, changing this and setting the system to override the BIOS ACPI DSDT tables with a customized version that passes the Windows versions to Linux gives Linux ACPI support stated on the box, I am complaining because I feel this violates an anti-trust provision in the Microsoft settlement, I further believe that Microsoft is giving Foxconn incentives to cripple their motherboards if you try to boot to a non-Windows OS.

The correspondence between Ryan (the techie that uncovered this stuff) and Foxconn support (manufacturer of the motherboard in question) follows. It's a bit edited for spacing reasons. This is fun!
ACPI issues, cannot reboot after having used suspend

Jul 22 08:37:53 ryan-pc kernel: ACPI: FACS 7FFBE000, 0040
Jul 22 08:37:53 ryan-pc kernel: ACPI: FACS 7FFBE000, 0040
Jul 22 08:37:53 ryan-pc kernel: ACPI: FACS 7FFBE000, 0040
Jul 22 08:37:53 ryan-pc kernel: ACPI: FACS 7FFBE000, 0040
Jul 22 08:37:53 ryan-pc kernel: ACPI Warning (tbutils-0217): Incorrect checksum in table [OEMB] - 70, should be 69 [20070126]

I get these messages in my system log at boot, I also fail to reboot after having used suspend in a session, it hangs and plays a continued beep on the PC speaker.

Foxconn support:
Do you get the same beep codes if you were to remove all RAM out and then turn the system ON again?

No, because then I wouldn't be able to boot into Linux, suspend to RAM, to get the ACPI failure, have syslogd pollute my /var/log/messages file with it, or read about it in my system log.

In particular, the number of quirks that the kernel has to use, and this invalid checksum are what has me nervous.

If you need me to attach the full contents of /var/log/messages, I can do so.

Foxconn support:
This board was never certified for Linux. It is only certified for Vista. See URL below. So please test under Vista. Does this issue also occured under Vista or Winxp?

That was a nice start... So the board is only certified for Vista, but by the way, does it work well in XP?? Just by accident, you know... Because it should, I/they reckon.
The ACPI specs are there for a reason, and broken BIOS's like what is in this motherboard are the reason standard ACPI does not work, I've taken the liberty of filing the report in, Red Hat, and Canonical's Ubuntu bug tracking systems, and posting the contents of my kernel error log on my blog, which is in the first several results if you Google search "Foxconn G33M" or "Foxconn G33M-s", "Foxconn Linux", etc, as well as prominently in other search formats, so hopefully this will save other people from a bad purchase, and hopefully can work around your broken BIOS in 2.6.26, as I understand that kernel is more forgiving of poorly written BIOSes built for Windows.

I've already gotten several dozen hits on those pages, so you guys are only hurting yourselves in the long run, by using bad BIOS ROMs, as people like me are quite vocal when dealing with a bad product.

Spot on: The ACPI specs are there for a reason.
Foxconn support:
Making idle treats is not going to solve anything.

As already stated this model has not been certified under Linux nor supported.

As you are unhappy with the product- using a non-support operating system nor certified, please contact your reseller for a refund.

I've been debugging your AMI BIOS, and the ACPI support on it is far from within compliance with the standards, I've dumped out the debugging data into Canonical's Launchpad bug tracking system so that we may be able to support some sort of a workaround for the bad ACPI tables in your BIOS, I would hope that you will be part of the solution instead of the problem, alienating customers and telling them to go buy a copy of Windows Vista is not service, your product claims to be ACPI compliant and is not, therefore you are falsely advertising it with features it isn't capable of.

I would ask that you issue an update that doesn't make it dependent upon Windows Hardware Error Architecture, but that decision is up to you.

Please find all relevant data here:

Bug #251338 in Ubuntu: “Bad ACPI support on Foxconn G33M/G33M-S motherboards with AMI BIOS”

I appreciate your consideration in this matter.

Foxconn support:
You are incorrect in that the motherboard is not ACPI complaint. If it were not, then it would not have received Microsoft Certification for WHQL.

Refer to:

As already stated, this model has not been certified under Linux nor supported.

It has been marketed as a Microsoft Certified Motherboard for their operating systems.

Of course threats don't solve anything. But saying "this product is ACPI compliant because otherwise MS wouldn't have granted us the WHQL stamp" doesn't either. And they don't seem very willing to try anything else... So I guess the threat was still the only thing that caught Foxconn support's attention... even though it was promptly dismissed.
I've found separate DSDT tables that the BIOS hands to Linux specifically, changing it to point to the DSDT tables Vista gets fixes all Linux issues with this board.

So while I accept that you've gotten some kind of Microsoft Certification (doesn't surprise me), that does not make your board ACPI capable, just that Windows is better at coping with glitches custom tailored to it, for this purpose.

Foxconn support:
Stop sending us these!!!

Yes, go away!
Your BIOS is actually pretty shoddy, I've taken the liberty of posting everything that's wrong with the DSDT lookup tables and how to fix some of it so the community that has already purchased your filth can make do with it, also, it's now pretty much impossible to google Foxconn and Linux in the same sentence without getting hit by the truth, that your boards aren't good enough to handle it.

Foxconn support:
Surely this is the way to ask for us to attempt to fix something that is not supported in the first place.

Would it be so difficult? I mean really? I suppose you've never heard of building a happy customer base vs. just angering everyone that deals with your products to the point they make sure others don't make the mistake of buying them.

You know, I have several computers, and they all support any OS I want to put there, as well they should, if you can't fix the damaged BIOS you put there intentionally, can you at least put a big thing on the site that says no LInux support so people won't make the mistake of buying your stuff?

Your DSDT table looks like it was written by a first year computer science student, it is scary, I will not just shut up and go away until I feel like I've been done right, this can end up on Digg, Slashdot, filed with the FTC that you are passing bad ACPI data on to Linux specifically.

I saw you targeting Linux with an intentionally broken ACPI table, you also have one for NT and ME, a separate one for newer NT variants like 2000, XP, Vista, and 2003/2008 Server, I'm sure that if you actually wrote to Intel ACPI specs instead of whatever quirks you can get away with for 8 versions of Windows and then go to the trouble of giving a botched table to Linux (How much *is* Microsoft paying you?) it would end up working a lot better, but I have this idea you don't want it to.

While I don't agree with the way Ryan addressed Foxconn support on all occasions, he does have a rock-solid point. He has the right to be pissed, and they were replying as (not very good) corporate robots, so they deserve some angry kicks. If their policy were in fact "not supported - won't fix", then they would not react in such an increasingly defensive way.

He was lucky to get enough attention to actually get them to act and try to fix things. No promises of official BIOS updates and no reports of working fixes, but they did do something. Of course they did/do/will deny any explicit attempt to break things on the Linux side. But I don't believe them. It's just too obvious to be accidental.

It takes someone with skills and will to figure these things out. A very small minority of all computer users. And that's exactly why this is actually a good subversive way of subtly reinforcing Microsoft's monopoly. Indeed, with the help of the ACPI open standard, corporations are breaking compatibility by introducing specificity into the code.

Sorry for the long rant, but this kind of stuff ticks me off! People that know me know it's true.

Thanks for reading,

02 July 2008

Photographic kit upgrade process pt.II

Following the lens upgrade earlier this year, the second part of this upgrade process involved acquiring a new camera body.

Budget issues aside, I reckoned that upgrading from an EOS 350D to a 400D or even a 450D would not yield a satisfying technological leap (even though the 450D is already two generations ahead of the 350D).
Hence I started saving money to upgrade to a moderately priced mid-range or top-range body. Knowing that the terms "moderately priced" and "top-range" together are a physical impossibility (or a black hole shall form on the 3rd ring of Saturn), and that the 5D is getting *really* old now, I was left with one choice - the EOS 40D.

The hand grip is light-years better than the 350D's. Not only because it's a physically bigger camera, but also because its surface is that hard rugged rubber. The 40D is also 37% heavier than the 350D, which can be an issue, in particular if used with a heavy lens (like mine). So this new kit's total weight went up 57% (from 540+385=925g [350D + Sigma DC 18-125mm f/3.5-5.6] to 740+715=1455g [40D + Sigma EX 24-70mm f/2.8]). But I got used to it and the added performance largely justifies the added weight.
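For the record, the percentages work out like this from the body and lens weights quoted above (a throwaway Python snippet):

```python
# Body and lens weights in grams, as quoted above
body_350d, lens_18_125 = 540, 385   # old kit
body_40d, lens_24_70 = 740, 715     # new kit

def pct_increase(old, new):
    """Percentage increase from old to new, rounded to the nearest whole number."""
    return round(100 * (new - old) / old)

body_up = pct_increase(body_350d, body_40d)                            # body alone
kit_up = pct_increase(body_350d + lens_18_125, body_40d + lens_24_70)  # whole kit
print(body_up, kit_up)
```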

Having a higher pixel count was a nice add-on to the upgrade's value, but it wasn't a decisive factor (also because the lower-range 450D has an even denser sensor). However, the 40D's sensor has other less obvious but very interesting advantages, like significantly lower noise at higher ISO values. I refused to use ISO 1600 on the 350D and only very rarely used ISO 800. On the 40D, most images are well usable by high standards at ISO 800 (although noise can be visible in shadowed areas), and only careful analysis will tell ISO 100 from ISO 400. So sensor noise is much better (lower) on the 40D which, along with auto ISO selection, allows for extra flexibility with less fiddling.

Partial weather sealing is another nice feature that Canon has added/improved since the 30D; L-class Canon lenses and the Canon 580EX II flashgun consolidate this feature. A large 3" LCD screen has been fitted mostly for the joy of the marketing department, along with a Live View mode in which the camera is supposed to work like a simpler point & shoot model. I find it useful sometimes, but not that often. The image size increase was also most welcome. As was the vast array of extra features.

With everything accounted for, it's a great camera body, it made a brilliant upgrade, and with the Sigma EX 24-70mm f/2.8 MACRO it makes a good semi-pro kit.

So what's next? Something's already on its way. After that, I haven't decided yet. Maybe an ultra-wide angle lens, or instead a nice telephoto? Only time (and money) will tell.

Cheers, PJ.

12 June 2008

Simple puzzles and engineers

Recently, I've stumbled upon a mind puzzle suggested on etd's blog. It seemed pretty simple, so I gave it a little thought...

Just for the sake of completeness, here is the puzzle:

One morning, exactly at sunrise, a Buddhist monk began to climb a tall mountain. A narrow path, no more than a foot or two wide, spiraled around the mountain to a glittering temple at the summit. The monk ascended at varying rates of speed, stopping many times along the way to rest and eat dried fruit he carried with him. He reached the temple shortly before sunset. After several days of fasting and meditation he began his journey back along the same path, starting at sunrise and again walking at variable speeds with many pauses along the way. His average speed descending was, of course, greater than his average climbing speed. Prove that there is a spot along the path that the monk will occupy on both trips at precisely the same time of day.
Here is my go at it:
  • Considering that the path is an ascending spiral [from bottom to top of the mountain], it is possible to graph its height vs length as an injective function. This means that any point of height is occupied by a spot in the path, representing a single determined distance point from one of the ends of the path;
  • The monk will have to go through every point of height (or distance) to reach the top. He won't jump. This also means that height or distance can also be graphed against time as an injective and continuous function. At a given point of time, the monk cannot occupy two different height or distance spots;
  • The following sketch is my representation of the monk's distance covered along the day (distance vs time). I've superposed the graphs for the climb and descent.
  • Knowing that both climb and descent started at the same time of day, no matter how crazy the curves are (representing rests and different instantaneous speeds along different bits of the path) they *have* to cross somewhere, because they are continuous. The crossing point of the lines represents a single spot where the monk was at exactly the same time of day on both journeys.

After having conceived this analytical solution to the problem, I've realised this other one:
  • Both journeys are made on different days. So an equivalent situation would be to have two monks at the same time starting their journeys in opposite directions - one climbing and the other descending. They will meet along the way.
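The analytical argument is just the intermediate value theorem in disguise, and it's easy to check numerically. Here's a small Python sketch - the two speed profiles are made up, they only have to be continuous, start at opposite ends of the path and cover all of it:

```python
import math

def climbing(t):
    """Fraction of the path covered while climbing, t in [0, 1] (sunrise to sunset).
    Monotonic but with varying speed: the derivative stays positive."""
    return t + 0.15 * math.sin(2 * math.pi * t)

def descending(t):
    """Fraction of the path (measured from the bottom) while descending.
    Faster on average: the monk reaches the bottom at t = 0.6."""
    return max(0.0, 1.0 - t / 0.6)

# climbing(0) - descending(0) = -1 < 0, and the difference is positive by t = 1,
# so the continuous curves must cross; bisect to find where.
lo, hi = 0.0, 1.0
for _ in range(60):
    mid = (lo + hi) / 2
    if climbing(mid) < descending(mid):
        lo = mid
    else:
        hi = mid
t_meet = (lo + hi) / 2
print(f"same spot at t = {t_meet:.3f}, fraction of path = {climbing(t_meet):.3f}")
```

Swap in any other pair of continuous profiles and the crossing still shows up, which is the whole point of the proof.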
And this was just a quick (but rare) peek inside the mind of an engineer.

Cheers, PJ.

01 June 2008

Gnuplot in action

For us mere humans, it is so much easier to understand technical data if it is presented in some graphical form - instead of tables filled with numbers. Coloured bars are easier to compare than two numbers side by side. In the same way, an analog voltmeter can show an abnormal condition more clearly, when the needle moves further than we're expecting. But I stray from the subject...

So I like statistics, and I like graphs, and I blog about it. And then I was pointed [by the author] to an upcoming book about gnuplot: "Gnuplot in action".

You see, 'gnuplot' is a wonderful tool that takes data and draws (or plots) graphs to some output format (like an X window or an image file).
It's an incredibly compact unix-style tool that takes a configuration script and a text table of data (usually numbers and column names) and plots the graph into an image of some kind. It offers thousands of options and configuration parameters to produce pretty much every graph you might imagine [and beyond]. From useless to essential, from the very simple to the mind-bogglingly complicated and elaborate. All for the sake of a better representation of "raw" data.

set term png
set output "sin_x.png"
plot sin(x)

The trivial script above makes gnuplot plot the graph shown in the image below:

Graphical plot of sin(x).
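Plotting a function is the hello-world case; gnuplot's bread and butter is plotting data files. Here is a sketch of that workflow - generate a whitespace-separated table, then point a script at it (the filenames and the damped-cosine data are just my example):

```python
import math

# Write a two-column, whitespace-separated data table: time vs amplitude
with open("damped.dat", "w") as f:
    f.write("# t amplitude\n")
    for i in range(201):
        t = i * 0.05
        f.write(f"{t:.2f} {math.exp(-0.3 * t) * math.cos(2 * t):.5f}\n")

# A matching gnuplot script, in the same style as the trivial one above
with open("damped.gp", "w") as f:
    f.write('set term png\n')
    f.write('set output "damped.png"\n')
    f.write('plot "damped.dat" using 1:2 with lines title "damped cosine"\n')

# then run: gnuplot damped.gp
```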

A slightly more elaborate script plots the almost art-work graph shown below:

Graphical plot of a transfer function. This is the result of a complex gnuplot script.

The truth is that although gnuplot does a superb job, it carries a pretty steep learning curve. Most of it I've learnt from examples, which obviously isn't enough.

And here is where "gnuplot in action" comes in. It's a book that tackles the depths of gnuplot, from the simple and straightforward all the way up to the very edge of its capabilities - the scripts of the gnuplot gods. It is also filled with reference material and examples.
The subject is introduced progressively. It says somewhere that the first few chapters alone cover pretty much every basic function of the program. The remaining chapters are the meat of the book, the extra mile that makes it interesting. It's a great read and a really nice way to explore gnuplot, graphs and statistics (for those that, like me, have the little statistics bug inside them).

I reckon that pretty much every unix-head sysadmin must have thought about using gnuplot, or at least uses a couple of tools that produce graphics with it.

The book isn't ready just yet, you'll have to wait a couple more weeks, but if you do any work that does or may involve gnuplot, go on and buy it. It's bloody brilliant.

Here is the link again:

Cheers, PJ.

31 March 2008

"Sustainable" web development

Hi people,

Seen my website lately? Last week? Last month? Last year? It hasn't changed much, has it... Actually, it has not changed a single byte since May last year (2007).

Yes, I've been having some trouble keeping my website updated. Surely it's a personal web page and there is no business need to have it updated with new material, etc. But there is a personal motivation in keeping it interesting for others - friends, family, colleagues and other internet users. It's dynamic and should reflect myself somehow.

There are two ways to set up and maintain a website these days: the full-blown CMS-managed web application, and the old-school terminal + vim + HTML method. Both have advantages and disadvantages. I reckon nobody uses the latter anymore... but me.

I grab my laptop, fire up my old vim editor, edit my images and write my own combination of HTML and PHP code to make it a little more dynamic and to make things a little... easier. Bits of a CMS, I guess. This is how I write and maintain my website. However, it is getting a little big for the odd spare hour per week. I keep adding or changing parts of it (in my offline copy) and end up never finishing any of my ideas for new material or other updates. This is the perfect way to keep my website statically un-updated, gathering dust on my server, upstairs.

Small, simple and regular evolutionary changes are what it takes to keep things evolving. Slowly, yes, but not still. Like the other guys say: "Release early, release often."

But even though I know what needs to be done, I keep implementing better and safer code to get things done better and faster. Like in a CMS. So nothing has shown up for the last 10 months, heh!

Let's hope something comes up within the next month or so. Hopefully. All I need is to finish this ultra-'leet image linking system :)

Cheers, PJ.

04 March 2008

Photographic kit upgrade process pt.I

Just like anyone else wanting a little more from photography (other than shooting birthday parties, family dinners and other obvious events where cameras are compulsory) I decided to move on to a better lens.

I wanted a faster lens with better sharpness and lower optical distortion. Of course this means a heavier, bigger, less flexible and particularly more expensive lens than the one I have been using.

So I took the scientific approach (also described in this blog) to analyse my current picture database, and from what I learned, last January I bought the Sigma 24-70mm EX f/2.8.

Here is a technical review of this lens at (which is one of the best lens review resources that I know of. Very technical and with a quantitative approach whenever possible):
I mean, my Sigma 18-125mm DC f/3.5-5.6 is very flexible, compact and light - but that comes at a cost - it compromises everything else. It features notorious barrel distortion at the shorter focal lengths and noticeable pincushion distortion at the longer ones, along with clear vignetting at the widest apertures (very pronounced at 18mm f/3.5). Also, it is very soft, slow to focus and sometimes not very accurate (although the camera may share some of the guilt).

After this paragraph of bashing the 18-125mm, I must give it credit for what it has been good for. I have had it for a little over two years and it was terrific for learning SLR techniques and gaining valuable experience. I would recommend it to anyone starting with a DSLR because of its flexibility and good price/features ratio.

Specs for both lenses can be seen in the Sigma pages below:
About the 24-70mm, I must say I was impressed by its size and weight at first - it felt and seemed massive!

Canon EOS 350D + Sigma EX 24-70mm f/2.8
Image used under permission of the authors @

When compared to the 18-125mm DC, it is 32% wider in diameter (82mm vs 62mm) and almost twice as heavy (715g vs 385g). But it's also much faster (f/2.8 available across the entire zoom range), miles sharper and produces much less distortion. Being a lens designed for full-frame digital cameras, it loses a bit of flexibility when used on cropped sensors (roughly equivalent to a 38.4-112mm on a sensor with a crop factor of 1.6x). I don't care. It's great, I love it to bits.
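The crop-factor arithmetic is just a multiplication, for anyone who wants to repeat it for another lens (a throwaway Python check):

```python
# Effective (full-frame equivalent) focal range on an APS-C body
crop_factor = 1.6           # Canon APS-C bodies (350D, 40D, ...)
short_mm, long_mm = 24, 70  # Sigma EX 24-70mm f/2.8

equivalent = (round(short_mm * crop_factor, 1), round(long_mm * crop_factor, 1))
print(equivalent)  # the 24-70 behaves like a 38.4-112mm on a 1.6x crop body
```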

The result... See my flickr page for archived photos taken on or after 15th January 2008 (you may start here: archive for 2008/01/15) or search by tag "Sigma EX 24-70mm f/2.8" (tag search: "Sigma EX 24-70mm f/2.8").

What's next on my photo kit upgrade process? It will take a while before I'll buy another lens, but having this superb 24-70, I'll probably go for a cheap 70-300mm (or similar) or an expensive fisheye or ultra-wide angle (like 10mm or so). Still don't know. Other than quality glass, I'll replace my camera as soon as I can afford it.

Cheers, PJ.

25 February 2008

Yesterday at 'Focus-on-Imaging' trade show

Hi everyone,

I was at the Focus-on-Imaging trade show yesterday. For those that don't know what it is: it's a commercial event where manufacturers and resellers of just about anything related to photography show what they have to offer to the public. This is the biggest event of its kind in the UK. Similarly important trade shows are:

  • PMA
  • PhotoKina
  • WPPI
  • CES
Some brands display their newest products for the first time at trade shows like these, right before releasing them to the public. However, no remarkable premieres were supposed to happen at Focus-on-Imaging, sadly.

Some very well known brands were represented, either by an exclusive stand or by a reseller's (in alphabetical order):
  • Adobe
  • Alpa
  • Bowens (studio lighting)
  • Canon
  • Datacolor (the Spyder guys)
  • Epson
  • Fujifilm
  • Hasselblad (medium format & digital back gurus)
  • HP
  • Lastolite
  • Leaf (digital back gurus)
  • Lexar (wicked ice statue for a stand advertising compact flash memory cards)
  • Kodak
  • Mamiya (medium format gurus)
  • Manfrotto (tripods)
  • Nikon
  • Olympus
  • Phase One (digital back gurus)
  • Sigma
Some of the bits that impressed me the most were (in no particular order):
  1. A wicked automated camera that takes 360-degree shots with very high definition, capable of handling a whopping 26 f-stops of dynamic range. The SpheroCam HDR seems a pretty complex setup, with a hefty tripod, a special camera and a computer for processing a proprietary-format VR image. Costs about £9000;
  2. The medium format kits with their digital backs and film magazines. The recent Hasselblads look very cool. I wasn't able to play with one, probably because I didn't ask;
  3. The small studio at the leaf stand with fixed lighting (at least 2 continuous flashes), a black background, a set of (at least) three cameras (medium format, leaf digital backs) and a pair of high-heel glossy red shoes (the object);
  4. The enormous Sigma 200-500mm f/2.8 lens;
  5. Playing with a Canon EOS 1Ds mkIII with a 400mm L IS lens;
  6. Playing with a Canon EOS 5D with a fisheye lens;
  7. Alpa cameras, which I don't understand, to be honest;
  8. The Canon stand was miles better than the Nikon one, contrary to what I expected;
  9. The Canon cashback deals, valid from the 24th to the 27th of February only;
I must say that the £6 entry fee is worth so much more if you bring your wallet packed - show discounts were really good.
As an example, Canon enabled its rebate system again just for the trade show. The cashback for the EOS 40D body was £100, on top of it already being sold at a very good price. So I could have bought a new 40D body with a new BG-E2 battery grip for £749 - £100 (rebate) = £649. That's over £130 cheaper than the best price I can find outside the trade show today.

I really enjoyed it - it was a great experience and a great way to see the pros and their equipment - most of which I'll never lay my hands on :)

For those that might still be going, it's open until the 27th of February. Tickets cost £6 for non professional visitors and the parking fee is £8.

Cheers, PJ.

28 January 2008

Tried a (new) VW Golf 1.6 FSI

Hi everyone,

Yes, besides being a geek, I am also a car nut. Sorry about that.

I had the chance to try a VW Golf 1.6 FSI (Mk V, 2007) over a weekend in the middle of December and thought I should share my opinion about it.

Picture of a 5 door VW Golf Mark V taken from wikipedia. Click on the image above for a larger version.

The Golf has been around for some decades now, improving with every iteration. It has always been something of a benchmark of build quality - at a price, of course - that other manufacturers try to match with their own equivalents.

Its build quality shows both inside and out. Things feel solid and well put together. The car is comfortable and has all the minimum gadgetry one might want (CD player, air conditioning, cruise control, on-board computer, ...). I am not fond of the blue dials, but that's a VW tradition...

The engine is a straight-four, 1.6 litre, with stratified direct injection (FSI). It produces 115 bhp and is coupled to a 6-speed manual gearbox. It is joyful to drive and powerful enough, though it seems a bit lazy under 3000 rpm at higher speeds. Not unexpected, but not an issue either.
Overall it was pretty fun to drive. It felt like it wanted to be abused a bit, but public roads don't let me enjoy my deserved fun!

Fuel consumption was higher than I expected. It took 30 litres of petrol to do about 240 miles on motorways (at or below 70mph). That works out to roughly 36 mpg (imperial), or about 8 l/100 km. Bad Golf, bad!
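For the record, here's the conversion arithmetic, sketched in Ruby since that's my scripting language of choice (note that imperial and US gallons differ, which is why mpg figures for the same trip can disagree by roughly 20%):

```ruby
LITRES_PER_IMP_GALLON = 4.54609
LITRES_PER_US_GALLON  = 3.78541
KM_PER_MILE           = 1.609344

litres = 30.0
miles  = 240.0

l_per_100km  = litres / (miles * KM_PER_MILE) * 100       # ~7.8 l/100 km
mpg_imperial = miles / (litres / LITRES_PER_IMP_GALLON)   # ~36.4 mpg (UK gallons)
mpg_us       = miles / (litres / LITRES_PER_US_GALLON)    # ~30.3 mpg (US gallons)
```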

Considering only reliability, build quality, driving feel and looks, would I have it? Yes, please!

Now, considering everything there is to consider about this Golf, would I buy it? Erm... no. It implies a considerably higher initial cost than its close competitors, and its fuel consumption is worse than what I would tolerate in a car with a commuter/family hatch soul.

Just to clarify, the fuel issue would not apply to the R32, for example. That is obviously allowed to drink whatever amounts of fuel it needs.

Cheers, PJ.

16 January 2008

Focal length histogram

A warning first: this is highly geeky stuff! Walk away if you don't have the stomach... or the patience.

I was having a hard time deciding which lens to buy next, so I decided to rationalise things:

  1. I need a better lens;
  2. A better lens also means a less flexible lens (less compromised);
  3. Being less flexible, my new lens should provide the focal range that I prefer;
Right, but what focal range is that? It's not something I can say off the top of my head, like "My preferred focal range is exactly between 35 and 39mm!"...

So, after picking up the idea from Tadeusz, I decided to do something similar.

Knowing that my photo database holds a little over 10000 shots, I wrote a small(ish) Ruby program that goes over every picture file, extracts from the EXIF metadata the focal length at which it was shot, and produces a nice histogram with the help of gnuplot. Hopefully, by analysing the peaks of the histogram, I would be able to work out my preferred focal length range.

I will release the script on my website (in the "Programming - Ruby" section) after I fix a couple of annoying bugs in it.
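The script isn't out yet, but the core of the idea can be sketched roughly like this (a minimal reconstruction of my own, not the actual script; the exiftool call mentioned in the comments is just one plausible way of pulling out the focal length):

```ruby
# Sketch: tally focal lengths and emit gnuplot-friendly data.
# In the real script each focal length comes from EXIF metadata,
# e.g. via something like: `exiftool -s3 -FocalLength #{file}`.to_f
# Here the extraction is stubbed out so the tallying stands on its own.

def focal_length_histogram(focal_lengths)
  # Round to the nearest millimetre and count occurrences.
  focal_lengths.each_with_object(Hash.new(0)) do |fl, counts|
    counts[fl.round] += 1
  end
end

def to_gnuplot_data(histogram)
  # One "focal_length count" pair per line, sorted by focal length,
  # ready for something like: plot 'focal.dat' with boxes
  histogram.sort.map { |fl, n| "#{fl} #{n}" }.join("\n")
end

sample = [18.0, 18.0, 24.0, 34.8, 35.2, 55.0, 55.0, 55.0]
puts to_gnuplot_data(focal_length_histogram(sample))
# prints: 18 2 / 24 1 / 35 2 / 55 3 (one pair per line)
```

The real thing just feeds the ~10000 EXIF-extracted values through the same tallying step and hands the resulting data file to gnuplot.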

The resulting graph, generated on 08 January 2008, looks as follows:

Focal length histogram of about 8000 pictures ranging between 18 and 125mm - Click on the screenshot to enlarge.

Roughly 8000 images were analysed, and I only included the focal length range of my lens in the histogram. All focal lengths were adjusted to the 1.6x crop factor, but since the 18-125mm already takes that into account, focal lengths don't shift place on the histogram. By discarding the extreme ends of the above focal range (18 and 125mm), I got the following histogram:

Refined focal length histogram by discarding the extreme ends of the focal range in question - Click on the screenshot to enlarge.
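As an aside, the 1.6x crop-factor adjustment is just the usual 35mm-equivalent multiplication - a quick illustration of my own, not part of the script:

```ruby
CROP_FACTOR = 1.6  # Canon APS-C sensors

# Focal length that frames the same on full frame (35mm) as the
# given focal length does on a 1.6x crop body.
def full_frame_equivalent(focal_length_mm)
  (focal_length_mm * CROP_FACTOR).round
end

full_frame_equivalent(18)   # 18mm on a crop body frames like ~29mm on full frame
full_frame_equivalent(125)  # ...and 125mm like ~200mm
```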

Some extra notes before extracting results from the graph:
  • Instead of a large number of histogram bars spread across the whole range of focal lengths, discrete spikes show up at about 17 different focal lengths. This suggests that either the lens sends the camera rough estimates of the focal length in use, or the camera itself aggressively rounds that information before writing the EXIF data;
  • The extreme ends of the assessed focal range were discarded because, when shooting at the minimum or maximum focal length of the lens, I would probably have preferred an even lower or higher focal length respectively, had one been available;
Clearly I am a wide-angle kind of photographer. The ranges I have used most often are between 18 and about 40mm, with a little spike around 55mm.
Now that I have a scientific answer to the question "What focal lengths do I use most often?", can I now decide which lens to buy? Well, I have decided, and I have already bought it. Oddly, it does not fit the peaks of the histogram as well as I would like, for a number of reasons:
  1. I still want some degree of flexibility - or else I would be buying one or more prime lenses;
  2. I want excellent optical quality for the lower focal lengths - fast, sharp and with low distortion;
  3. The Canon 24-70mm f/2.8 L is absolutely out of the question for financial reasons;
Stand by for the result.

Cheers, PJ.

07 January 2008

Software toolkit for photography post processing: from Linux to Mac OS X

I've been getting used to Mac OS X Tiger and I must repeat the obvious: It's bloody great!!

One of my current time sinks is the post-processing of digital photos, ranging from trivial sharpening to time-consuming panorama builds and HDR experiments. Of course, being a Linux geek, my toolkit is all Linux-based and reasonably optimised, with dozens of shell and Ruby scripts to automate the boring repetitive stuff. The main software components are currently exiftool, ufraw, digiKam, hugin, qtpfsgui and The Gimp.
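As a flavour of that glue, here's a hypothetical sketch (not one of my actual scripts): walking a photo directory and building the exiftool commands that would rename each file after its EXIF capture date. It only builds the command strings - a dry run - rather than executing them.

```ruby
# Hypothetical automation sketch: collect image files and build, per
# file, the exiftool command that renames it to its DateTimeOriginal.
# Returning strings instead of running them keeps this a dry run.
def rename_commands(dir)
  Dir.glob(File.join(dir, "*.{jpg,JPG,cr2,CR2}")).sort.map do |file|
    # -d gives the date format; "-FileName<DateTimeOriginal" renames
    # in place; %%-c adds a copy number on collisions and %%e keeps
    # the original extension.
    %(exiftool -d %Y%m%d_%H%M%S%%-c.%%e "-FileName<DateTimeOriginal" #{file})
  end
end
```

Feed the result to `system` (or just print it and eyeball it first) and the boring part is done.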

Now if only I could use all those tools on OS X...

  • exiftool - A tool for reading, writing and editing meta information of images and other types of media.
Easy and done. Binary versions of this command-line tool, ready to install, are available from the project's home page.
  • ufraw - A utility to read and manipulate raw images from digital cameras.
Easy and done. I compiled v0.13 from source with a basic set of dependencies supplied by fink. Pierre Andrews went through the trouble of creating a nice .app package himself and made it available on his site - thanks, Pierre. It runs on X11.
  • qtpfsgui - An open source graphical user interface application that aims to provide a workflow for HDR imaging.
Seems easy, but not quite done yet. The main website has a very nice .app binary build of version 1.8.12. The only thing missing is support for hugin's align_image_stack, but Mandus, together with Ippei Ukai, is working on getting it added. If I recall correctly, this is linked against Carbon or Cocoa, i.e. it does not run on X11.
  • digiKam - An advanced digital photo management application.
Or headache #1. The main problem with getting this to work on OS X is its close integration with the KDE desktop environment. It has tons of dependencies - libkipi, kipi-plugins, libkexiv2, libkdcraw, libgphoto2, liblcms, libpng, libtiff, libjpeg, ... - and it is damn hard to get everything working, and working together. Fink has one decent build of digiKam 0.9.3 which compiles and installs fine, but the final result could be better - it does not process all the EXIF/IPTC data that the native Linux version does, and the GPS locator function (a kipi plugin) does not seem to work. Both are functions that I use extensively.
How wonderful it would be to have digiKam properly packaged into one .app (just like qtpfsgui) and linked against qt4-mac instead of running on X11... I'll just keep working on getting it running - compiling dependencies, and dependencies of dependencies, until eventually I get it right.
  • hugin - A panorama photo stitcher.
Or headache #2. Again, tons of dependencies prevent me from cleanly building a snapshot of the SVN trunk. The build system even attempts to create a nice .app package, but then it does not work. Unlike digiKam, though, there seem to be alternatives: some good-hearted people are able to build these snapshots successfully and package them nicely into .app packages, cutting the number of external dependencies as much as possible (ideally, .app packages should have no dependencies other than OS X's system libraries).
Both Ippei Ukai and Harry van der Wolf have HuginOSX .app builds available on their websites; Harry's seem to be the most recently updated. These snapshots run natively on Cocoa, although some important dependencies must be interpreted by the Mono framework.
  • The Gimp - the GNU Image Manipulation Program. It is a freely distributed piece of software for such tasks as photo retouching, image composition and image authoring. Functionally analogous to Photoshop.
Easy and done. An .app binary build of version 2.4.0rc3 may be downloaded here. It runs on X11, though.

Once I get all of these working properly, I think I will be both happy and sad at the same time. Happy because I will have ported my post-processing photography toolkit to Mac OS X, which is great. But sad because, by then, I will have no good reason to go back to Linux...

Cheers, PJ.

My website is back online!

It's a shame that it is so out of date... but it has physically moved about 1000 miles recently and went offline for a few days for a number of reasons.

Finally it is running again on the "new" server.

All I have left to do is... update it, of course. I have lots and lots of stuff to put there, and I hope I'll start updating it again soon. Meanwhile, check it out here.

Cheers, PJ.