Long, long ago (OK, so it was 1993, but it makes for a better opener), I earned my no-code Tech ham license, and back then I knew quite a lot about antenna theory and was very interested in it. I would crack open the ARRL Handbook and just start reading wherever I landed (it's a shame I can't find that thing now). I built quite a few 2m and 70cm antennas, and had a blast doing it: J-poles, quads, et cetera.
Fast forward to today, when I've discovered that it's fun to tinker with homebrew Wi-Fi antennas. Unfortunately, my knowledge of the theory has gotten cloudy, which brings me to my question.
At 2.4 GHz, where wavelengths are so short, why do designs typically use 1/2- or 3/4-wavelength elements rather than full-wave or longer? It seems to me that a longer antenna would provide better reception. It may be intrinsic to the designs that I have been looking at, mainly stuff like this:
So, I'm basically thinking of building a simple omni antenna to improve on my Linksys' standard rubber ducks. But more importantly, I can't seem to find an answer to that simple question: why not use longer elements when microwave frequencies make them so short? Is it just in the name of uber-compactness?
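For context, here's a quick back-of-the-envelope sketch of the lengths I'm talking about (the channel-6 center frequency is just an illustrative choice, and this is free-space wavelength, ignoring any velocity factor):

```python
# Free-space wavelength and common element lengths for 2.4 GHz Wi-Fi.
C = 299_792_458  # speed of light, m/s

def wavelength_m(freq_hz):
    """Free-space wavelength in meters for a given frequency in Hz."""
    return C / freq_hz

f = 2.437e9  # Wi-Fi channel 6 center frequency, Hz (illustrative)
lam = wavelength_m(f)
for name, frac in [("full wave", 1.0), ("3/4 wave", 0.75),
                   ("1/2 wave", 0.5), ("1/4 wave", 0.25)]:
    print(f"{name:9s}: {lam * frac * 100:5.1f} cm")
```

That works out to roughly 12.3 cm for a full wave, so even a full-wave element is pocket-sized at these frequencies, which is what prompts the question.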