You can't! In the way I mean it, 14 bits of each primary color. Looks like I have to go into those "complications". Camera sensors are monochrome. They read light by placing little filters over each pixel, either red, green or blue. Each pixel then "borrows" the 2 colors it doesn't have. So if it's a red pixel, it takes the green and blue color information from neighboring pixels to create a full 24-bit color. (BTW, they don't work with RGB but YUV, oh this stuff is so f'ing torturous!) But, for explanation's sake...
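That "borrowing" step is called demosaicing. Here's a toy sketch of the idea, assuming the common RGGB Bayer layout and plain neighbor averaging (real cameras use much fancier interpolation, and as noted, often work in YUV downstream):

```python
def bayer_color(y, x):
    # RGGB pattern: which single color this sensor pixel actually measured
    return [["R", "G"], ["G", "B"]][y % 2][x % 2]

def demosaic_pixel(raw, y, x):
    """Recover a full RGB triple at an interior pixel by averaging neighbors."""
    def avg(coords):
        vals = [raw[j][i] for j, i in coords]
        return sum(vals) / len(vals)

    edge = [(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)]
    diag = [(y - 1, x - 1), (y - 1, x + 1), (y + 1, x - 1), (y + 1, x + 1)]
    c = bayer_color(y, x)
    if c == "R":   # red measured; borrow green from edges, blue from diagonals
        return (raw[y][x], avg(edge), avg(diag))
    if c == "B":   # blue measured; borrow red from diagonals, green from edges
        return (avg(diag), avg(edge), raw[y][x])
    # Green pixel: red and blue come from opposite axes depending on the row
    horiz = [(y, x - 1), (y, x + 1)]
    vert = [(y - 1, x), (y + 1, x)]
    if y % 2 == 0:  # green on a red row: red left/right, blue above/below
        return (avg(horiz), raw[y][x], avg(vert))
    return (avg(vert), raw[y][x], avg(horiz))
```

Feed it a uniform scene and every pixel comes back with the same full color, even though each sensor site only recorded one channel.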
Let's say we're in a perfect world. You have 3 color values, red, green and blue, each from 1 to 16,384 (that's 14 bits). That means, from those, you can create a full color at 16k x 16k x 16k depth, about 4.4 trillion combinations! You can't discern 4 trillion colors. So now you have more color information than you can physically see. In the end, we always need to reduce to 16 million.
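The arithmetic behind those two numbers, if you want to check them:

```python
# 14 bits per channel: 2**14 = 16,384 levels of red, green and blue
levels_14 = 2 ** 14
colors_14 = levels_14 ** 3          # every R,G,B combination
print(f"{colors_14:,}")             # about 4.4 trillion

# 8 bits per channel: 2**8 = 256 levels, the familiar 24-bit color
levels_8 = 2 ** 8
colors_8 = levels_8 ** 3
print(f"{colors_8:,}")              # about 16.8 million
```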
Here's the rub. You can't see 4 trillion colors. The camera can record the 16 million you can see in 8-bit video. So what's the problem? The camera may not choose the 16 million color values you would choose from a palette of 4 trillion colors. As the article shows, it is never smart enough to do that.
RAW allows you to SELECT which colors get scaled down into your 16-million-color painting. As Andrew said, do you want to start with 4 shades of pink, or 255? It's all about CHOICE in what you want your final 8-bit-per-channel image to be.
Are we getting there?