the subs should do better if the linking is done right, plus since they are individual domains, you should get credit for backlinks to your own stuff, maybe not with Google but Yahoo and Bing, example
a link on the home page of keyword.com to sub1.keyword.com showing sub1 + keyword in the title and text is a very strong exact match link
in your example, diet.cakerecipes.com compared to cakerecipes.com/diet
the sub gets all the exact match value, whereas the directory would have to be cakerecipes.com/dietcakerecipes to match the phrase
so the SEs' algorithm will rate the sub as a better value in exact URL matches: the keyword is a level 3 match, the sub is a level 4 match, and the directories are irrelevant
www. is level 1
.com/net/org/etc is level 2
keyword (.tld) is level 3
sub.(keyword.tld) is level 4
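The level scheme above can be sketched as a quick script. To be clear, this is just an illustration of the scheme as described in this post, not any published search-engine algorithm:

```python
# Illustrative only: a sketch of the post's level scheme,
# not any published search-engine algorithm.

def match_level(host: str) -> int:
    """Return the 'level' of a hostname per the scheme above:
    2 = bare tld, 3 = keyword.tld, 4 = sub.keyword.tld."""
    host = host.lower()
    if host.startswith("www."):     # www. is level 1 boilerplate, strip it
        host = host[len("www."):]
    parts = host.split(".")
    if len(parts) >= 3:
        return 4                    # sub.keyword.tld
    if len(parts) == 2:
        return 3                    # keyword.tld
    return 2                        # just a tld

print(match_level("cakerecipes.com"))        # 3
print(match_level("diet.cakerecipes.com"))   # 4
```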
so 100% the sub has more value
now the highest value would be to have subkeyword.com without the sub actually being a sub
so an exact match at level 3 is almost a direct page 1 SERP unless you kill the site with something the SEs ignore (like Frank Schilling does)
so if you are trying to figure out what the SEs value more
100%
subkeyword.com top value
sub.keyword.com next in line
keyword.com/sub last in line
first example is exact match level 3
2nd is exact match combo level 3 and level 4
3rd is irrelevant
now you hear all this "well, the subs are not giving credit to the main keyword.com"
so
you have 100 subs, all with anchor links to the keyword.com
if they're really treated as 100 different sites, then right there is a major SEO pop, since you have 100 different domains (all subs) linking back to keyword.com
so without one link from an external site you have 100 backlinks
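The arithmetic above can be sketched: assuming a backlink counter keys on the full source hostname (an assumption here; whether real SEs credit subs this way is exactly the debate), 100 subs register as 100 distinct linking hosts:

```python
# Sketch: a backlink counter that keys on the full source hostname
# sees each sub as its own linking domain. Whether real SEs credit
# subs this way is the open question; this just shows the arithmetic.

def distinct_linking_hosts(links, target):
    """links: iterable of (source_host, target_host) pairs."""
    return len({src for src, dst in links if dst == target})

links = [(f"sub{i}.keyword.com", "keyword.com") for i in range(100)]
print(distinct_linking_hosts(links, "keyword.com"))  # 100, zero external links
```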
now some say subs over X amount get ignored
not true, different content will be indexed
you can search google for domains with many subs
when you see pages of indexed subs, the claim that subs over X get ignored is shown to be wrong
I'd get as many matching sub.keywords as I could to have mates
sub.keyword.com and subkeyword.com/net/org/few others
so then I would have a daisy chain of sites, all interrelated
get a class C (a /24 block of IPs) and a server and you have 100 or so class C pops all hitting the anchor
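For what "class C pops" means here: two links count as coming from different class C's when the source IPs differ in the first three octets (a /24). A minimal check, purely illustrative:

```python
# Sketch: two IPv4 addresses sit in the same "class C" (a /24 block)
# when their first three octets match. Links from IPs in different
# /24s are the "class C pops" mentioned above.

def same_class_c(ip_a: str, ip_b: str) -> bool:
    return ip_a.split(".")[:3] == ip_b.split(".")[:3]

print(same_class_c("192.0.2.10", "192.0.2.200"))   # True: same /24
print(same_class_c("192.0.2.10", "198.51.100.7"))  # False: different /24s
```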
oh, use whois privacy to stealth the new Google whois sniffer
Google isn't really that 'advanced' if you understand how it rates stuff and what they're trying to patent (whois sniffers and other stuff)
so a nice keyword like cakerecipes.com with lots of subs, mated to sister .com's and .net's, could rule that keyword and all the categories
this is basically why a name like insurance.com sold for so much
keyword.com is a natural for subbing and mating with catkeywords.com
category subs with mates in .com and other high value tld's like net/org
will make the algorithm at Google hum:
"wow, look at all these exact phrase .com's and .net's linking to the subs on the root keyword of that industry"
exact match .com's are GOLD
subs have more value than directories in the SEs' algorithm, and as for all the jive about "well, it's not on the main site": so what, it's another site with backlinks to the anchor site
is that a bad thing?
I don't think so
and a directory is not a level 4 value asset
so keyword.com is the highest value, and catkeyword.com or subkeyword.com is a better value in the algorithm than the sub or the directory
so
level 3 exact matches is top value
subs next
directories last
now, unless someone can show me info about a directory being more valuable than a level 4 sub, with examples to PROVE IT, then as far as I am concerned a level 4 sub will always rate higher with an SE for name value than any directory
very few people are even aware that's how the whole net map is valued
www.=level 1
.tld = level 2
keyword.tld = level 3
sub.keyword.tld = level 4
but that's the exact algorithm all the SEs use
it's why exact match URLs can pop the SEs so easily
unless you kill them with stuff they can't read, like Schilling does