Multimedia
Sep 26, 01:44 PM
Well, I might be getting a Mac Pro soon (not sure yet).
but if I do, my question is: when will we see an 8-core Mac Pro?

My GUESS is probably November or December at the latest. It will probably simply be a Dual Clovertown processor option added to the current BTO page with a new processor pricing lineup. It will probably be a silent upgrade, announced with just a press release.
bugfaceuk
Apr 9, 09:41 AM
If Nintendo doesn't adapt, it could be big trouble for them. I've seen the 3DS (http://photics.com/nintendo-3ds-a-surprising-disappointment) and I'm not impressed. I think the iPhone 4 is a much better portable gaming machine.
I've just read the linked article... cannot stop laughing at
"Closing one of my eyes would also cancel the [3D] effect"
You know how stereoscopic vision works, right?
Multimedia
Oct 8, 10:30 AM
I meant quad-core package (socket) - be it Clovertown/Woodcrest or Kentsfield/Conroe.
On a multi-threaded workflow, twice as many somewhat slower threads are better than half as many somewhat faster threads.
Of course, many desktop applications can't use four cores (or 8), and many feel "snappier" with fewer, faster cores.
_______________
In one demo at IDF, Intel showed a dual Woodie against the top Opteron.
The Woodie was about 60% faster, using 80% of the power.
On stage, they swapped the Woodies with low-voltage Clovertowns which matched the power envelope of the Woodies that they removed. I think they said that the Clovertowns were 800 MHz slower than the Woodies.
With the Clovertowns, the system was 20% faster than the Woodies (even at 800 MHz slower per core), at almost exactly the same wattage (1 or 2 watts more). This made it 95% faster than the Opterons, still at 80% of the power draw.
You can see the demo at http://www.intel.com/idf/us/fall2006/webcast.htm - look for Gelsinger's keynote the second day.

I thought so. This is the first time I have seen the term "Multi-Threaded Workflow" and I thank you for that. In the Gelsinger Keynote he calls it "Multi-Threaded Workloads".
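A quick back-of-envelope check on those demo numbers - a minimal C sketch, assuming the two stage speedups simply compound:

#include <stdio.h>

int main(void)
{
    double woodie_vs_opteron = 1.60;  /* Woodcrest: ~60% faster than the Opteron */
    double clover_vs_woodie  = 1.20;  /* Clovertown: ~20% faster than the Woodcrest */

    /* Compounded: 1.60 * 1.20 = 1.92, i.e. roughly 92% faster than the
       Opteron - close to the ~95% quoted from the stage demo. */
    printf("Clovertown vs Opteron: ~%.0f%% faster\n",
           (woodie_vs_opteron * clover_vs_woodie - 1.0) * 100.0);
    return 0;
}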
I'm glad to see you confirm my suspicion that the 2.33GHz Dual Clovertown Mac Pro will in fact be faster than the 2.66 or 3GHz Dual Woodie for someone who knows how to work them simultaneously with a set of applications that can use all those cores a lot of the time. Very exciting.
Also thanks for the link to all those sessions from the IDF. Fantastic to be able to "attend" all of them. I'm stoked and looking forward to watching them ALL. I love all the new Intel self-promotional videos. Intel is happening and hip!
And there's no premium for that "ninth" processor when you buy a 2.66GHz Dual Clovertown after all, bringing the total cost to $3,699 plus RAM. So now I hope there will be TWO new lines in the Processor section of the Customize Your Mac page of the online Store:
Two 2.33GHz Quad-Core Intel Xeon [Add $800]
Two 2.66GHz Quad-Core Intel Xeon [Add $1200]
I now think I will opt for the 2.66GHz 8-core for $3,699 if Apple will offer it for sale.
The first eight come to a little over $400 each. With the 2.66GHz you get 2.64GHz more total clock speed, so it's like getting a ninth processor for +$400 - in other words, for no premium. Maybe Apple will only offer the 2.66GHz Clovertown so as not to confuse buyers.
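The arithmetic behind that, as a tiny C sketch (the $800/$1,200 upgrade prices are the hoped-for BTO numbers above, not anything Apple has announced):

#include <stdio.h>

int main(void)
{
    int    cores       = 8;
    double ghz_233     = 2.33, ghz_266 = 2.66;        /* per-core clock speeds */
    double upgrade_233 = 800.0, upgrade_266 = 1200.0; /* hoped-for BTO prices */

    /* 0.33 GHz x 8 cores = 2.64 GHz of extra aggregate clock - roughly one
       more 2.66GHz core's worth - for an extra $400. */
    printf("extra aggregate clock: %.2f GHz\n", (ghz_266 - ghz_233) * cores);
    printf("extra cost: $%.0f\n", upgrade_266 - upgrade_233);
    return 0;
}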
Wonder if the 2.66GHz Clovertown introduces heat issues under the hood.
Piggie
Apr 28, 04:51 PM
I ran a dialup BBS from 1983-1992 and we had p0rn, FidoNet Email, discussion forums, software downloads, etc....
The Internet made stuff faster, more graphical, and brought stuff to a wider audience - but for us early birds, everything has always kinda been there.
I used a few Bulletin boards on old 300 baud modems, and also Prestel in the UK at 1200/75 speeds.
Don't know how many here are old enough and UK enough to remember using Prestel.
http://en.wikipedia.org/wiki/Prestel
JackAxe
Apr 8, 10:58 PM
I hope they poach someone that likes BUTTONS.
Sydde
Apr 27, 10:39 AM
The Jesus toast. Verified to look like Jesus or Jeff Daniels.
No, no, I know who that is! He wrote lots of scripture (unlike Jesus):
Oh the day divides the night
Night divides the day
Try to run
Try to hide
Break on through to
The other side
And the verse that everyone would do well to heed,
Show me the way to the next whiskey bar
AidenShaw
Oct 29, 12:33 PM
For example:
Thread_ID tid[MAX_CPU];
for (i = 0; i < System.CPU_count(); i++)
{
    tid[i] = Thread.create(worker);   // spawn one worker thread per core
}
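The snippet above is pseudocode; a minimal runnable equivalent using POSIX threads might look like this (the trivial worker function and the per-thread argument are placeholders, not part of the original example):

#include <pthread.h>
#include <stdio.h>
#include <unistd.h>

/* Placeholder worker: each thread just reports its slot and exits. */
static void *worker(void *arg)
{
    printf("worker %ld running\n", (long)arg);
    return NULL;
}

int main(void)
{
    long ncpu = sysconf(_SC_NPROCESSORS_ONLN);   /* number of online CPUs */
    pthread_t tid[ncpu];                         /* one thread ID per core */

    for (long i = 0; i < ncpu; i++)
        pthread_create(&tid[i], NULL, worker, (void *)i);
    for (long i = 0; i < ncpu; i++)
        pthread_join(tid[i], NULL);              /* wait for all workers */
    return 0;
}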
Popeye206
Apr 21, 09:03 AM
So are you going to tell me that paying for tethering ON TOP OF DATA YOU ALREADY PAID FOR is fair? Data is data is data... 4GB is 4GB no matter how I use it. Tethering costs are a joke! :mad: /end rant
You are joking, right?
Fair or not, it's not Apple's fault. It's the carriers who have imposed this structure, and it's probably fair - they do have to be able to support the extra data traffic if tethering were just open to anyone without paying. Personally, I think it's a waste anyway. At home it's WiFi... on the road it's my iPhone, or I find WiFi if I need it for my laptop, which is not hard to do.
Anyway... like it or not, it's not a free service today. Is it fair? I don't think so either, and I think in the long run phone companies will bundle it in with the data packages, as well as allow multiple devices on the same plan so your smartphone and tablet can share one data plan.
But for now... it is what it is and if you're not paying for it, well, what can I say... good for you.
Tulse
Mar 19, 01:09 PM
I find it rather surprising how blindly people here defend Apple, even after seeing how they remove your rights little by little. How many times can you burn your iTunes songs to CD? It used to be ten times. But Apple reduced it to seven.
As I recall, the limit is on how many times you can burn a specific playlist. You can burn a song an unlimited number of times. This is a big difference.
manu chao said: If you go to a concert, theatre play, any kind of performance, or any kind of fee-charging class or course, and smuggle yourself in through some kind of backdoor without paying for the ticket or the course, did you steal anything?
This is an excellent analogy, manu chao. Everybody knows that it is wrong to sneak into a movie theatre, but for some reason people think it is OK to copy music illegally. It is just bizarre.
It seems to me that the issue is pretty darned simple -- as a potential user of iTMS you know what the rules are. If you don't want to abide by the rules, don't use the service. Any talk of "it's actually helping Apple" or "it's my music to do with as I want" is just self-justifying bull. If you don't like the rules, don't play. It's really that simple.
ezekielrage_99
Aug 30, 07:27 AM
Is 99 for your year of birth? It's not like there's ten of them. You've probably had too many nightmares about Woodstock.
Which Woodstock are we talking about? I hope the new one in the '90s - that one was sweet.
Hastings101
Apr 5, 08:36 PM
Are you guys sure that switching is really "worth it"? (serious question)
I don't think it's really worth it. Windows 7 and Snow Leopard are so close together in quality that OS X is no longer obviously the better operating system (in my opinion of course). It's also a pain to have to replace your entire collection of Windows applications with Mac versions or Mac alternatives.
The only reason I still use OS X is because I like the look of it, I like that there are (at the moment) fewer viruses/trojans/whatevers, and I have way too many Mac-only applications that I depend on.
MacBoobsPro
Oct 26, 10:36 AM
16 cores in 2007
32 cores in 2008
64 cores in 2009
128 cores in 2010
You want to wait 'til 2010 at the soonest? :rolleyes:
4 years. Can't wait. My emailing exploits will just zip along.
How many chips would it span though?
Applespider
Mar 20, 06:27 PM
I switch all the time on this issue. For the most part, DRM doesn't get in the way of anything I do so I think 'what the hey!'
Then I envision wanting to make a silly video and using some music with it (which I could do if I'd burned it off a CD) and not being allowed to with the iTMS stuff. And yes, I know that the CD way is illegal too, but until the RIAA makes a very easy way for Joe Public to pay a nominal amount for a very limited-distribution, not-for-sale video, people are going to do it illegally.
mi5moav
Sep 20, 08:29 AM
I have a feeling that Apple and Disney are going to partner up on this iTV and somehow integrate MovieBeam into it. I am sure there are already plans in the works. Disney has cut the price on this great technology, and this is one piece of technology I wouldn't give up. So much better than running to the store, and the definition of the movies is great. For 52 bucks you get your own video store. Decent prices on rentals. A lot better than $299 - no way will I get iTV at $299, but for $199 with MovieBeam built in, it's possible.
Marx55
Oct 26, 03:09 AM
ONE THING IS CLEAR:
Multitasking, multiprocessor, multithreading Mac OS X and applications are needed right now and will be much needed in the future.
Because microprocessors will evolve not with more MHz, but basically with more cores and more microprocessors per Mac.
And the same on Linux and Windows. So, hopefully, default true multithreading is around the corner. Or else all this power will be wasted for most applications.
JUST IMAGINE A COMPUTER IN WHICH EACH PIXEL IS CONTROLLED BY A SINGLE PROCESSOR.
joncdixon
May 6, 01:59 AM
I have been with Sprint, T-Mobile, and now AT&T. I moved to AT&T the day after the release of the 2G iPhone.
How can this story be marked as new?!?
For the past 3 years I have told it like it is....
The iPhone is the best device on the planet: on the worst possible network!
With huge profit sharing, I feel Apple will never leave AT&T.
I will continue to use my 3G until the day they release version 4.
eawmp1
Apr 22, 08:33 PM
Why?
Look up Pascal's wager
Not a fan of Pascal's assumption of Christianity as the basis for his theorem.
mahonmeister
Oct 25, 10:51 PM
I just got my Mac Pro a month and a half ago.
And you shall continue to enjoy it. Like Arn has stated, this likely isn't replacing any current configurations, just adding to them.
This seems really exciting. All these cores are gonna pump out some serious power. Now if they could just mash together that processor that IBM made at like 50GHz (I think they cooled it with dry ice or something) with a multi core processor they'd have something! Bring it.
goobot
May 5, 11:50 AM
I blame the iPhone. It's a hog and kills AT&T's network. If it were a different phone this wouldn't be happening. Apple needs to make it work with the network better.
Multimedia
Oct 12, 11:08 AM
You're welcome. You take the plunge? I'm torn between the 30" or two 24" monitors. I'm thinking I may buy one 24" now, then pick up another monitor on Black Friday--hopefully after I've purchased a new Mac Pro.

You think Dell will sell them for even less on Black Friday? - November 24, for those of you unfamiliar with the term.
Yeah I hope Apple decides to pull the trigger on the Clovertown option as soon as they can get product. That would be so cool if they do.
Definitely not 2 x 24" I think. I have the 24" and it's native HD which I use all the time as my primary television set with EyeTV2 and an EyeTV 500 - now EyeTV Hybrid. But I think the idea of having the 30" with the 24" is kind of ultimate - in a stock video card withtout a fan kind of way.
I almost did buy a second 24" in August. Then I decided that I would wait, pay down my Dell credit some more, then get the 30" which is what I'm doing now thanks in part to your coupon deal although I was prepared to pay $1444. The coupon saved me another $104 including tax. Very excited. Should arrive next week sometime.
Another reason to have a 30" is to wed it to a MacBook Pro as the ultimate mobile home base second screen. But I think the next screen I buy will be another 24 for my Mac Pro 8-core so I can leave this one on the Quad G5 that I am not selling. I think the combo of the 24" + 20" is the best budget way to go - total just over $1k. But if you're going to spend more it might as well be a little over $2k for the 24" + 30" I think.
So I'm going to wind up with:
24" + 20" on both the 2GHz Dual Core (got at Fry's for $864.26 in August) and Quad G5s
24" + 30" on the 8-Core Mac Pro.
I like the idea of having a 24" on everything because it is capable of displaying HD in its native resolution - no bigger, no smaller.
But if Dell starts selling the 30" for $999 then all bets are off. :D
Having never spent any length of time with a 30", it is probably too soon to tell how much I will want two. My hunch is: a lot. :p
2ndname
Apr 10, 10:16 PM
I like to run both. I love Apple due to their simplicity. IMO it just focuses on working. You want to edit photos, get an Apple that is powerful enough to do that and it just works. You want to just surf the web, get an entry-level Apple based on your preference (iPhone/iPad/notebooks/etc.) and it just works. I'm actually waiting on the iMac refresh to set my fiancée up with her first Apple desktop. I got her an iPad 2 and she loves it. For her, it's great to have a product that will work based on what she needs it for (movie watching, surfing the web, editing photos). The fact that it looks clean and modern is a plus.
As for myself, I work in the IT field and our shop runs Windows. I love building my own rig every year and it keeps me current with the ever evolving computer technology. I'm glad that we have options.
mpstrex
Aug 29, 04:09 PM
Actually, he's on the Al Gore movement. ;)
NO! Al Gore is in it for himself? I thought he was a selfless guy, out for the environment. I mean, his movie DID make over $20 million and the budget was REAL low, and the majority of the crew worked on it for free...
http://boxofficemojo.com/movies/?id=inconvenienttruth.htm
roadbloc
Apr 9, 06:15 PM
It's all about the platform.
Not the games then? I guess that is why the Pippin was such a tremendous success. Fewer than 80 games, but a great bit of hardware inside the box. Everyone wanted one. :rolleyes:
wdogmedia
Aug 29, 02:26 PM
I didn't know we had a climate scientist in this forum, let alone one of the tiny percentage of scientists who dispute that human activity is a large factor in current climate change. Please enlighten us... that is, unless you're just some guy with an uneducated opinion. By all means, tell us why you know so much more about this well-studied topic than the hundreds of thousands of climate researchers around the world who've reached an almost unprecedented consensus regarding the role of human activity, and CO2 production, in climate change.
30 years ago climate scientists warned us to expect an imminent ice age....it even made the cover of Time, if I'm not mistaken.
I noticed that you didn't dispute the fact that the dominant greenhouse gas is water vapor. This is not a disputable fact; no climate scientist will argue with you there. Global warming is also not a disputable fact; it is well-documented and has been occurring since records were first kept. However, saying that scientists have reached an "unprecedented consensus" is absolutely false; and would that even matter? How often do you read a story on CNN or MSNBC that begins with the phrase "Scientists NOW think...." Science is in its very nature an evolutionary process, and findings change over time. Who remembers when nine out of ten doctors smoked Camels more than any other cigarette?
I'm ranting now, sorry. The point is that I've never heard a satisfactory answer as to why water vapor isn't taken into account when discussing global warming, when it is undeniably the largest factor in the greenhouse effect. But according to the Department of Energy and the EPA, CO2 is the dominant greenhouse gas, accounting for over 99% of the greenhouse effect... aside from water vapor. This certainly makes CO2 the most significant non-water contributor to global warming... but even then, climate scientists will not argue with you if you point out that nature produces three times the CO2 that humans do.
Forty years ago, cars released nearly 100 times more CO2 than they do today, industry polluted the atmosphere while going completely unchecked, and deforestation went untamed. Thanks to grassroots movements in the 60s and 70s (and yes, Greenpeace), worldwide pollution has been cut dramatically, and CO2 pollution has been cut even more thanks to the Kyoto Protocol. But global warming continues, despite humanity's dramatically decreased pollution of the atmosphere.
No climate scientist will argue the fact that global climate change has, in the past, universally been the result of cyclical variances in Earth's orbit/rotation, and to a lesser degree variances in our Sun's output. Why then, since pollution has been reduced dramatically, and since climate change is known to be caused by factors outside of our control, is it so crazy to believe that we're not at fault anymore?
And since when does being in a "tiny percentage" denote right/wrong? Aren't you a Mac zealot? :)