FoxyKaye
Feb 22, 06:04 PM
And the general consumers don't really care when some sweaty geek foams at the mouth how much he hates Flash. They just want to be able to see all of the web, in its full Flash glory.
For better and for worse.
I happen to be one of those geeks foaming at the mouth about Flash, and in general, I think the reason Adobe was so upset by Jobs' recent comments (that they're lazy and all their products are bloated and inefficient) is that they hit too close to home.
But you're also very right - the general consumer doesn't care about these points. On some level everyone "knows" that the Web "requires" Flash, and without it they're not getting the full "experience." It's an easy hit for a competitor's marketing department to play up the full Flash experience on devices that support it in comparison to the iPhone and iPad. Jobs can scream all he wants about HTML5 on the horizon; however, this isn't going to be fully realized for some time. Likewise, too many sites rely too heavily on Flash content for its absence not to be felt.
I think not supporting Flash is a mistake, despite its technical flaws. Maybe this is all just a play by Apple to get Adobe to make some real and necessary improvements to Flash in the first place - especially in how it taxes processor cycles and affects battery life on OS X (and presumably the iPhone OS as well). It wouldn't surprise me at all to see some magical "reconciliation" between Apple and Adobe on this point sometime this year as the iPad hits the consumer market.
Chupa Chupa
Apr 28, 07:39 AM
No surprise - the iPad is just a fad and people are starting to realize how limited it is. It's frustrating on a lot of cool websites, and the lack of a file system makes it very limited.
You apparently missed the part of the report that says:
A combination of strong Q4 sales and the announcement of the iPad 2's launch across major markets at the end of March contributed to Apple's iPad shipments being down 31% sequentially. The full impact of the iPad 2 launch will not register until subsequent quarters, as Apple gets the product into the hands of consumers.
Interpretation in plain English:
Two major factors contributing to the sequential decline of iPad sales this quarter:
1) A lot of consumers received an iPad 1 as a holiday gift and didn't need a 2, and
2) Apple's larger multi-country launch caused inventory constraints, and Apple was unable to sell more because they didn't have any excess to sell; i.e., it's a really popular device and we anticipate that being reflected in next quarter's sales report.
Lord Blackadder
Mar 14, 03:11 PM
Then, "burn cleanly" is a dubious concept. Even if you can clean it up, how much does that cost, how much energy does it take to clean it up, and how much do you lose from the coal's potential energy? Industry touts clean coal, others claim the very concept is a myth; I am not sure who is closer to the practical reality of the situation.
"Clean coal" is 100% myth, marketing-speak invented by coal companies to fool people. At best, we can have "less dirty coal". Scrubbers, filters, and other "clean coal" technology reduce pollution but also efficiency, so the cost of the equipment is not the only tradeoff. The only truly "clean coal" is the stuff you don't burn.
With that being said, it is incumbent on us to use the lowest-polluting process for burning coal that is practicable, so "clean coal" technology is important in that sense. But the notion that we can somehow burn coal "cleanly" is false.
skunk
Apr 24, 07:20 PM
Those verses you quoted are, as I said, historical. They purport to be the historical record of the exhortations to kill from El himself, much as the Quran does.
fxtech
Apr 28, 08:16 AM
Next year you will see iPhones and iPods counted too. I mean you need to do all you can to make it look good to shareholders.
Why not? After all, isn't an iPod Touch just a small iPad?
Edge100
Apr 15, 12:23 PM
Those priests obviously weren't celibate, then.
Yes, it really does suck that there are bad people everywhere.
No, they weren't celibate. They're criminals. So is your pope.
So yes, it sucks that there are bad people everywhere. It also sucks that the Catholic church ordained so many child rapists, and then sheltered the child rapists from criminal prosecution by covering up child rape, and then relocated the child rapists so that more children could be raped.
Nermal
Mar 18, 03:23 PM
Does anyone know how to use the app? The readme file is empty :confused:
Eidorian
Jul 14, 02:09 PM
Dual optical drive slots are a must. I love my Mirrored Drive Door at work for this fact.
fivepoint
Mar 16, 01:32 PM
That chart isn't going to fool anyone with a brain. All it shows is what is currently implemented. It says nothing about the potential contributions of all sources, how much they cost per watt, how much pollution they produce or whether or not they are renewable. It's a colorful red herring and you know it.
For one thing, there's no need for you to try to be a shill for the nuclear, oil, gas and coal industry - they already have well-financed lobbying operations and huge political influence. They'll get on fine without your "help". For another, it goes without saying that fossil fuels and nuclear are going to be used until they are gone. The energy demands are too great to do otherwise.
But they are called "non-renewable" energy sources for a reason, and they all pose major pollution problems that we are still struggling with. There is absolutely no good reason not to aggressively pursue the development and adoption of renewable energy sources as soon as is practical. Some day they will produce the bulk of the world's energy out of necessity if nothing else.
So in other words, without non-renewable energy, human civilization falls? That's a ridiculous stance.
The things we hope are reality and the things that actually are reality often greatly differ. People sing the praises of wind and solar, but the honest-to-God truth is that they can't compete. Not even close. It takes THOUSANDS of giant windmills to produce what one tiny nuclear power plant can. Can we put those in your back yard? Or how about off of your state's coast? How about solar... how long exactly does it take for a solar cell to pay for itself? The chart shows that despite heavy federal subsidies, such alternatives are STILL wholly incapable of doing the job we'd need them to do without nuclear, coal, oil, natural gas, etc. The ONLY one that has proven its worth is hydro. And that was created out of pure invention, not a government subsidy.
Let the free market determine which technologies win. Stop wasting our money on advancing idiotic technologies which haven't been able to prove themselves after 20+ years of subsidies. If there's wealth to be earned by developing such a technology, it will be developed.
Oh come on! You know what the answer to that will be. Panic wins every time as it makes better TV. :rolleyes:
Potassium Iodide tablets (retail $10 bottle) going for $500 on eBay. People are so stupid sometimes...
Yes, people have much potential for stupidity. They also have much potential to accomplish great things. Even (especially) without government holding their hands.
How's that going to work? People have to be fed too...
You're operating under a few false assumptions. First, biofuels do not have to compete with food at all. Switchgrass, moss, algae digesters, etc... it's a quickly evolving world. Second, a great deal of our food price is wrapped up in the transportation of said food. Third, using corn for fuel doesn't mean people go hungry; it only means that the price of corn goes up. Consequently, prices of other goods might go up or down. What we probably agree on is that ethanol, etc. should not be subsidized.
dmelgar
Sep 12, 07:31 PM
Sounded like a downer to me. I haven't seen the presentation, so maybe it's better than the story sounds.
- Whatever happened to a Tivo killer? No TV? No DVR?
- Sounds like this doesn't have a hard drive; it's supposed to display on a TV a video bitstream received via a network connection. There are already many devices out there that do this, starting at $99. What makes this any better? The big problem with those so far is that you need an excellent 802.11g connection or you get dropouts when playing a DVD. Ethernet is the only thing that makes it reliable.
- 1Q2007? Since when does Apple pre-announce? They've been working on this for over a year and 1Q2007 is the best they can do? I wonder what the holdup is. Missing the Christmas shopping season? Horrors!
- Movies on iTunes. What DRM is associated with the movies? Can you burn a movie to a DVD to play in a DVD player? How do the prices compare to buying a DVD? If it's a similar price, I get much more on a DVD, i.e., special features, and I can play it anywhere.
- No rental? Why not? I'm much more likely to rent a movie than buy one. I'm more likely to value the convenience of renting quickly online vs. driving to a store. But to buy and keep forever, I'd rather get a DVD.
- What movies? Only from Disney? Doesn't sound very impressive. What would make other studios jump on the bandwagon? I thought Apple would come up with something revolutionary that would drag the studios in. But I don't see it yet.
likemyorbs
Mar 26, 12:17 AM
Matthew 5:10-12
Irrelevant. Don't throw Bible verses at us; it's not helping your point, but I can understand that you're using it as a last-ditch effort because you realize you have no point.
PS
Matthew can go F himself. Your religion has no place in our laws; we do not live in a Christian nation. Get over it.
speedriff
Feb 17, 06:51 AM
So what is your job at Apple? The problem with Apple (trust me, I love their products) is that they don't care what their customers want concerning wireless products. They seem to only change when they might lose market share. If you call that good business, then I suggest a history lesson in Japanese business practices. They see a company that is successful but isn't giving customers what they want, they fill that demand, and they walk away (eventually) with all the market share. One only has to look at cars, motorcycles, televisions, etc. to see it.

I am starting to think Steve Jobs is a douche for more reasons than Flash. I am always reading articles about him having a little tantrum and banning this or that because someone made him mad. Grow up, Steve, and give your LOYAL customers what they want, or we will go elsewhere when a viable alternative arrives. Don't fall into the "the bigger they are, the harder they fall" category.

As for Flash, it may be a little buggy, so let me download it at my own risk. Sorry, but Flash is everywhere and I am tired of not being able to view it. Someday that won't be the case, but until then... let me have it. I'm going to spend $800 on an iPad when I can't view Flash content? Really? Sorry, not me. I will just stick with my trusty notebook. :apple:
shawnce
Sep 26, 11:01 AM
My 2.66GHz MacPro doesn't use all four cores except on rare occasions (e.g. benchmarks, quicktime, handbrake, etc.) and even then it doesn't peg them all.
In other words, your average workload doesn't contain enough concurrent work items that are CPU-bound.

What I'm most interested in is offloading OpenGL to a core, the GUI to another core, etc. ...somewhat of a nonsensical statement...
Threads of work are spread across available cores automatically. If a thread is ready to run and a core is idle then that thread will run on that core.
Aspects of the "UI" frameworks are multithreaded and will automatically utilize one or more cores (in some cases the frameworks increase the number of threads they use based on how many cores exist in the system). In other words, the UI will already potentially use more than one core on a multi-core system.
The same can happen with OpenGL even now... say, if the game developer utilizes one or more threads to calculate the game world state and a second thread to call into OpenGL to render that game world... or by enabling the multithreaded OpenGL renderer (only available on Mac Pro systems at this time).
Of course, that assumes that the tasks you run are CPU-intensive enough to even begin to consume the compute resources available to you in new systems... in the end you should measure the overall throughput of the workload you want to do, not how utilized your individual cores are when doing that workload.
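The "threads land on idle cores automatically" point can be sketched in a few lines of Java. This is a hypothetical example of my own (the class name and iteration counts are made up, not from any code discussed in this thread): a single CPU-bound loop only ever keeps one core busy, but handing the same work to a pool with one task per core lets the scheduler spread it out with no manual core assignment.

```java
import java.util.*;
import java.util.concurrent.*;

public class CoreSpread {
    // CPU-bound busy work; the OS can only parallelize it
    // if we hand it out as separate threads/tasks.
    static double crunch(long iterations) {
        double x = 1.0;
        for (long i = 0; i < iterations; i++) {
            x = x * 1.0000001 % 10.0;
        }
        return x;
    }

    public static void main(String[] args) throws Exception {
        int cores = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(cores);

        // One task per core: each runnable thread is placed on an
        // idle core by the scheduler, as described above.
        List<Future<Double>> results = new ArrayList<>();
        for (int i = 0; i < cores; i++) {
            results.add(pool.submit(() -> crunch(10_000_000L)));
        }
        for (Future<Double> f : results) {
            System.out.println("task finished: " + f.get());
        }
        pool.shutdown();
    }
}
```

Run it with one task and with `cores` tasks while watching Activity Monitor: the wall-clock time barely changes, which is exactly the throughput-over-utilization point made above.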
Eidorian
Oct 26, 10:31 PM
Exactly
I hope Apple comes out with a single Clovertown chip tower in '07 that runs on cheap standard DDR2 memory and maybe just one optical drive bay. I do like the 4 HD bays though.
On a side note, the people arguing that 8 cores is just too much power are pretty damn funny. There are thousands of people like Multimedia that need more cores. I'm not one of them, but at least I understand their need. Some people on here are clueless.

I don't think Clovertown will run on standard DDR2. Kentsfield, sure, but doesn't Xeon REQUIRE ECC/FB-DIMM?
p0intblank
Sep 20, 08:01 AM
So it does include a hard drive? Very nice! I was already planning on purchasing an "iTV", but this just makes it sound that much cooler. :D
luci216
Apr 28, 08:34 AM
The top 3 also have much cheaper models than Apple, which can contribute to their higher sales spots. Not many people are willing to shell out $1k for a computer, especially internationally. In Brazil, a MBP costs about $3k. DOLLARS. Not many people can afford that.
dr_lha
Sep 12, 03:45 PM
The speculation from my general area is that Apple will never (never say never, right..) make a DVR. It's not in their interest to make a DVR. There are several companies that are doing the DVR thing for Macs (el gato and Migila) and IMO, Apple shouldn't tread those waters.
As for a Tivo killer, there's too much going against it for Apple to do. First of all, to do a DVR right, it's going to cost the end user a ton of money. The Tivo Series 3 will cost $800 (less with rebates) plus the monthly fees. Tivo's going to have a tough time convincing people to buy the S3 when the cablecos have an option available for $10/month.
Here's what I would like Apple to do. Open up Front Row so that companies like el gato can integrate their eyeTV software into the Front Row system. That way, I can have a Mac sitting in the office with an eyeTV box to record HD programming off of cable. Then, I could have an iTV in my living room to play the recorded material onto my 46" LCD HDTV (which I haven't bought yet).
If I want, I could initiate a purchase of a movie from iTMS (provided the quality of the movies are good) from the iTV itself so that it downloads onto the Mac in the office. A rental plan would be even better. That way, I could completely isolate myself from the real world.
ft
Good to see some people around here "get it".
Nermal
Mar 18, 04:51 PM
Second, it's a violation of the DMCA.
Why? He's not breaking copy protection, because the protection wasn't there in the first place.
I can't believe that people think this is a bad thing. Don't you like freedom? :eek:
edesignuk
Oct 8, 08:43 AM
That is exactly the weakness of the PC platform. It turns into a zoo where the monkeys and lions roam free and the people have to live in cages... :rolleyes:

...but who has the market share?
Multimedia
Oct 25, 10:48 PM
If the pricing is any indication, the (low end) Quad Core 2.33GHz Clovertown is the same price as the (high end) 3.0GHz Dual-core Xeon...
so unless the bottom of the line Mac Pro is expected to start at $3298, the current Dual-Core Xeon Mac Pros will stick around.

Right. According to Apple's current pricing, the 2.33GHz Dual Clovertown would be +$800 IF they offer it. However, Apple may only offer the 2.66GHz Dual Clovertown for +$1100 and keep the rest of the offerings priced as they are now.
That way they keep the top 8-core more expensive than any of the less expensive and way less powerful 4-core models. From a marketing point of view this makes a lot more sense to me - since I plan on buying the Dual 2.66GHz Clovertown for +$1100, total $3599 BASE or more if they insist. This is one time when I don't care how much it costs - I need it NOW.
Hisdem
Mar 15, 01:39 PM
Are you drunk?
Looks like it. And BTW, I don't think the Japanese people would think leaving their homeland and going to the USA is a good idea. Not saying they don't like the US, but generally, just generally, people tend to care more about their own countries and cultures than about the American ones. Just saying.
Sounds Good
Apr 10, 09:24 PM
Not that this really matters much, but just for the record:
I was one of the first to own the original iPhone and have an iPhone 4 now. I bought an iPhone 4 for my wife and an iPod Touch for my son. I got my mom an iPad and I'm about to buy one for myself. So I'm certainly not anti-Apple. I'm just not sure I see a clear advantage FOR ME to get a Mac computer over a Windows machine.
But, who knows... maybe some day.
hulugu
Mar 14, 11:28 PM
There is absolutely no need to be insulting. Quote your "studies", first of all, but I find your assertion pretty bizarre as originally stated - mostly because Death Valley is almost entirely subsumed within Death Valley National Park. Unless you know something we don't, there is zero chance that you are going to be installing a 100-square-mile solar array in the park. Not to mention the mountainous topography.
You're correct. It's useful to think of the area needed for solar power, but subsuming Death Valley with solar panels isn't a realistic solution.
Solar panels are a useful supplement to other power sources in certain regions where favorable environmental conditions exist. But no more than that I'm afraid.
I'm not sure why alternative energy sources are required to be a silver bullet in a way that other sources like nuclear, coal, and natural gas are not. The way to fill our energy needs is a death by a thousand cuts, which will include conservation and new technologies.
Energy should be localized to some degree, thus Iceland can use geothermal to its advantage, England can use wind and tidal, and Australia can use solar.
Finally, there is tremendous social, political, and economic pressure to continue using fossil fuels and nuclear energy rather than the alternatives. Even though alternatives are now more prevalent than before and enjoy increasing popularity, fossil fuel and nuclear energy are going to be used heavily until all the fuel is exhausted.
javajedi
Oct 11, 08:48 AM
Originally posted by ddtlm
javajedi:
Admittedly, I am getting lost in what all the numbers people have mentioned are for, but looking at these numbers you have here, and assuming that they are doing the same task, you can rest assured that the G3/G4 are running far inferior software. AltiVec and SSE2 or not, there is just nothing that can explain this difference other than an unfair playing field. There is no task that a P4 can do at 11x or 12x the speed of a G4 (comparing top-end models here). The P4 possesses nothing that runs at 11x or 12x the speed. Not the clock, not the units; the bandwidth to memory and caches is not 11x or 12x as good, and it is not 11x better at branch prediction. I absolutely refuse to accept these results without very substantial backing because they contradict reality as I know it. I know a lot about the P4 and the G4, and I know a lot about programming in a fair number of different languages, even some assembly. I insist that these results do not reflect the actual performance of the processors, until irrefutable proof is presented to show how they do.
I guess the 70 and 90 don't surprise me a lot for the G3/G4, depending on clock speed difference. But all this trendy wandwagon-esque G4-bashing is not correct just cause every one else is doing it. There are things about the G3 that are very nice, but the G4 is no slouch compared to it, and given the higher clock that it's pipeline allows, the G3 really can't keep up. The G4 not only sports a better standard FPU, but it also sports better integer units.
Keep in mind this test does not reflect balanced system performance. The point of this exercise has been to determine how the G4's FPU compares to an assortment of different processors and operating systems.
I'd like to know how you qualify "inferior software" on the x86. If the P4 is somehow cheating, then all of the other processors are cheating as well. Again, we ran the exact same code. We even ported it to C code on the Mac for maximum speed. In fact, I'd like you to check the code out for yourself, so you can see there is no misdirection here. Keep in mind, other people here have run it on Athlons in Linux and still get sub-10-second times. I've also had a friend of mine (whom I can trust) run it under Yellow Dog on a G4; he got 100+ seconds. And I did not tell him the scores we've been getting on the Mac; I had him run the test first and tell me how long it took before I said anything. The JRE and now Mac OS X have been factored out of this equation.
When you look at operations like these, scalar integer ops for example, that's all register work. The FSB, BSB, or anything else doesn't matter. This is a direct comparison of the units on the G4 versus everything else. Also, my question to you is: in what way are the integer and FPU units "better" in the G4? I did not build the chip, so I can't say whether they are better than those in the 750FX, but I can say I've run a fair benchmark comparing the FPU on the G4 against everything from a P4, Athlon, and C3 to a G3, across different operating systems: Windows and Linux on x86, and Mac OS X and Yellow Dog on the Mac. The results are consistent across the board. What more "proof" do you want?