Forums

Topic: PC vs X1X Cost Effectiveness

Posts 101 to 120 of 192

Octane

@NEStalgia I looked it up and I guess I was wrong. They aren't necessarily doing math with prime numbers; they aren't even doing complicated math. It looks complicated because they're working with 64-digit hexadecimal numbers. And it's essentially a guessing game. But the bigger your GPU, the more numbers you can guess. I stand corrected: it's not difficult equations, it's a simple guessing game anyone could do, it's just that computers are a lot faster.

I don't see how it could be used for anything else though.

Why are they doing it this way? Probably to ensure that you need to do a lot of work for little reward, meaning you spend a lot more time verifying transactions than you would if they handed coins out to everyone after they verified a certain amount of data.
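For the curious, the "guessing game" can be sketched in a few lines of Python. This is just a toy illustration of hash-based proof-of-work, not Bitcoin's actual protocol; the function name, difficulty numbers, and input string are all made up for the example:

```python
import hashlib

def mine(block_data: str, difficulty_bits: int = 16, max_nonce: int = 2_000_000):
    """Guess nonces until the SHA-256 digest (a 64-digit hex number)
    falls below a target; a smaller target means a harder puzzle."""
    target = 1 << (256 - difficulty_bits)
    for nonce in range(max_nonce):
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if int(digest, 16) < target:
            return nonce, digest  # a winning guess
    return None, None  # no luck in this batch of guesses

nonce, digest = mine("some batch of transactions")
```

Each guess is independent, which is why a bigger GPU simply means more guesses per second, nothing cleverer than that.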

Octane

NEStalgia

@Yorumi Yeah...more specifically....more OSes need to support the hardware. Google's OS certainly will always lock you into Google's store, and pure Android is just another BusyBox Linux branch distro, more or less, with the full package repository plus whatever you compile yourself....so I think we're mostly talking about getting Windows to support ARM. Last time MS tried that it didn't go so well......actually it was an experience of miserable suffering. Or we need x86 phones. And oven mitts. Or an altogether new OS. But then we still don't get Steam games.

@Octane Hmm....this still sounds like key generation. Not encryption breaking per se, but brute forcing either full keys or segments. It requires uploading the results, I imagine? Wonder what it feeds them into.

Seems like there's either a not-so-hidden purpose, or the whole thing is just for the lulz

NEStalgia

Octane

@NEStalgia No, because the answer is already known. The computers are guessing until they get it right. But the answer has to exist to compare the results with in the first place. That's why it can't be used to decode anything. It's called a proof-of-work, but the definition on Wikipedia sounds a bit like mumbo-jumbo to me: "...an economic measure to deter denial of service attacks and other service abuses such as spam on a network by requiring some work from the service requester, usually meaning processing time by a computer."
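One detail that makes the scheme hang together: a solution that took millions of guesses to find takes only a single hash to check. Here's a hypothetical sketch (names and difficulty are made up, this is not Bitcoin's real validation logic):

```python
import hashlib

def verify(block_data: str, nonce: int, difficulty_bits: int = 16) -> bool:
    """Checking a claimed solution costs one hash, while finding one
    takes about 2**difficulty_bits guesses on average."""
    digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
    return int(digest, 16) < (1 << (256 - difficulty_bits))
```

That asymmetry (expensive to produce, cheap to verify) is what "requiring some work from the service requester" is getting at.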

Octane

NEStalgia

@Octane The more I think about it the more absurd it sounds. There's no practical purpose. They invented busy work in an overcomplicated fashion, and people scramble to blow fortunes on hardware to enter a lottery (imagine on payday you're told "sorry, you didn't win the paycheck this week, better luck next week!")

If it's not nefarious in intent and is just busy work, I'm assuming these guys own the manufacturers of GPUs in big blocks of shares

NEStalgia

Octane

@NEStalgia Well, it's not just busy work. Remember, the important bit is that you're verifying transactions. And a day's work may not result in getting a single bitcoin (no idea how long it takes); but if you do get one, you're getting $14,000. And that's why people are doing it.

Octane

RancidVomit86

NEStalgia wrote:

Wow, 1080s are selling for between $700-1400 PER CARD?! I could buy a few decades of XBL subscriptions for that

I have had an RX 480 since close to when they launched and it's been great. It handles everything I have tried on ultra at 60+ FPS, but even prices on it have been driven up like crazy due to miners. The 1050 Ti/RX 560 is the sweet spot right now for performance at a good price.

Battle.net - Dayman
Steam - RancidVomit86
PSN - RancidVomit86

Where my friends and I usually get stupid:
https://www.twitch.tv/MUDWALLHOLLER - Come by hang and visit our Discord. The link for Discord is on the Twitch page.

Let's Go Buffalo!

PlywoodStick

@Yorumi I would totally disagree for the late 90's-mid 00's; AMD had very competitive products back then, even superior products in many cases. It's just in the 2007-2016 timeframe that AMD couldn't offer much in the CPU market (because they had almost no bank left after Intel was through with them).

https://www.anandtech.com/show/1211

AMD sold the first consumer market 64-bit processors. Remember how, 14+ years ago, Intel's Pentium 4 was 32-bit? The Athlon 64 blew away the Pentium 4 in performance. The only reason they didn't sell well is that Intel rigged the market so OEMs like Dell wouldn't get subsidized if they sold builds with AMD in them. Many eventually started to cave in and offer more AMD powered rigs by the time dual cores came out around 2005. Speaking of which:

https://www.anandtech.com/show/1676

https://www.pcworld.com/article/117654/article.html

https://phys.org/news/2004-08-amd-industry-x86-dual-core-proc...

https://forums.anandtech.com/threads/core-2-duo-or-core-2-qua...

AMD sold the first consumer market bona fide dual core processors, and they weren't just two Pentium 4's "glued together" like the Pentium D was, or two Core 2 Duos "glued together" like the Core 2 Quad was; granted, the Core 2 Quad was a great deal for the time. (Funny how Intel now says AMD's server CPUs are inferior because they're "glued together" with Infinity Fabric, when they simply aren't using a monolithic die like Intel's CPUs do.) And of course, the Athlon 64 X2 also blew away the Pentium D in performance. But consumers (especially enthusiasts) still weren't buying them in great enough numbers, for a number of reasons all going back to Intel's mindshare over the market being too strong. The same thing happened with AMD's Turion laptop/mobile procs.

https://www.anandtech.com/show/1209

https://www.anandtech.com/show/2978/amd-s-12-core-magny-cours...

With the original Opterons in particular, AMD actually had completely superior server CPUs over Intel's Itanium from 2003-2005. No contest whatsoever. It wasn't until 2006 that Intel actually offered up some kind of resistance. And even then, Opterons continued to be competitive in performance up through 2010. Yet businesses STILL bought Intel's technically inferior Itanium during 2003-2005 just because of their market position; and again, Intel doing their thing and rigging the market.

https://www.anandtech.com/show/2378

https://www.extremetech.com/computing/81242-amd-phenom-9600-v...

It wasn't until 2007 that AMD started to fall apart, because, well... not enough money left. It's a wonder they didn't go bankrupt; probably because Intel never finished them off, to avoid being broken up as a true monopoly. (Although they mostly were a monopoly anyway.)

This time around though, with Intel having become complacent after the 1st and 2nd generation Core i CPUs kicked @$$, AMD kickstarted progression again by finally making hexa-core and octa-core CPUs mainstream, with very good price/performance. That's why Intel was forced to release both Kaby Lake and Coffee Lake within the same year. (And according to motherboard manufacturers, this wasn't too difficult, because the Coffee Lake boards/procs use a pin space that Kaby Lake boards/procs left unused, meaning Kaby Lake mobos SHOULD technically be compatible with Coffee Lake CPUs, but Intel prevented that from happening. That's a whole other can of worms!)

The price/performance king of 2017 was the Ryzen 5 1600, and this year's procs are looking to give Coffee Lake some much needed competition as well! Competition is good. AMD doesn't have to win; they just need to be able to keep Intel in check.

Edited on by PlywoodStick

PlywoodStick

PlywoodStick

@Yorumi If they do, it could use a mobile Ryzen CPU/Vega GPU to start off with, kind of like the upcoming laptops, but perhaps also like the upcoming Intel Coffee Lake-G series, which is going to have an Intel CPU packed alongside an AMD Vega GPU. Pretty crazy, almost unbelievable considering all their history, but one thing's for sure about the recent diplomacy: consumers win. Some much needed competition is being served up against NVIDIA, as well.

Edited on by PlywoodStick

PlywoodStick

RancidVomit86

Also remember we are going to be getting Intel chips with Vega GPU on board.

https://www.theverge.com/circuitbreaker/2018/1/7/16861164/int...


RenderSpotlight

I have not read the whole thread, but one thing about both the PC and the Xbox that seems like a common trend, and might be an advantage for either platform, is cross play with the Switch. Obviously Minecraft is the big one, but it seems like Microsoft and Nintendo might be willing to work together to allow cross play in more games. Rocket League is another example.
That is a win for all players with a Switch, Xbox, PC, or VR; seemingly anything except a PS4, because the Sony universe appears to be less friendly toward cross play.
So where Sony is locking things down and playing hardball, the others seem willing to open things up and play ball.
I owned a PS3 over an Xbox 360, but I did not continue on to the PS4. Even though I owned a PS3, I still found that I played more Nintendo than anything else. So, most likely I will only stick with the Switch and PC myself. However, the Switch will get the vast majority of my gaming time. The PC has never really been my preferred gaming platform, but as soon as I get a dedicated one hooked up to my TV, that might change a little, though I suspect not by much. I am not in any hurry; my PC gaming needs are pretty low.

Edited on by RenderSpotlight

RenderSpotlight

SharkAttackU

fwiw I have a One S and adore it. My Wii U and Switch are Nintendo machines, and all third party games are on my Xbox. Plus I love Halo and Gears. I gamed on PC from the late 90s to about 2008. I was just sick of all the issues common with PC gaming. My 3-year-old rig had a bad component, my life was starting to get busy, and I was frustrated, so I just went back to consoles. I've never looked back.

Consoles still offer greater convenience and ease of use than PCs. Plug and play. Yes, you have to install and update, but after that everything is easy. If you have good internet and buy physical, it's not a problem.

$60 per year is cheap. You get 4 free games every month and access to good sales. Live is a great service.

You certainly won't build a comparable PC from scratch for cheaper. Plus you're talking about many hours of work to get a PC up and running well. Plus unboxing a new console is always a special experience.

SharkAttackU

NEStalgia

@Octane You're still doing "work" to "verify" transactions using imaginary money based on imaginary value, which increases in value because of the work that exists only to validate itself. It's like getting paid 1 million Fibaroons for spending all month counting Fibaroons. Except for some reason there are morons out there who will actually give you $5 for a Fibaroon. $5 that was created by smelting iron or cooking pancakes. Or counting dollars, but who takes bankers seriously?

@Yorumi I'm not sure about the obsolescence of x86 or Windows. I won't argue it's a horrible architecture, it is. But proprietary business software depends so much on the "Wintel" architecture, and the longer proprietary business software depends on it, the more proprietary software is written for it, and the dependence grows rather than withers. As long as business depends on it, so does everyone else. Meanwhile game consoles that were never x86 now are. Apple moved from RISC to x86 ages ago (what's next, the x86 smartphones I was joking about but maybe wasn't?) And the same argument goes for Windows. Heck, the industry revolted when Microsoft tried to shift away from native software and the desktop with 8, because the whole POINT of using Windows is that backward compatibility. At one point I thought like you do, but I think I can see that it's not going to move any time in my lifetime.

Besides, like you said, if we were to see such a shift we'd see a move to 5 year shifts (like it was before x86 + Windows standardized it: C64, Amiga, TI99, Spectra, Sparc, etc.) Oh please don't make me go back to Banyan Vines.....not ever......

Steam Linux never really went where they hoped. They wanted 100% compatibility for Linux and SteamOS.....and it never got close. No doubt Xbox's push to keep DX relevant worked very well.

I do see that tech/hardware shift happening. But I think we have to get there with x86 and Windows intact. And Intel has a LOT of work to do to make that happen. And they seem distracted this week.

NEStalgia

NEStalgia

@PlywoodStick That goes back to what I said. AMD didn't have great products; it's just that Intel went through a period of products so bad (and overpriced) that it made AMD look good, before they got their act back together.

Starting with the C2D era and then going up against Xeon, the big reason NOT to go with AMD wasn't the CPU but the HORRIBLE HORRIBLE chipsets. When nVidia's nForce was your best chipset, you were going to be in for a rough ride. AMD CPUs were still decent, but they were stuck on hot, buggy, high-latency motherboards. Even I bit the bullet and just started paying more for Intel again at the time, and never regretted it. AMD stuff felt laggy even if it was more powerful on the spec sheet.

I get skeptical with AMD only because they can produce good things that super enthusiasts on the internet love, and on numerical benchmarks they look night and day, but there's almost always one or more gotchas that make the total experience far less than the hype suggests (not that I love Intel's monopoly.) Still, Ryzen makes me think of zombies. Coffee Lake makes me think of coffee. Given a choice between zombies and coffee I have to pick coffee. Coffee wins twice a day every day. I'll go into shock if my blood-coffee level drops below 70%. A CPU made of coffee may help make my life fulfilling.

@Yorumi the interface would be weird, though. Intel owns Thunderbolt, and so it's only on their supported chipsets. It would need to be some PCI-E contraption to get it to work on AMD (or an Intel licensed TB controller.)

@sharkattacku All of that is so very true....sounds a lot like my history but with PS instead of XB (well, I had PS3+X360, but so far only PS4 and no X1.)

NEStalgia

PlywoodStick

@NEStalgia Interesting... I've gone through literally thousands of Core 2 Duo/Quad and hundreds of AMD Athlon 64 X2/Phenom systems over the years at work, and by far the most common critical failure I've seen motherboard wise is burst capacitors... almost entirely on OEM Intel-based mobos. Especially those mass produced by Dell for government/educational organizations. By God are there thousands upon thousands of burst capacitors on mobos in Core 2 era Dell systems. I remember one day a few years ago where I went through about 100 Dell systems with Core 2 Duo/Quad, the vast majority of which had mobos with burst capacitors. I could've counted on two hands how many could actually be refurbished, so it didn't even take a full workday to finish them! Granted, I've seen both Intel and AMD mobos fail for physical, electrical, and... inexplicable reasons, but I've found that AMD systems from that time period are less likely per capita to encounter fatal issues. (But then again, they have an exponentially smaller sample size, so perhaps that's colored my perception! )

As for what you're mentioning... taking part in both refurbishing and dismantling, again, literally thousands of computers over the years, I haven't noticed a major stability or heat disparity between Intel and AMD parts throughout the 00's era in general. In particular, overheating factors have been independent from the manufacturer, per se. (Although the Intel systems have run into heat issues more often per capita for any number of reasons, but again, far larger sample size, so my perception may be affected.) Perhaps you're just taking to heart the old "AMD runs hot" trope, or perhaps your curse just ran wild... But in any case, in my experience, for the vast majority of the time, both Intel and AMD systems have had decent stability and heat management profiles overall.

I went with Ryzen this time because it was simply a good deal in my case. And I wasn't going to settle for a quad-core anymore. Hexa-core Ryzen 5 1600 for $220, $50 off an AM4 mobo with the CPU at launch (went for the $85 ASRock AB350 Pro4, so a whopping $35 for a decent budget mobo; I'm not going above 3.8 GHz anyways), $120 ($105 on sale) for 16GB of DDR4-2400 before the big DDR4 price hike (the board only ramps up to 2666 MHz anyways), an empty Thermaltake Soprano Snow case donated to my work, also a donated 750W 80Plus Bronze power supply that I tested, blah blah lots of fans and pain finding the right screws and attachments blah; bottom line is it worked out in my case! Contributing towards bringing back competition in the marketplace is just a nice bonus!

As for your case, if you do go for a desktop Coffee Lake, please please pay extra for a K model and an over-$100 mobo, even if you're not overclocking. The non-K models have noticeably worse performance than the K variety this time around, even if you don't overclock. If you use a budget mobo with lesser VRMs costing less than $100, it'll still handle base spec, but not above that on more than 1 core, so the mobo's value will degrade more quickly over time. The K models and mid-to-upper range mobos should hold their value and performance much longer.

Edited on by PlywoodStick

PlywoodStick

Octane

@NEStalgia Whether it's imaginary or not doesn't matter. We value $100 bills as... well... $100, even though it's just a piece of paper. Yes, it's physical, but in terms of intrinsic value, it's just a piece of paper. The same is true for digital currency. It doesn't matter that it's just bits and bytes floating around the web; the intrinsic value doesn't matter if people can use it to trade. Bank workers also get paid in the currency they're dealing with on a daily basis; the concept isn't strange. The only difference is that bitcoin doesn't need people to handle the transactions, it only needs computers.

Octane

LeopardSon

Sorry man, simply put, the X1X is a waste, completely, esp. if you have a PS4 Pro....AND a Switch

PC is the final frontier but IMO isn't as fun as a console, but MS is a waste and it's a weak console

It'll be dropped by next gen I think; if things keep going this way, it'll be Nintendo and Sony, and then PC

LeopardSon

PlywoodStick

@Octane @NEStalgia If you ask me, the USA sealed its fate when the REAL gold standard was given up for fiat currency that can be inflated and artificially boosted to support the dangerously volatile petrodollar. Real materials like gold have high value because of their scarcity and highly useful physical and chemical properties, more so now than in the past. They are finite; one cannot grow beyond a certain point with a finite resource.... but that didn't stop top economists and big bankers from seeking endless growth. (Which is unsustainable in a finite reality.)

If you ask me, giving up our current concept of currency as wealth in exchange for the concept of (renewable) energy as wealth is the future. If and when that happens, certain materials like gold will still have high value, but based more so on what it takes to utilize them efficiently and effectively, rather than just having a lot of it sitting around in a vault doing nothing. The fiat currencies will eventually all crash and burn. The petrodollar will not last forever. And cryptocurrency will become the new way to manage and trade energy credits. Best to start developing them now before the fiat currencies die off. It's in its infancy now, but everything has to start somewhere.

Edited on by PlywoodStick

PlywoodStick

NEStalgia

@PlywoodStick Well, those caps are hardly Intel's fault. That era saw an influx of bad caps in general, and lots of mfrs got bit by them. In a normal world they would have been recalled and removed from the market. Instead they just kept using them and RMAing (or not.)

I was buying only boards with solid caps at the time as a result

Heat: I'm not sure all the (non-GPU) failures were heat related, but they did tend to run quite a bit hotter than their C2D and later equivalents. (Note I'm not talking about Coppermine/P4.....those things threw absurd heat.) The failures and problems were mostly the buggy chipsets AMD was stuck with in the era. Again....nForce as a major one....that was just not pretty.

Hexa-core.....I'm still not sold yet for non-server loads. Heck, most games only recently started using any threading at all. On a gaming box I'm not sure of the real ability to saturate more than 4 cores yet. Server loads, sure....but that's a whole other situation. I mean it can't hurt, but I'm uncertain it's a tremendous help at present either. Not saying Ryzen isn't good...just that I'm not sure hexa/octa-core from ANY vendor yields much benefit to a gaming rig. CAD station? Sure. Photoshop station? Sure. RDBMS? Sure. Gaming? Not so sure.

Actually, I was half joking about Coffee Lake....just the name. I'd actually go for something older if not jumping on the Ryzen bandwagon. I imagine even Skylake would represent good value for a gaming CPU, especially with the Spectre dumping likely to happen. Good advice though!

Gold standard, I definitely agree....that's similar to what I said above. Not sure I agree on "energy credits", itself a scheme rooted in Enron. There's no reason not to return to using very tangible metals as a base; they are indeed finite, and represent actual necessary components to make the world run. There's a reason China is and has been hoarding rare earths and copper. "Energy" is far too nebulous, and worse, it allows currency to be tied to the means of daily life rather than the means of building infrastructure. It would be like basing currency on bread and water.

NEStalgia

ThanosReXXX

@NEStalgia Aaaannnnddd..... Bitcoin has collapsed....
Well, at least, now that the European banking institutions are discussing applying stricter rules and regulations to this alien currency, the market has plummeted and Bitcoin is now worth only half of what it was last month. This was just on the news over here. So, we can now skip that part of the discussion, because we can definitely deem it not worth stepping into anymore.

As I had always expected, by the way. We'll see a lot more of those violent up and down motions where this highly questionable currency is concerned, so this may be the first, but it will more than likely not be the worst dive it is going to take...

'The console wars are like boobs: Sony and Microsoft fight over which ones look the nicest and Nintendo's are the most fun to play with.'

Nintendo Network ID: ThanosReXX

PlywoodStick

@NEStalgia @ThanosReXXX

http://www.businessinsider.com/why-bitcoins-astonshing-price-...

Cryptocurrencies have heavily collapsed before after being built up into bubbles that popped; they'll pick back up again eventually. The progress of blockchain technologies and algorithms will survive, regardless of which individual cryptocurrencies may wane or die over time. The pop this time around has multiple causes, but once again, one of the primary ones is that the big banker fiat currency overlords are getting antsy and ramping up their pressure, with cryptocurrency starting to hold too much sway in the real world for their liking. This was perhaps most publicly seen in South Korea, where they're really bringing the hammer down and trying to instill fear.

Edited on by PlywoodStick

PlywoodStick

This topic has been archived, no further posts can be added.