May Mining Cryptocurrency Damage My GPU? - U.Today

Reality Distortion Field

Ok fantards, I'm sure your egos are way too fragile to actually break from the fanboy narrative here, but we are definitely going lower. But, look at all the upgrades! Yeah, and we're going lower; I figure $25.00 is a nice re-entry point. There are a lot of reasons for this, but look at the macro. This is the top for now, not just for AMD but for the general market. Everyone is on pins and needles waiting for another rate cut. Why? Because the economy is propped up to a ridiculous level. No rate cut? It crashes. Rate cut without promise of future cuts? It crashes. Fed injecting massive amounts of liquidity? Check. Manufacturing jobs disappearing? Check. All-time high after all-time high? Check. The global economy not doing so hot while we ignore it completely? Check. Smart money is exiting and securing their short positions as we speak. They're buoying the market just long enough to set themselves up for the retracement. I was around for 2008. Everything was wonderful, totally rock solid, until it wasn't. And I'll admit that we aren't looking at another 2008, but we are looking at a helluva correction. Ironically, AMD will be ok, but not until next year.
On that front, what do we have? Highest revenue since 2007. Yay. See the irony? The rollout is too slow, the P/E is too high (yes, it matters, especially when you're dealing with a manufacturer), there's too much hype around the CEO, and too many new shares are being dumped into the market. Yes, they are diluting; look at the numbers. I remember when they sought authorization for share issuance, and all the usual fantards here started dumb-shaming anybody who dared question the notion that they might dilute. They were doing it "just so that they have the option of doing it," or some nonsense like that is what they said with indignant sanctimony. Well, they have been diluting and still are. It's not that bad compared to what it could be, but it's still there and it affects the share price. Intel has been buying back shares, and so has Nvidia.
https://ycharts.com/companies/AMD/stock_buyback
But don't worry, help is on the way. I actually think they guided too conservatively for next quarter. I think that in the end AMD's superior tech will win decisive battles for market share. And, even though most of the "very smart" people on here throw a tantrum whenever I mention this: crypto will be resurgent and AMD will benefit directly from it. The Ethereum mining hardware pool is diverse; they don't want just ASICs, for a host of reasons. You can dismiss this out of hand at the behest of your own arrogance, but it's the truth. It will make a difference in the bottom line. You're looking at massive crypto gains in 2020. I'm not going to explain why, because he does it a lot better:
https://www.tradingview.com/filbfilb/
I picked up GBTC when Bitcoin was at 7500 and just sold it at 9400. I'm waiting for the 8250 range to re-enter. But when it blows up it will take GPU-mineable alts with it. And if you think that these miners aren't already anticipating this and aren't accumulating cards right now to avoid price gouging when Bitcoin breaks to the upside, then you're delusional. Why do you think Radeons are "selling like hot cakes"? The tunnel vision here is amazing. Whatever the mainstream narrative is, you guys eat it up. Stop being such a bunch of fanboys.
submitted by rantus to AMD_Stock [link] [comments]

Vertcoin Mining AMA

What is Vertcoin?

Vertcoin was created in 2014. It is a direct hedge against long-term mining-consensus centralization on the Bitcoin mining network. Vertcoin achieves its mining consensus solely through graphics cards, as they are the most abundant and widely available consensus devices that produce a reasonable amount of hashrate. This is done using a mining algorithm that is deliberately geared against devices like ASICs, FPGAs, and CPUs (the latter due to botnets), making them extremely inefficient. Consensus distribution over time is the most important aspect of a blockchain and should not be taken lightly. It is critical that you understand what blockchain specifications mean and do to fully understand Vertcoin.

Mining Vertcoin

When users of our network send each other Vertcoin, their transactions are secured by a process called mining. Miners will compose a so-called block out of the pending transactions, and need to perform a large number of computations called hashes in order to produce the Proof-of-Work. With this Proof-of-Work, the block is accepted by the network and the transactions in it become confirmed.
Mining is essentially a race. Whoever finds a valid Proof-of-Work and gets the block propagated over more than half of the Vertcoin network first wins this race and is allowed to reward themselves with the block reward. The block reward is how new Vertcoin comes into circulation. This block reward started at 50 VTC when Vertcoin was launched, and halves every four years. The current block reward is 25 VTC.
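To make the "race" concrete, here is a toy Python sketch of the idea. It is not Vertcoin's actual Lyra2REv3 code; it uses SHA-256 and a simplified difficulty target, but the hash-until-you-win loop and the halving schedule work the same way.

    import hashlib

    def mine_block(block_data: str, difficulty_bits: int) -> int:
        """Try nonces until the block hash falls below the target (toy PoW)."""
        target = 2 ** (256 - difficulty_bits)  # smaller target = harder puzzle
        nonce = 0
        while True:
            digest = hashlib.sha256(f"{block_data}|{nonce}".encode()).digest()
            if int.from_bytes(digest, "big") < target:
                return nonce  # valid Proof-of-Work found: we won the race
            nonce += 1

    def block_reward(halvings: int) -> float:
        """50 VTC at launch, halving every four years."""
        return 50 / 2 ** halvings

    print("winning nonce:", mine_block("alice->bob:10VTC", difficulty_bits=16))
    print("current reward:", block_reward(1), "VTC")  # 25.0 after one halving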
Vertcoin's One Click Miner: https://github.com/vertcoin-project/One-Click-Miner/releases
Learn more about mining here: https://vertcoin.org/mine/
Specification List:
· Launch date: Jan 11, 2014
· Consensus Mechanism: Proof-of-Work
· Total Supply: 84,000,000 Vertcoin
· Preferred Consensus Device: GPU
· Mining Algorithm: Lyra2REv3 (Made by Vertcoin)
· Blocktime: 2.5 minutes
· SegWit: Activated
· Difficulty Adjustment Algorithm: Kimoto Gravity Well (Every Block)
· Block Halving: 4 year interval
· Initial Block Reward: 50 coins
· Current Block Reward: 25 coins
More spec information can be found here: https://vertcoin.org/specs-explained/

Why Does Vertcoin Use GPUs Then?

ASICs (Manufacturer Monopoly)
If mining were just digging with a spade then, sure, use the most powerful equipment available, which would be an ASIC. The problem is that ASICs are not widely available, and they just happen to be controlled by a monopoly in China.
So, you want the most widely available tool that produces a fair amount of hashrate, which currently manifests itself as a Graphics Card.
CPUs would be great too, but unfortunately there are viruses that take over hundreds of thousands of computers, called botnets (they're almost as bad as ASICs).

Mining In Pools

Because mining is a race, it’s difficult for an individual miner to acquire enough computational power to win this race solo. Therefore there’s a concept called pool-mining. With pool-mining, miners cooperate in finding the correct Proof-of-Work for the block, and share the block reward based on the work contributed. The amount of work contributed is measured in so-called shares. Finding the Proof-of-Work for a share is much easier than finding it for a block, and when the cooperating miners find the Proof-of-Work for the block, they distribute the reward based on the number of shares each miner found. Vertcoin always recommends using P2Pool to keep mining as decentralized as possible.
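To make the share-based payout concrete, here is a small Python sketch (our own illustration of proportional splitting, not P2Pool's actual payout code) of how a 25 VTC block reward could be divided by shares found:

    def split_reward(block_reward: float, shares: dict) -> dict:
        """Split a block reward proportionally to the shares each miner found."""
        total = sum(shares.values())
        return {miner: block_reward * n / total for miner, n in shares.items()}

    shares = {"alice": 600, "bob": 300, "carol": 100}  # shares found this round
    print(split_reward(25.0, shares))
    # {'alice': 15.0, 'bob': 7.5, 'carol': 2.5}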
How Do I Get Started?
If you want to get started mining, check out the Mine Vertcoin page.

Vertcoin just forked to Lyra2REv3 and we are currently working on Verthash

Verthash was already under development before we decided to hard fork to Lyra2REv3. While Verthash would've had the same effect on ASICs (making them useless for mining Vertcoin), its timeline was incompatible with the desire to get rid of ASICs quickly. Verthash is still under development and tries to address the outsourceability problem.
Verthash is an I/O bound algorithm that uses the blockchain data as input to the hashing algorithm. It therefore requires miners to have all the blockchain data available to them, which is currently about 4 GB of data. By making this mining data mandatory, it will become harder for auto profit switching miners — like the ones that rent out their GPU to Nicehash — because they will need to keep a full node running while mining other algorithms for the moment Verthash becomes more profitable — the data needs to be available immediately since updating it can take a while.
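As a rough illustration of what "I/O bound" means here, consider the Python toy below. This is emphatically not Verthash itself (the real algorithm and data format are still in development, as described above); it only shows the general idea that every hash forces reads from a large on-disk dataset, so a miner without the full data simply cannot compute it. The filename and sizes are made up for the example.

    import hashlib, os

    DATASET = "verthash_data.bin"  # hypothetical multi-GB file derived from chain data

    def io_bound_hash(header: bytes, rounds: int = 64) -> bytes:
        """Toy I/O-bound hash: every round mixes in a chunk read from the dataset."""
        size = os.path.getsize(DATASET)
        state = hashlib.sha256(header).digest()
        with open(DATASET, "rb") as f:
            for _ in range(rounds):
                # Derive a pseudo-random offset from the current state...
                offset = int.from_bytes(state[:8], "big") % (size - 32)
                f.seek(offset)
                chunk = f.read(32)  # ...and pay the I/O cost to fetch it
                state = hashlib.sha256(state + chunk).digest()
        return state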
Over the past month, we have successfully developed a first implementation of Verthash in the Vertcoin Core code base. Within the development team we have run a few nodes on testnet to test the functionality, and everything seems to work properly. The next step is to build out the GPU miners for AMD and Nvidia. There is no ETA on this at the moment, since we're waiting on GPU developers, who are in high demand. Once the miners are ready, we'll release the Vertcoin 0.15 beta that hard-forks the testnet, together with the miners, for the community to have a test run. Given the structural difference between Lyra2RE and Verthash, we'll have to run the testnet for a longer period than we did with the Lyra2REv3 hard fork. We'll have to make sure the system is reliable before hard-forking our mainnet. So the timeline will be longer than with the Lyra2REv3 hard fork.
Some people in the community have voiced concerns about the fact that Verthash development is not being done "out in the open", i.e. the code commits are not visible on GitHub. The main two reasons for us to keep our cards close to our chest at this stage are: (1) only when the entire system, including miners, has been coded up can we be sure the system works; we don't want to release preliminary stuff that doesn't work or isn't secure. And (2) we don't want to give hardware manufacturers or mining-outsourcing platforms a head start on trying to defeat the mechanisms we've put in place.

Links and Resources

· Twitter: https://twitter.com/Vertcoin
· Donations: vertcoin.org/donate
· Join our Discord: https://discord.gg/vertcoin
· Reddit: https://www.reddit.com/vertcoin/
· Official Website: https://vertcoin.org/
· Facebook: https://www.facebook.com/vertcoin
· Vertcoin Talk: https://soundcloud.com/vertcoin-talk
· Youtube: https://www.youtube.com/vertcoin
submitted by Canen01 to gpumining [link] [comments]

What is ProgPoW? Why Ethereum needs it moving forward.

Update: ASIC manufacturers say they can make a ProgPoW ASIC

Disclosure: I'm an avid GPU miner with some 90 Nvidia GPUs running out of my garage. I've been in and out of the mining scene since 2011, 2014, and most recently 2017. I hold BTC, ETH, and RVN. I directly benefit from them moving to ProgPoW, but not without good reason. Every time I've gotten into home GPU mining, ASICs have come out (BTC, LTC) and I've had to give up. I refuse to see it happen to another excellent coin.

I've been a proponent of Ethereum following their ASIC-resistance stance outlined in the original white paper. Now that ProgPoW has been given the "green light" by Hudson Jameson to move forward, I really think it's time to discuss the algorithm: what it is, who created it, why Ethereum needs it, and why we should dismiss crazy theories such as Nvidia funding its development.

Before we start, I highly suggest everyone watch BitsBeTrippin's video where she breaks down ProgPoW at Devcon4.

A quick breakdown: what is ProgPoW?
ProgPoW is a proof-of-work algorithm designed to close the efficiency gap available to specialized ASICs. It utilizes almost all parts of commodity hardware (GPUs), and comes pre-tuned for the most common hardware used in the Ethereum network.

From reading the white paper listed on GitHub, the main idea behind ProgPoW is NOT to achieve total ASIC resistance. The idea is to kill the 50-1000x efficiency gains available to specialized ASIC hardware, such as what we saw recently with Equihash 200/9 coins, where ASICs achieved 50x gains over GPUs. The ProgPoW algorithm uses most of the GPU minus a few parts. It takes the original Ethash algorithm and adds more features.
The main elements of the algorithm are:
ProgPoW will inherit Ethash's current DAG size, meaning 2 GB and 3 GB cards will still not be able to mine. Additionally, no advantage is given to either Nvidia or AMD GPUs:
ProgPoW has been designed to be a vendor-neutral proof-of-work, or more specifically, proof-of-GPU. ProgPoW has intentionally avoided using features that only one core architecture has, such as LOP3 on NVIDIA, or indexed register files on AMD.
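To give a feel for what "programmatic" proof-of-work means, here's a toy Python sketch. This is my own illustration of the concept, not the real ProgPoW algorithm or its parameters: the point is just that the sequence of math operations is regenerated on a schedule, so fixed-function hardware can't hard-wire a single circuit for it.

    import hashlib, random

    OPS = [  # a tiny pool of 32-bit operations to choose from
        lambda a, b: (a + b) & 0xFFFFFFFF,
        lambda a, b: (a * b) & 0xFFFFFFFF,
        lambda a, b: a ^ b,
        lambda a, b: ((a << 3) | (a >> 29)) & 0xFFFFFFFF,  # rotate left by 3
    ]

    def toy_progpow(header: bytes, period: int) -> bytes:
        """Toy 'programmatic' PoW: the op sequence changes every period."""
        rng = random.Random(period)  # everyone derives the same program per period
        program = [rng.randrange(len(OPS)) for _ in range(16)]
        seed = hashlib.sha256(header).digest()
        a = int.from_bytes(seed[:4], "big")
        b = int.from_bytes(seed[4:8], "big")
        for op in program:
            a = OPS[op](a, b)
        return hashlib.sha256(a.to_bytes(4, "big") + header).digest()

    print(toy_progpow(b"block header", period=42).hex())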

According to Kristy, she has had direct contact with AMD and Nvidia on testing ProgPOW.
As part of its review process, ProgPoW was submitted to (and reviewed by) both AMD and NVIDIA engineers. The group known as IfDefElse — of which I am a part — has been actively working with both companies to ensure this effectively closes the efficiency gap that we speak publicly of in our papers and articles.
This does not mean one side is favored over the other. She's giving and getting input from the major GPU manufacturers in order to support crypto mining. Additionally, she says "AMD is actively working with us to optimize ProgPoW for their architectures." Using ProgPoW optimized for GPUs rids us of bowing to Bitmain, Innosilicon, Halong, and their scandalous hardware practices.

ProgPoW is NOT the "God-sent savior of all GPUs." Even Kristy understands that complete ASIC resistance is a fallacy; it will never be achieved. However, by working with GPU manufacturers and crypto devs, we can make a coin where GPUs run alongside ASICs but the efficiency gains are diluted, meaning the time and money invested in a ProgPoW ASIC machine doesn't make economical sense. Rather, just buy the actual GPU.

Quote sources from Kristy's Medium article.

Why does Ethereum need ProgPOW?

I suggest reading Siacoin's excellent Medium article on the subject of ASICs.
It's too much to cover here, but in short, here's why we need ProgPoW against current and future ASICs:
At this point in time we actually don't need ProgPoW. However, we will need it as time goes on. Early Bitcoin ASICs didn't dominate BTC, but as time went on they became more efficient than GPUs and started dominating BTC's network. The same fate befalls any "ASIC-resistant" coin that decides it's not a big deal (looking at you, ZEN). Without a set date for PoS, Ethereum would have suffered the same fate. As a Siacoin dev states:
We also had loose designs for ethash (Ethereum’s algorithm). Admittedly, ethash was not as easily amenable to ASICs as equihash, but as we’ve seen from products on the market today, you can still do well enough to obsolete GPUs.
What makes ASICs bad? Isn't a better hash-per-watt ratio a good thing? It saves tons of electricity, one of PoW's biggest faults. I think there is nothing bad about the ASIC hardware itself. Equihash ASICs achieved the hashrate of twenty 1080 Tis at 1/20th of the power. That's impressive. The problem with ASIC hardware is who it comes from, where it comes from, and their shady business practices.

  1. "It’s estimated that Monero’s secret ASICs made up more than 50% of the hashrate for almost a full year before discovery, and during that time, nobody noticed." How much of ETH hashrate could be ASICs? We won't know till the fork.
  2. I've heard a lot that ASICs aren't that big of a deal and that we should just focus on PoS. Take into account Siacoin's own network, which allowed Bitmain/Innosilicon ASICs until the devs forked in favor of their own ASICs after just a year (Siacoin's network hashrate dropped 96%).
  3. "In the case of Halong’s Decred miner, we saw them “sell out” of an unknown batch size of $10,000 miners. After that, it was observed that more than 50% of the mining rewards were collecting into a single address that was known to be associated with Halong, meaning that they did keep the majority of the hashrate and profits to themselves." GPU manufactures would not and cannot be do the same.
ASICs destroy networks; they centralize the pools and the hardware, leading to control by large entities, in this case Chinese companies. Anyone who thinks otherwise is a fool. Of course this doesn't happen overnight, hence my original statement that we don't need ProgPoW now. In a year's time that may totally change, and it will be far too late.

GPUs allow anyone to support the network. Think of the crypto run-up: Fry's Electronics, Micro Center, and online e-tailers were SOLD OUT OF GPUs. Think of that! People were buying GPUs to support the network for token rewards (worth money). How many new miners and new people got interested in crypto because of this? How about friends who saw the rigs, and word of mouth spreading that you could go out, buy a graphics card, build a rig, and earn money? Obviously we know how it ended, because it wasn't remotely sustainable. Still, it attests that GPU-mineable coins make crypto accessible to everyone.

For Ethereum to successfully go PoS, it cannot hand its network over to ASIC mining companies in the meantime. PoS has an unknown release date/timeframe. I understand Vitalik does not like PoW, but that's what's currently securing the network. Because of this, Ethereum must maintain as much decentralization as possible with GPU mining. This is what ProgPoW does: it gives AMD and Nvidia GPUs the advantage they need over ASICs created by Bitmain and others. It allows me to continue to secure the Ethereum network with my 90 GPUs until the full PoS switch.

Conclusion
Did it have to be ProgPoW? No; as UBIQ has shown, a team can create its own unique ASIC-resistant algorithm. But ProgPoW was given to us by the IfDefElse team already completed. This required no work from the ETH devs at all. It's open source and has been reviewed by the Ethereum dev team. If they haven't found any issues with it yet, I don't see why we cannot implement it.

An argument can be made that if we do switch, we risk security, because we'll lose network hashrate and decrease the cost to attack the network. I have two things to say to that. One, since ProgPoW is new, Nicehash has not added it to its rental network yet. I don't know how long Nicehash would take to add it, but it gives us a short while to get people onto the new ETH PoW network. Two, to attack the network you would need massive coordination between GPU mining farms, and such a thing has never been recorded.

The 51% attacks that have happened recently (BCD/BTG/ZEN, and as of 1/8/19, ETC) were all on ASIC-mineable coins. In the case of the Equihash coins, an ASIC that achieved 50x more efficiency had just come to market. It's not proven, but it leads me to believe a bad actor with early access to ASICs was able to attack those coins. All except ZEN have since switched to the Zhash algorithm. Even ZCash/Zelcash has funded ProgPoW development. While I disagree that they should do this (because that's exactly the problem: too many coins using the same algorithm), in the end it's up to the devs.

TL;DR: ASIC resistance is futile and a fallacy. PoS or other solutions are needed, but to get there we need to keep PoW as decentralized as possible, and this is what ProgPoW does.

Update 10/10/19: See the Medium article on ProgPoW FAQs.


submitted by Xazax310 to EtherMining [link] [comments]


How do I mine Dogecoin?

Let’s take a lucky guess that you’re here today because you’ve heard a lot about cryptocurrencies and you want to get involved, right? If you’re a community person, Dogecoin mining might be the perfect start for you!
Bitcoin was the first in 2009, and now there are hundreds of cryptocurrencies. These new coins (that operate on their own native blockchain) are called altcoins or alternative coins. One popular altcoin is Dogecoin. It can be bought, sold and traded, just like Bitcoin. It can also be mined!
So, what is Dogecoin mining?
By the end of this guide, you'll know what hardware and what software you need to get started. You'll also know whether or not Dogecoin mining is for you!
So, where would you like to start? The beginning? Great choice. Let’s have a quick look at how Dogecoin got started.
A (Very) Short History of Dogecoin
In 2013, an Australian named Jackson Palmer and an American named Billy Markus became friends. They became friends because they both liked cryptocurrencies. However, they also thought the whole thing was getting too serious so they decided to create their own.
Palmer and Markus wanted their coin to be more fun and more friendly than other crypto coins. They wanted people who wouldn’t normally care about crypto to get involved.
They decided to use a popular meme as their mascot — a Shiba Inu dog.

Dogecoin was launched on December 6th, 2013. Since then it has become popular because it’s playful and good-natured. Just like its mascot!
Dogecoin has become well-known for its use in charitable acts and online tipping. In 2014, $50,000 worth of Dogecoin was donated to the Jamaican Bobsled Team so they could go to the Olympics. Dogecoin has also been used to build wells in Kenya. Isn’t that awesome!
Users of social platforms – like Reddit – can use Dogecoin to tip or reward each other for posting good content.
Dogecoin has the 27th largest market cap of any cryptocurrency.
Note: A market cap (or market capitalization) is the total value of all coins on the market.
So, Dogecoin is a popular altcoin, known for being fun, friendly and kind. It’s a coin with a dog on it! You love it already, don’t you?
Next, I want to talk about how mining works…
What is Mining?
To understand mining, you first need to understand how cryptocurrencies work. Cryptocurrencies are peer-to-peer digital currencies. This means that they allow money to be transferred from one person to another without using a bank.
Every cryptocurrency transaction is recorded on a huge digital database called a blockchain. The database is stored across thousands of computers called nodes. Nodes put together groups of new transactions and add them to the blockchain. These groups are called blocks.
Each block of transactions has to be checked by all the nodes on the network before being added to the blockchain. If nodes didn’t check transactions, people could pretend that they have more money than they really do (I know I would!).
Confirming transactions (mining) requires a lot of computer power and electricity so it’s quite expensive.
Blockchains don’t have paid employees like banks, so they offer a reward to users who confirm transactions. The reward for confirming new transactions is new cryptocurrency. The process of being rewarded with new currency for confirming transactions is what we call “mining”!

It is called mining because it’s a bit like digging for gold or diamonds. Instead of digging with a shovel for gold, you’re digging with your computer for crypto coins!
Each cryptocurrency has its own blockchain. Different ways of mining new currency are used by different coins where different rewards are offered.
So, how do you mine Dogecoin? What’s special about Dogecoin mining? Let’s see…
What is Dogecoin Mining?
Dogecoin mining is the process of being rewarded with new Dogecoin for checking transactions on the Dogecoin blockchain. Simple, right? Well no, it’s not quite that simple, nothing ever is!
Mining Dogecoin is like a lottery. To play the lottery you have to do some work. Well, actually your computer (or node) has to do some work! This work involves the confirming and checking of transactions which I talked about in the last section.
Lots of computers work on the same block of transactions at the same time, but only one can win the reward of new coins. The one that earns the new coins is the node that gets to attach the new block of transactions to the chain of old blocks. This is decided using complex mathematical puzzles.
The node that solves the mathematical problem first wins! It can then attach the newly confirmed block of transactions to the rest of the blockchain.
Most cryptocurrency mining happens this way. However, Dogecoin mining differs from other coins in several important areas. These areas are;
  • Algorithm: Each cryptocurrency has a set of rules for mining new currency. These rules are called a mining or hashing algorithm.
  • Block Time: This is the average length of time it takes for a new block of transactions to be checked and added to the blockchain.
  • Difficulty: This is a number that represents how hard it is to mine each new block of currency. You can use the difficulty number to work out how likely you are to win the mining lottery. Mining difficulty can go up or down depending on how many miners there are. The difficulty is also adjusted by the coin’s protocol to make sure that the block time stays the same.
  • Reward: This is the amount of new currency that is awarded to the miner of each new block.
Now, let’s compare how DogeCoin mining works compared to Litecoin and Bitcoin…
Mining Comparison
Bitcoin uses SHA-256 to guide the mining of new currency, while the other two (Litecoin and Dogecoin) use Scrypt. This is an important difference, because Scrypt mining needs a lot less power and is a lot quicker than SHA-256. This makes mining easier for miners with less powerful computers. Fans of Litecoin and Dogecoin think that they are fairer than Bitcoin because more people can mine them.
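If you want to poke at the two algorithms yourself, Python's hashlib happens to ship both primitives. Here's a quick comparison sketch; the scrypt parameters below are illustrative, not necessarily the exact ones Dogecoin miners use:

    import hashlib, time

    data = b"block header bytes"

    # SHA-256 (Bitcoin): pure computation, easy to bake into dedicated silicon.
    t = time.perf_counter()
    print("sha256:", hashlib.sha256(data).hexdigest()[:16],
          f"({time.perf_counter() - t:.6f}s)")

    # scrypt (Litecoin/Dogecoin): deliberately memory-hungry, which narrows
    # the gap between ordinary computers and specialized hardware.
    t = time.perf_counter()
    print("scrypt:", hashlib.scrypt(data, salt=b"nonce", n=1024, r=1, p=1,
                                    dklen=32).hex()[:16],
          f"({time.perf_counter() - t:.6f}s)")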
Note: In 2014, Litecoin and Dogecoin merged mining. This means they made it possible to mine both coins in the same process. Dogecoin mining is now linked with Litecoin mining. It’s like two different football teams playing home games in the same stadium!
Mining Dogecoin is a lot faster than mining Litecoin or Bitcoin. The block reward is much higher too!
Don’t get too excited though (sorry!). Dogecoin is still worth a lot less than Bitcoin and Litecoin. A reward of ten thousand Dogecoin is worth less than thirty US Dollars. A reward of 12.5 Bitcoin is currently worth 86,391.63 US Dollars!
However, it's not as bad as it sounds. Dogecoin's mining difficulty is more than a million times lower than Bitcoin's. This means you are much more likely to win the block reward when you mine Dogecoin.
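Using this article's own numbers, the reward math is easy to check in a few lines of Python (prices obviously move around; these were the values at the time of writing):

    doge_price = 0.002777            # USD per DOGE (figure used in this article)
    btc_price = 86391.63 / 12.5      # USD per BTC implied by this article (~$6,911)

    doge_reward_usd = 10_000 * doge_price  # 10,000 DOGE block reward
    btc_reward_usd = 12.5 * btc_price      # 12.5 BTC block reward

    print(f"Dogecoin block reward: ${doge_reward_usd:.2f}")  # ~$27.77
    print(f"Bitcoin block reward: ${btc_reward_usd:.2f}")    # $86,391.63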
Now I’ve told you about what Dogecoin mining is and how it works, would you like to give it a try?
Let’s see what you need to do to become a Dogecoin miner…
How to Mine Dogecoin
There are two ways to mine Dogecoin, solo (by yourself) or in a Dogecoin mining pool.
Note: A Dogecoin pool is a group of users who share their computing power to increase the odds of winning the race to confirm transactions. When one of the nodes in a pool confirms a transaction, it divides the reward between the users of the pool equally.
Dogecoin Mining: Solo vs Pool
When you mine as a part of a Dogecoin pool, you have to pay fees. Also, when the pool mines a block you will only receive a small portion of the total reward. However, pools mine blocks much more often than solo miners. So, your chance of earning a reward (even though it is shared) is increased. This can provide you with a steady new supply of Dogecoin.
If you choose to mine solo then you risk waiting a long time to confirm a transaction because there is a lot of competition. It could be weeks or even months before you mine your first block! However, when you do win, the whole reward will be yours. You won’t have to share it or pay any fees.
As a beginner, I would recommend joining a Dogecoin pool. This way you won’t have to wait as long to mine your first block of new currency. You’ll also feel like you’re part of the community and that’s what Dogecoin is all about!
What You Need To Start Mining Dogecoin
Before you start Dogecoin mining, you’ll need a few basics. They are;
  • A PC with either Windows, OS X or Linux operating system.
  • An internet connection
  • A Shiba Inu puppy (just kidding!)
You’ll also need somewhere to keep the Dogecoin you mine. Go to Dogecoin’s homepage and download a wallet.
Note: A wallet is like an email account. It has a public address for sending/receiving Dogecoin and a private key to access them. Your private keys are like your email’s password. Private keys are very important and need to be kept completely secure.
There are two different types; a light wallet and a full wallet. To mine Dogecoin, you’ll need the full wallet. It’s called Dogecoin Core.
Now that you’ve got a wallet, you need some software and hardware.
Dogecoin Mining Hardware
You can mine Dogecoin with;
  • Your PC’s CPU: The CPU in your PC is probably powerful enough to mine Dogecoin. However, it is not recommended. Mining can cause less powerful computers to overheat which causes damage.
  • A GPU: GPUs (or graphics cards) are used to improve computer graphics, but they can also be used to mine Dogecoin. There are plenty of GPUs to choose from, but here are a few to get you started:
    - SAPPHIRE Pulse Radeon RX 580 ($426.98)
    - Nvidia GeForce GTX ($579.99)
    - ASUS RX Vega 64 ($944.90)
  • A Scrypt ASIC Miner: This is a piece of hardware designed to do one job only. Scrypt ASIC miners are programmed to mine scrypt-based currencies like Litecoin and Dogecoin. ASIC miners are very powerful. They are also very expensive, very loud and can get very hot! Here's a few for you to check out:
    - Innosilicon A2 Terminator ($760)
    - Bitmain Antminer L3 ($1,649)
    - BW L21 Scrypt Miner ($7,700)
Dogecoin Mining Software
Whether you’re mining with an ASIC, a GPU or a CPU, you’ll need some software to go with it. You should try to use the software that works best with the hardware you’re using. Here’s a short list of the best free software for each choice of mining hardware;
  • CPU: If you just want to give mining a quick try, using your computer’s CPU will work fine. The only software I would recommend for mining using a CPU only is CPU miner which you can download for free here.
  • GPU: If you mine with a GPU there are more software options. Here are a few to check out:
    - CudaMiner - works best with Nvidia products.
    - CGminer - works with most GPU hardware.
    - EasyMiner - user-friendly, so it's good for beginners.
  • Scrypt ASIC miner:
    - MultiMiner - great for mining scrypt-based currencies like Litecoin and Dogecoin. It can also be used to mine SHA-256 currencies like Bitcoin.
    - CGminer and EasyMiner can also be used with ASIC miners.
Recommendations
You’re a beginner, so keep it simple! When you first start mining Dogecoin I would recommend using a GPU like the Radeon RX 580 with EasyMiner software. Then I would recommend joining a Dogecoin mining pool. The best pools to join are multi-currency pools like Multipool or AikaPool.
If you want to mine Dogecoin but don’t want to invest in all the tech, there is one other option…
Dogecoin Cloud Mining
Cloud mining is mining without mining! Put simply, you rent computer power from a huge data center for a monthly or yearly fee. The Dogecoin is mined at the center and then your share is sent to you.
All you need to cloud mine Dogecoin is a Dogecoin wallet. Then choose a cloud mining pool to join. Eobot, Nice Hash and Genesis Mining all offer Scrypt-based cloud mining for a monthly fee.
There are pros and cons to Dogecoin cloud mining;
The Pros
  • It’s cheaper than setting up your own mining operation. There’s also no hot, noisy hardware lying around the house!
  • As a beginner, there isn’t a lot of technical stuff to think about.
  • You get a steady supply of new currency every month.
The Cons
  • Cloud mining pools don’t share much information about themselves and how they work. It can be hard to work out if a cloud mining contract is a good value for money.
  • You are only renting computing power. If the price of Dogecoin goes down, you will still have to pay the same amount for something that is worth less.
  • Dogecoin pools have fixed contracts. The world of crypto can change very quickly. You could be stuck with an unprofitable contract for two years!
  • It’s no fun letting someone else do the mining for you!
Now that you know all the different ways to mine Dogecoin, we can ask the big question: can you make tons of money mining Dogecoin?
So, Is Dogecoin Mining Profitable?
The short answer is, not really. Dogecoin mining is not going to make you a crypto billionaire overnight. One Dogecoin is worth 0.002777 US Dollars. If you choose to mine Dogecoin solo, it will be difficult to make a profit. You will probably spend more money on electricity and hardware than you will make from Dogecoin mining. Even if you choose a Dogecoin pool or a cloud pool your profits will be small.
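If you want to sanity-check profitability for your own setup, the back-of-the-envelope math is simple. Here's a rough sketch; every input below (your hashrate, the network hashrate, power draw, electricity rate) is a made-up placeholder you'd replace with real, current numbers:

    # Rough daily Dogecoin mining profit estimate -- illustrative numbers only.
    blocks_per_day = 24 * 60     # Dogecoin's ~1-minute block time
    block_reward = 10_000        # DOGE per block
    doge_price = 0.002777        # USD per DOGE

    my_hashrate = 500e3          # 500 kH/s (placeholder for one GPU on Scrypt)
    network_hashrate = 300e12    # 300 TH/s (placeholder; check a live explorer)

    power_watts = 150            # GPU draw (placeholder)
    usd_per_kwh = 0.12           # electricity rate (placeholder)

    revenue = (blocks_per_day * block_reward * doge_price
               * my_hashrate / network_hashrate)
    cost = power_watts / 1000 * 24 * usd_per_kwh
    print(f"revenue ${revenue:.4f}/day, power ${cost:.2f}/day, "
          f"profit ${revenue - cost:.2f}/day")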
However, if you think I am telling you to not mine Dogecoin, then you’re WRONG! Of course, I think you should mine Dogecoin!
But why? Seriously…
Well, you should mine Dogecoin because it’s fun and you want to be a part of the Dogecoin family. Cryptocurrency is going to change the world and you want to be part of that change, right? Mining Dogecoin is a great way to get involved.
Dogecoin is the coin that puts a smile on people’s faces. By mining Dogecoin you’ll be supporting all the good work its community does. You’ll learn about mining from the friendliest gang in crypto. And who knows? In a few years, the Dogecoin you mine now could be worth thousands or even millions! In 2010, Bitcoin was worthless. Think about that!
Only you can choose whether to mine Dogecoin or not. You now know everything you need to know to make your choice. The future is here. So, what are you going to do?
submitted by alifkhalil469 to BtcNewz [link] [comments]

I had like 3 friends ask me how to build a PC in the past week so I made this to help them.

(Reddit Edit: Help me improve the document with productive, constructive comments on what I got wrong or messed up! I'm only human lol
Also a lot of this is supposed to be kinda humorous. I didn't think I had to say that but, hey, its the internet.
I appreciate the positive and productive comments! )
Beginner's basic guide to building your own PC as of early 2018
(EDIT: Sorry for being an MSI/Corsair fanboy)
Here's a collection of thoughts to consider when building your own personal PC.
As always, I'd personally use PCPartPicker.com to configure your parts and for further thoughts on compatibility.
First off, building a computer is 100% based around what you plan to use the computer for.
Here are a few uses and generic ideas of what to go for.

Audio Editing: Lots of small tasks that need to be completed quickly without lag.
- Fast processor (>4 GHz)
- Fast RAM (MHz) - at least 16 gigs!
- Fast storage: SSD mandatory, M.2 or PCIe for best performance.
- Shitty graphics card; it's only there to keep the CPU from handling display tasks while you work. Can be a few generations or years old.
- Many screens, for lots of plugin windows to be open.

Video Editing: Lots of large files to render and read.
- Multi-core processor; the more the merrier.
- SSD for fast read/write of large video files.
- Insane graphics card. AMD cards are debatably better, but the Nvidia Quadro series is made specifically for video rendering.

Gaming: No more than 4 cores, intense graphics card.
- 92% of games are not coded for more than 4 cores, so why spend the extra money?
- SSD for quick load screens.
- Nvidia 10-series cards; the higher the number the better. Titan cards for MAXIMUM OVERDRIVE!

Coding: Quick processor for lots of small tasks. Ergonomic peripherals?
- Dear god, please don't use a mechanical keyboard, so that your coworkers don't kill you.

Home office: Everything can be a few gens behind so you can get the best power per dollar spent.
- Sorry that Gateway doesn't exist anymore. I guess try Dell...
Parts (Expensive Legos)
CPU (tells things to go places and outputs data)
Basically three main routes to go for: Intel, AMD, or ASIC.
- Intel - gaming, data centers, Hackintoshes
  - Pros: Runs cooler, faster clock speeds (GHz), finishes short small tasks faster
  - Cons: $$$$, fewer cores
- AMD - gaming, personal computing, large-task processing
  - Pros: Lots of cores, better price per performance, faster processing of large tasks
  - Cons: Hot chips, large chips?, compatibility issues with MacOS
- ASIC - "Application-Specific Integrated Circuit"
  - Pros: Does the one task it was made for insanely efficiently; great for mining
  - Cons: Literally does nothing else. Holy hell these are expensive, and very hot (fans will get loud)

CPU Cooler (I'm a big fan)
Most CPUs come with an in-box cooler that's ok, but please buy aftermarket.
- In-box - the free shitty cooler that comes with the processor
  - Pros: Free
  - Cons: Ugly, makes the chip run hot, hard to clean
- Air cooler - the oldest type of cooler, but new designs are highly efficient
  - Pros: The only cooler with the possibility of being 100% quiet; most likely cheaper
  - Cons: Large; if the cooler isn't big enough for the chip's thermal output, the fans will be loud
- Liquid - custom loops are beautiful; AIOs are easy to install and offer similar performance
  - Pros: Looks cool, great temperatures, "quiet"
  - Cons: The water pump can be loud; possible spills
- Phase change - uses refrigerator technology to cool the chip
  - Pros: Can overclock until the chip breaks (what's colder than cold? ICE COLD!)
  - Cons: Loud (compressor noise), large pipes, just why....

Motherboard (the convenience store of computer parts)
Really just about what type of I/O you want.
- MAKE SURE THE FORM FACTOR FITS YOUR CASE! (or vice versa)
- DOES IT FIT YOUR PROCESSOR!?! (really tho)
- Look at PCIe lanes for expansion:
  - How many graphics cards do you have?
  - PCIe-based interfaces? PCIe SSD? PCIe DAC? PCIe Wi-Fi?
- USBs? Network? Audio?
- How many lanes (slots) of RAM?
- M.2?
- How many SATA interfaces?
- Good brands: MSI, ASUS, Gigabyte. Bad brands: AS(s)Rock, Dell

Memory (Dory)
- The more the merrier
- No less than 8 GB for a functional Windows machine (16 GB to never have a problem)
- Use all the lanes your computer has to offer! The more lanes there are to access, the faster the data can travel.
  - Imagine drinking a milkshake: with a wider straw you can drink more milkshake than with a skinny one.
- Faster MHz means faster data access, but gives minimal performance differences
- Please get RAM with heat spreaders unless you're building a server with high airflow
- Make sure the type of RAM (DDR3 or DDR4) matches what your processor/motherboard calls for
- Good brands: Corsair, G.Skill, Ballistix

Storage (grandpa that remembers everything about how things used to be but takes forever to learn a new task)
Speed or massive storage? Slower is cheaper. The golden ratio of speed/storage/price is a 250-500 GB SSD plus a 1+ TB disk drive. (Max speeds listed are for a single drive, not RAID.)
- Hard Disk Drives (HDD)
  - Cheapest and slowest: read/write speeds of < 0.5 GB/s
  - 7200+ RPM or GTFO; higher-speed drives can access data faster
  - Do not move while powered up; the physical parts will break
  - Larger cache = faster read/write speeds
  - Pros: Cheap, holds massive amounts of data
  - Cons: Slower than molasses in a freezer
  - Reputable brands: Seagate, WD
- Solid State Drives (SSD)
  - A necessity for quick boots and fast load screens (can only be re-written to so many times)
  - SATA-based (2.5 inch) - read/write speeds capped @ 6 Gb/s
    - Pros: Most economical; form factor fits older computers
    - Cons: "Slow" compared to other SSDs (but still 12 times faster than an HDD)
  - M.2-based - read/write speeds capped @ 10 Gb/s
    - Pros: The size of a stick of gum! High-end but not too expensive to be out of reach
    - Cons: Expensive for any size over 500 GB
  - PCIe-based - read/write speeds capped @ 20 Gb/s for PCIe 3.0 x4
    - Pros: HOLY BANDWIDTH BATMAN! Faster than that little creepy ghost that's always in the corner of your eye
    - Cons: You might have to take out a loan to buy one (takes up an x4 PCIe lane)
  - Reputable brands: Samsung! Corsair, Plextor, Intel, Kingston, Crucial

Video Card (that one kid that has thick glasses and is really good at math)
- A regular old PCIe card that handles all of the video rendering and output for your computer. Basically an ASIC on a PCIe card.
- The PCBs and chips are patented by two main companies; the differences come from each lineup and the varying manufacturer cooling designs.
- The more memory the better.
- NVIDIA (Team Green) - great for gaming; has specific card series for intensive rendering; lazy driver updates
  - Gaming:
    - 900 series - cheap, low performance; can play any video game made before 2010 on max settings
    - 1000 (ten) series - expensive (thanks bitcoin miners...), great for VR!
  - Video rendering: Quadro series
  - Gaming and rendering: Titan X (Maxwell-based chip, same as the 900-series cards) and Titan XP (Pascal-based chip, same as the 10-series cards)
- AMD (Team Red) - the underdog; does the same thing but slightly worse and cheaper (except video rendering)
  - Gaming:
    - RX 400 series - cheap, hot
    - RX 500 series - cheap, ok at VR and decent gaming frame rates; not bad, but not particularly great either
  - Video rendering: FirePro series
  - Gaming and rendering: Vega series - good luck finding one to buy lmao

Case (fancy clothing for your parts!)
- Similar to human clothing: you want it to do a few main things really well, with compromises at each extreme.
- Durability:
  - Steel - incredibly durable; creates a Faraday cage for the components; heavy af; magnets, just magnets....; rusts over time
  - Aluminium - light; easy to bend for modding or "physical maintenance"; less likely to rust; huzzah for Faraday cages!
  - Plastic - just don't: no electrical ground, no Faraday cage, light AF!
- Breathing (airflow): positive internal airflow! Larger fans push the same amount of air with less speed/noise.
- Looks: window? RGB? Cool paint?
- Fits all your parts: graphics card length/clearance, support for liquid-cooling radiators?, how many spots for HDDs/SSDs, motherboard format, cable management!

Power Supply (FIGHT MILK)
- Rule of thumb: buy a power supply that outputs 1.5 times the wattage you need (see the quick calculator after this list). You can walk further than you can run.
- A PSU can casually output 50-75% of its rated power for much longer than it can output 90-100% (without failure).
- If you never demand enough wattage for it to get hot, the fan doesn't have to turn on, making it quieter.
- Modular means you can remove/replace the cables from the PSU.
- Reputable brands: Corsair, EVGA

Optical Drive (motorized cup holder)
- You can download most things today, so I'd suggest against one unless you really NEED to watch/write DVDs/CDs.

Operating System (software that makes everything work)
- Windows (always updates)
  - Compatible with just about everything
  - Easy to learn to code on!
  - POS initial browser
  - Likely to get viruses
- Linux (penguins are cute)
  - Unique; takes fewer resources to run
  - Barebones
  - Incredibly personalizable!
  - Compatibility issues with just about everything
- MacOS (Linux but more annoying)
  - It is legal!
  - Great for art and your grandma that doesn't know how to use computers!
  - User friendly
  - Compatibility issues with various hardware
  - Confusing/limiting coding structure

Peripherals (cables everywhere!)
- Keyboard (higher polling rate is better)
  - Mechanical (the key registers at an exact stroke length every time)
- Mouse (higher polling rate is better)
  - More buttons = better?
  - DPI (dots per inch): in theory, if a mouse has 1600 DPI, then moving it one inch (2.54 cm) moves the cursor 1600 pixels. The higher the DPI, the faster your cursor can be moved.
- Monitor
  - In theory the human eye can't see faster than 60 frames per second.
  - Keep pixel density in mind! A 22-inch 4K screen will have more pixels per square inch than a 28-inch 4K screen.
  - Interface?
    - DVI - thumbscrews.....; can do two monitors with one port! Supports 4K
    - VGA (analog) - thumbscrews...; max resolution is 1440p
    - DisplayPort (digital) - nice button clip; supports 4K
    - HDMI (digital) - 1.2 or higher supports 4K
- DAC/speakers/headphones - don't even get me started
- Microphone - don't get me started, pt. 2

Other (other)
- UPS (uninterruptible power supply): just a battery that gives your computer some time if the power ever goes out, so you can save your work.
- Cable organization materials! Zipties, velcro
- LED LIGHTING! - mandatory
- Extra/better fans - more pressure, less woosh
- iFixit Pro Tech Toolkit - because who buys just one Torx wrench
- Cute kitten mousepad - yes, it has to be a cat. Don't argue.
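The 1.5x power-supply rule above is easy to turn into a quick calculator. A trivial sketch; the example wattages are made up:

    def psu_recommendation(component_watts, headroom=1.5):
        """Rule of thumb: buy a PSU rated ~1.5x your estimated total draw."""
        return round(sum(component_watts) * headroom)

    # Example build: 95 W CPU + 250 W GPU + ~75 W for everything else
    print(psu_recommendation([95, 250, 75]), "W")  # -> 630, so grab a 650 W unit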
This is a very general entry into building computers and what you should buy/look for. If you have any questions/comments send me an e-mail!
-Zac Holley-
submitted by Zac_Attack13 to pcmasterrace [link] [comments]

Console gaming is hardly different from PC gaming, and much of what people say about PC gaming to put it above console gaming is often wrong.

I'm not sure about you, but for the past few years I've been hearing people go on and on about PCs' "superiority" over the console market. People cite various reasons why they believe gaming on a PC is "objectively" better than console gaming, often for reasons related to power, costs, ease of use, and freedom.
…Only problem: much of what they say is wrong.
There are many misconceptions being thrown around about PC gaming vs console gaming that I believe need to be addressed. This isn't about "PC gamers being wrong," or "consoles being the best," absolutely not. I just want to cut through some of the stuff people use to put down console gaming, and show that console gaming is incredibly similar to PC gaming. And yes, this is coming from someone who mainly games on console, but I'm also getting a new PC that I will game on, not to mention the 30 PC games I already own and play. I'm not particularly partial to one over the other.
Now I will mainly be focusing on the PlayStation side of the consoles, because I know it best, but much of what I say will apply to Xbox as well. Just because I don’t point out many specific Xbox examples, doesn’t mean that they aren’t out there.

“PCs can use TVs and monitors.”

This one isn’t so much of a misconception as it is the implication of one, and overall just… confusing. This is in some articles and the pcmasterrace “why choose a PC” section, where they’re practically implying that consoles can’t do this. I mean, yes, as long as the ports of your PC match up with your screen(s) inputs, you could plug a PC into either… but you could do the same with a console, again, as long as the ports match up.
I’m guessing the idea here is that gaming monitors often use Displayport, as do most dedicated GPUs, and consoles are generally restricted to HDMI… But even so, monitors often have HDMI ports. In fact, PC Magazine has just released their list of the best gaming monitors of 2017, and every single one of them has an HDMI port. A PS4 can be plugged into these just as easily as a GTX 1080.
I mean, even if the monitor/TV doesn't have HDMI or AV to connect with your console, just use an adapter. If you have a PC with ports that don't match your monitor/TV… use an adapter. I don't know what the point of this argument is, but it's made a worrying number of times.

“On PC, you have a wide range of controller options, but on console you’re stuck with the standard controller."

Are you on PlayStation and wish you could use a specific type of controller that suits your favorite kind of gameplay? Despite what some may believe, you have just as many options as PC.
Want to play fighting games with a classic arcade-style board, featuring the buttons and joystick? Here you go!
Want to get serious about racing and get something more accurate and immersive than a controller? Got you covered.
Absolutely crazy about flying games and, like the racers, want something better than a controller? Enjoy!
Want Wii-style motion controls? Been around since the PS3. If you prefer the form factor of the Xbox One controller but you own a PS4, Hori's got you covered. And of course, if keyboard and mouse is what keeps you on PC, there's a PlayStation-compatible solution for that. Want to use the keyboard and mouse that you already own? Where there's a will, there's a way.
Of course, these aren’t isolated examples, there are plenty of options for each of these kind of controllers. You don’t have to be on PC to enjoy alternate controllers.

“On PC you could use Steam Link to play anywhere in your house and share games with others.”

PS4 Remote play app on PC/Mac, PSTV, and PS Vita.
PS Family Sharing.
Using the same PSN account on multiple PS4s/Xbox Ones and PS3s/360s, or using multiple accounts on the same console.
In fact, if multiple users are on the same PS4, only one has to buy the game for both users to play it on that one PS4. On top of that, only one of them has to have PS Plus for both to play online (if the one with PS Plus registers the PS4 as their main system).
PS4 Share Play; if two people on separate PS4s want to play a game together that only one of them owns, they can join a Party and the owner of the game can have their friend play with them in the game.
Need I say more?

“Gaming is more expensive on console.”

Part one, the Software
This is one that I find… genuinely surprising. There have been a few times I've mentioned that part of the reason I chose a PS4 is for budget gaming, only to be told that "games are cheaper on Steam." To be fair, there are a few games on PSN/XBL that are more expensive than they are on Steam, so I can see how someone could believe this… but apparently they forgot about disks.
Dirt Rally, a hardcore racing sim game that’s… still $60 on all 3 platforms digitally… even though its successor is out.
So does this mean you have to pay full retail for this racing experience? Nope, because disk prices.
Just Cause 3, an insane open-world experience that could essentially be summed up as “break stuff, screw physics.” And it’s a good example of where the Steam price is lower than PSN and XBL:
Not by much, but still cheaper on Steam, so cheaper on PC… Until you look at the disk prices.
See my point? Often, the game is cheaper on console because of the disk alternative that's available for practically every console-available game. Even when the game is brand new.
Dirt 4 - Remember that Dirt Rally successor I mentioned?
Yes, you could either buy this relatively new game digitally for $60, or just pick up the disk for a discounted price. And again, this is for a game that came out 2 months ago, and even its predecessor's digital cost is locked at $60. Of course, I'm not going to ignore the fact that Dirt 4 is currently (as of writing this) discounted on Steam, but on PSN it also happens to be discounted for about the same amount.
Part 2: the Subscription
Now… let’s not ignore the elephant in the room: PS Plus and Xbox Gold. Now these would be ignorable, if they weren’t required for online play (on the PlayStation side, it’s only required for PS4, but still). So yes, it’s still something that will be included in the cost of your PS4 or Xbox One/360, assuming you play online. Bummer, right?
Here’s the thing, although that’s the case, although you have to factor in this $60 cost with your console, you can make it balance out, at worst, and make it work out for you as a budget gamer, at best. As nice as it would be to not have to deal with the price if you don’t want to, it’s not like it’s a problem if you use it correctly.
Imagine going to a new restaurant. This restaurant has some meals that you can’t get anywhere else, and fair prices compared to competitors. Only problem: you have to pay a membership fee to have the sides. Now you can have the main course, sit down and enjoy your steak or pasta, but if you want to have a side to have a full meal, you have to pay an annual fee.
Sounds shitty, right? But here’s the thing: not only does this membership allow you to have sides with your meal, but it also allows you to eat two meals for free every month, and also gives you exclusive discounts for other meals, drinks, and desserts.
Let’s look at PS Plus for a minute: for $60 per year, you get:
  • 2 free PS4 games, every month
  • 2 free PS3 games, every month
  • 1 PS4/PS3 and Vita compatible game, and 1 Vita-only game, every month
  • Exclusive/Extended discounts, especially during the weekly/seasonal sales (though you don’t need PS Plus to get sales, PS Plus members get to enjoy the best sales)
  • access to online multiplayer
So yes, you're paying extra because of that membership, but what you get with that deal pays for it and then some. In fact, let's ignore the discounts for a minute: you get 24 free PS4 games, 24 free PS3 games, and 12 Vita-only + 12 Vita-compatible games, up to 72 free games every year. Even if you only own one of these consoles, that's still 24 free games a year. Sure, maybe one month's games aren't to your taste; then just wait until next month.
In fact, let's look at Just Cause 3 again. It was free for PS Plus members in August, which is a pretty big deal. Why is this significant? Because it's, again, a $60 digital game. That means with this one download, you've balanced out your $60 annual fee. Meaning? Every free game after that is money saved, every discount after that is money saved. And this is a trend: every year, PS Plus will release a game that balances out the entire service cost, then another 23 more that only add icing to that budget cake. (Though you could instead count games as paying off PS Plus until you hit $60 in savings; either way, the math works out.)
All in all, PS Plus, and Xbox Gold, which offers similar benefits, save you money. On top of that, again, you don't need to have these to get discounts, but with these memberships, you get more discounts.
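If you want to run the break-even math yourself, here's a minimal sketch. The $60 annual fee is the real figure; the per-month game values are hypothetical placeholders you should swap for what each month's lineup is actually worth to you:

```python
# Back-of-envelope PS Plus value check. ANNUAL_FEE is the real $60 figure;
# the monthly game values below are made-up placeholders - use your own.
ANNUAL_FEE = 60.00
monthly_game_value = [60, 20, 15, 0, 30, 25, 10, 0, 40, 15, 20, 35]  # hypothetical

total_value = sum(monthly_game_value)
print(f"Value of the free games to you: ${total_value}")
print(f"Annual fee:                     ${ANNUAL_FEE:.2f}")
print(f"Net:                            ${total_value - ANNUAL_FEE:+.2f}")
# One $60 game you actually wanted (like Just Cause 3 in August) already
# balances the fee; everything after that is savings.
```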
Now, I've seen a few Steam games go up for free for a week, but what about being free for an entire month? And even if you want to talk about Steam Summer Sales, what about the PSN Summer Sale, or again, disk discounts? Now, a lot of research and math would be needed to see if every console gamer would save money compared to every Steam gamer for the same games, but at the very least? The costs will balance out, at worst.
Part 3, the Systems
  • Xbox and PS2: $299
  • Xbox 360 and PS3: $299 and $499, respectively
  • Xbox One and PS4: $499 and $399, respectively.
Rounded up a few dollars, that’s $1,000 - $1,300 in day-one consoles, just to keep up with the games! Crazy right? So called budget systems, such a rip-off.
Well, keep in mind that the generations here aren’t short.
The 6th generation, from the launch of the PS2 to the launch of the next generation consoles, lasted 5 years, 6 years based on the launch of the PS3 (though you could say it was 9 or 14, since the Xbox wasn't discontinued until 2009, and the PS2 was supported all the way to 2014, a year after the PS4 was released). The 7th gen lasted 7 - 8 years, again depending on whether you count from the launch of the Xbox 360 or the PS3. The 8th gen so far has lasted 4 years. That's 17 years that the console money is spread over. If you had a Netflix subscription at its original $8 monthly plan for that amount of time, that would be over $1,600 total.
And let’s be fair here, just like you could upgrade your PC hardware whenever you wanted, you didn’t have to get a console from launch. Let’s look at PlayStation again for example: In 2002, only two years after its release, the PS2 retail price was cut from $300 to $200. The PS3 Slim, released 3 years after the original, was $300, $100-$200 lower than the retail cost. The PS4? You could’ve either gotten the Uncharted bundle for $350, or one of the PS4 Slim bundles for $250. This all brings it down to $750 - $850, which again, is spread over a decade and a half. This isn’t even counting used consoles, sales, or the further price cuts that I didn’t mention.
Even if that still sounds like a lot of money to you, even if you’re laughing at the thought of buying new systems every several years, because your PC “is never obsolete,” tell me: how many parts have you changed out in your PC over the years? How many GPUs have you been through? CPUs? Motherboards? RAM sticks, monitors, keyboards, mice, CPU coolers, hard drives— that adds up. You don’t need to replace your entire system to spend a lot of money on hardware.
Even if you weren't upgrading for the sake of upgrading, I'd be amazed if the hardware you've been pushing with gaming lasted even a third of that 17-year period. Computer parts aren't designed to last forever, and really won't when you're pushing them with intensive gaming for hours upon hours. Generally speaking, your components might last you 6-8 years, if you've got the high-end stuff. But let's assume you bought a system 17 years ago that was a beast for its time, something so powerful that even if its parts have degraded over time, it's still going strong. Problem is: you will have to upgrade something eventually.
Even if you’ve managed to get this far into the gaming realm with the same 17 year old hardware, I’m betting you didn’t do it with a 17 year Operating System. How much did Windows 7 cost you? Or 8.1? Or 10? Oh, and don’t think you can skirt the cost by getting a pre-built system, the cost of Windows is embedded into the cost of the machine (why else would Microsoft allow their OS to go on so many machines).
Sure, Windows 10 was a free upgrade for a year, but that's only half of its lifetime so far: you can't get it for free now, and haven't been able to for the past year. On top of that, the free period was an upgrade; you had to pay for 7 or 8 first anyway.
Point is, as much as one would like to say that they didn't need to buy a new system every so often for the sake of gaming, that doesn't mean they haven't been paying for hardware. And even someone who's only been PC gaming recently will be spending money on hardware soon enough.

“PC is leading the VR—“

Let me stop you right there.
If you add together the total number of Oculus Rifts and HTC Vives sold to this day, and threw in another 100,000 just for the sake of it, that number would still be under the number of PSVR headsets sold.
Why could this possibly be? Well, for a simple reason: affordability. The systems needed to run the PC headsets cost $800+, and the headsets are $500 - $600 when discounted. PSVR, on the other hand, costs $450 for the full bundle (headset, camera, and Move controllers, with a demo disc thrown in), and can be played on either a $250 - $300 console or a $400 console, the latter recommended. Even if you want to say that the Vive and Rift are more refined, a full PSVR set, system and all, could cost just over $100 more than a Vive headset alone.
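Tallied up as plain arithmetic (ballpark prices from above; exact bundle prices vary):

```python
# Rough VR cost-of-entry comparison, using the ballpark prices quoted above.
pc_vr = 800 + 500    # cheapest VR-ready PC + discounted Rift/Vive headset
psvr  = 250 + 450    # PS4 Slim + full PSVR bundle (camera and Move included)
vive_alone = 600     # discounted Vive headset by itself

print(f"Cheapest PC VR setup: ~${pc_vr}")
print(f"Full PSVR setup:      ~${psvr}")
print(f"A full PSVR setup costs ${psvr - vive_alone} more than a Vive headset alone.")
```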
If anything, PC isn’t leading the VR gaming market, the PS4 is. It’s the system bringing VR to the most consumers, showing them what the future of gaming could look like. Not to mention that as the PlayStation line grows more powerful (4.2 TFLOP PS4 Pro, 10 TFLOP “PS5…”), it won’t be long until the PlayStation line can use the same VR games as PC.
Either way, this shows that there is a console equivalent to the PC VR options. Sure, there are some games you'd only be able to play on PC, but there are also some games you'd only be able to play on PSVR.
…Though to be fair, if we’re talking about VR in general, these headsets don’t even hold a candle to, surprisingly, Gear VR.

“If it wasn’t for consoles holding devs back, then they would be able to make higher quality games.”

This one is based on the idea that because of how “low spec” consoles are, that when a developer has to take them in mind, then they can’t design the game to be nearly as good as it would be otherwise. I mean, have you ever seen the minimum specs for games on Steam?
GTA V
  • CPU: Intel Core 2 Quad CPU Q6600 @ 2.40GHz (4 CPUs) / AMD Phenom 9850 Quad-Core Processor (4 CPUs) @ 2.5GHz
  • Memory: 4 GB RAM
  • GPU: NVIDIA 9800 GT 1GB / AMD HD 4870 1GB (DX 10, 10.1, 11)
Just Cause 3
  • CPU: Intel Core i5-2500k, 3.3GHz / AMD Phenom II X6 1075T 3GHz
  • Memory: 8 GB RAM
  • GPU: NVIDIA GeForce GTX 670 (2GB) / AMD Radeon HD 7870 (2GB)
Fallout 4
  • CPU: Intel Core i5-2300 2.8 GHz/AMD Phenom II X4 945 3.0 GHz or equivalent
  • Memory: 8 GB RAM
  • GPU: NVIDIA GTX 550 Ti 2GB/AMD Radeon HD 7870 2GB or equivalent
Overwatch
  • CPU: Intel Core i3 or AMD Phenom™ X3 8650
  • Memory: 4 GB RAM
  • GPU: NVIDIA® GeForce® GTX 460, ATI Radeon™ HD 4850, or Intel® HD Graphics 4400
Witcher 3
  • Processor: Intel CPU Core i5-2500K 3.3GHz / AMD CPU Phenom II X4 940
  • Memory: 6 GB RAM
  • Graphics: Nvidia GPU GeForce GTX 660 / AMD GPU Radeon HD 7870
Actually, bump up all the memory requirements to 8 GBs, and those are some decent specs, relatively speaking. And keep in mind these are the minimum specs to even open the games. It’s almost as if the devs didn’t worry about console specs when making a PC version of the game, because this version of the game isn’t on console. Or maybe even that the consoles aren’t holding the games back that much because they’re not that weak. Just a hypothesis.
But I mean, the devs are still ooobviously having to take weak consoles into mind right? They could make their games sooo much more powerful if they were PC only, right? Right?
No. Not even close.
iRacing
  • CPU: Intel Core i3, i5, i7 or better or AMD Bulldozer or better
  • Memory: 8 GB RAM
  • GPU: NVidia GeForce 2xx series or better, 1GB+ dedicated video memory / AMD 5xxx series or better, 1GB+ dedicated video memory
Playerunknown’s Battlegrounds
  • CPU: Intel Core i3-4340 / AMD FX-6300
  • Memory: 6 GB RAM
  • GPU: nVidia GeForce GTX 660 2GB / AMD Radeon HD 7850 2GB
These are PC only games. That’s right, no consoles to hold them back, they don’t have to worry about whether an Xbox One could handle it. Yet, they don’t require anything more than the Multiplatform games.
Subnautica
  • CPU: Intel Haswell 2 cores / 4 threads @ 2.5Ghz or equivalent
  • Memory: 4GB
  • GPU: Intel HD 4600 or equivalent - This includes most GPUs scoring greater than 950pts in the 3DMark Fire Strike benchmark
Rust
  • CPU: 2 ghz
  • Memory: 8 GB RAM
  • DirectX: Version 11 (they don’t even list a GPU)
So what’s the deal? Theoretically, if developers don’t have to worry about console specs, then why aren’t they going all-out and making games that no console could even dream of supporting?
Low-end PCs.
What, did you think people only game on Steam if they spent at least $500 on gaming hardware? Not all PC gamers have gaming-PC specs, and if devs close their games out to players who don’t have the strongest of PCs, then they’d be losing out on a pretty sizable chunk of their potential buyers.
Saying “devs having to deal with consoles is holding gaming back” is like saying “racing teams having to deal with Ford is holding GT racing back.” A: racing teams don’t have to deal with Ford if they don’t want to, which is probably why many of them don’t, and B: even though Ford doesn’t make the fastest cars overall, they still manage to make cars that are awesome on their own, they don’t even need to be compared to anything else to know that they make good cars.
I want to go back to that previous point though, developers having to deal with low-end PCs, because it’s integral to the next point:

“PCs are more powerful, gaming on PC provides a better experience.”

This one isn’t so much of a misconception as it is… misleading.
Did you know that according to the Steam Hardware & Software Survey (July 2017), the percentage of Steam gamers using a GPU less powerful than the PS4 Slim's is well over 50%? Things get dismal when compared to the PS4 Pro (or Xbox One X). On top of that, the percentage of PC gamers who own an Nvidia 10 series card is only about 20% (about 15% counting just the 1060, 1070, and 1080 owners).
Now to be fair, the large majority of gamers have CPUs with considerably high clock speeds, which is the main factor in CPU gaming performance. But, the number of Steam gamers with as much RAM or more than a PS4 or Xbox One is less than 50%, which can really bottleneck what those CPUs can handle.
These numbers are hardly better than they were in 2013, all things considered. Sure, a PS3/360 weeps in the face of even a $400 PC, but in this day and age, consoles have definitely caught up.
Sure, we could mention the fact that even 1% of Steam accounts represents over 1 million accounts, but that doesn’t really matter compared to the 10s of millions of 8th gen consoles sold; looking at it that way, sure the number of Nvidia 10 series owners is over 20 million, but that ignores the fact that there are over 5 times more 8th gen consoles sold than that.
Basically, even though PCs run on a spectrum, saying they're more powerful “on average” is actually wrong. Sure, they have the potential for being more powerful, but most of the time, people aren’t willing to pay the premium to reach those extra bits of performance.
Now why is this important? What matters are the people who spent the premium cost for premium parts, right? Because of the previous point: PCs don't have some universal performance advantage over the consoles. Developers will always have to keep low-end PCs in mind, because not even half of all PC players can afford the good stuff, and you have to look at the top quarter of Steam players before you get to PS4-Pro-level specs. If every Steam player were to get a PS4 Pro, it would be an upgrade for over 60% of them, and 70% of them would be getting an upgrade with the Xbox One X.
Sure, you could still make the argument that when you pay more for PC parts, you get a better experience than you could with a console. We can argue all day about budget PCs, but a console can’t match up to a $1,000 PC build. It’s the same as paying more for car parts, in the end you get a better car. However, there is a certain problem with that…

“You pay a little more for a PC, you get much more quality.”

The idea here is that the more you pay for PC parts, the performance increases at a faster rate than the price does. Problem: that’s not how technology works. Paying twice as much doesn’t get you twice the quality the majority of the time.
For example, let’s look at graphics cards, specifically the GeForce 10 series cards, starting with the GTX 1050.
  • 1.8 TFLOP
  • 1.35 GHz base clock
  • 2 GB VRAM
  • $110
This is our reference, our basis of comparison. Any percentages will be based on the 1050’s specs.
Now let’s look at the GTX 1050 Ti, the 1050’s older brother.
  • 2.1 TFLOP
  • 1.29 GHz base clock
  • 4 GB VRAM
  • $140 retail
This is pretty good. You only increase the price by about 27%, and you get roughly a 17% increase in floating point speed and a 100% increase (double) in VRAM. Sure you get a slightly lower base clock, but the rest definitely makes up for it. In fact, according to GPUBoss, the Ti managed 66 fps, a 22% increase in frame rate for Battlefield 4, and a 54% increase in MHash/second in bitcoin mining. The cost increase is worth it, for the most part.
But let’s get to the real meat of it; what happens when we double our budget? Surely we should see a massive increase performance, I bet some of you are willing to bet that twice the cost means more than twice the performance.
The closest price comparison for double the cost is the GTX 1060 (3 GB), so let’s get a look at that.
  • 3.0 TFLOP
  • 1.5 GHz base clock
  • 3 GB VRAM
  • $200 retail
Well… not substantial, I’d say. About a 50% increase in floating point speed, an 11% increase in base clock speed, and a 1GB decrease in VRAM. For [almost] doubling the price, you don’t get much.
Well surely raw specs don't tell the full story, right? Well, let's look at some real world comparisons. Once again, according to GPUBoss, there's a 138% increase in hashes/second for bitcoin mining, and at 99 fps, an 83% frame rate increase in Battlefield 4. Well then, raw specs really do not tell the whole story!
Here’s another one, the 1060’s big brother… or, well, slightly-more-developed twin.
  • 3.9 TFLOP
  • 1.5 GHz base clock
  • 6 GB VRAM
  • $250 retail
Seems reasonable, another $50 for a decent jump in power and double the memory! But, as we’ve learned, we shouldn’t look at the specs for the full story.
I did do a GPU Boss comparison, but for the BF4 frame rate, I had to look at Tom’s Hardware (sorry miners, GPU boss didn’t cover the mHash/sec spec either). What’s the verdict? Well, pretty good, I’d say. With 97 FPS, a 79% increase over the 1050— wait. 97? That seems too low… I mean, the 3GB version got 99.
Well, let’s see what Tech Power Up has to say...
94.3 fps. 74% increase. Huh.
Alright alright, maybe that was just a dud. We can gloss over that I guess. Ok, one more, but let’s go for the big fish: the GTX 1080.
  • 9.0 TFLOP
  • 1.6 GHz base clock
  • 8 GB VRAM
  • $500 retail
That jump in floating point speed definitely has to be something, and 4 times the VRAM? Sure it’s 5 times the price, but as we saw, raw power doesn’t always tell the full story. GPU Boss returns to give us the run down, how do these cards compare in the real world?
Well… a 222% (over three-fold) increase in mHash speed, and a 218% increase in FPS for Battlefield 4. That’s right, for 5 times the cost, you get 3 times the performance. Truly, the raw specs don’t tell the full story.
You increase the cost by 27%, and the frame rate in our example game goes up 22%. You increase the cost by 82%, and the frame rate goes up 83%. Sounds good, but increase the cost by 127%, and you only get a 79% increase in frame rate (the gain now trails the cost by almost 50 points). Increase it by 355%, and you increase the frame rate by 218% (trailing by almost 140 points). That's not paying "more for much more power," that's a steep drop-off after the third cheapest option.
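If you'd like to check this math or swap in your own benchmark numbers, here's the whole comparison as a short script. The prices are the retail figures above; the BF4 frame rates are the GPUBoss/Tom's Hardware results quoted above, with the 1050's ~54 fps baseline implied by those percentages:

```python
# Price vs. performance scaling for the GTX 10 series, relative to the 1050.
cards = {
    "GTX 1050":     (110, 54),    # baseline (~54 fps implied by the % figures)
    "GTX 1050 Ti":  (140, 66),
    "GTX 1060 3GB": (200, 99),
    "GTX 1060 6GB": (250, 97),    # Tom's Hardware figure
    "GTX 1080":     (500, 172),   # ~218% over the baseline
}

base_price, base_fps = cards["GTX 1050"]
for name, (price, fps) in cards.items():
    cost_up = (price / base_price - 1) * 100
    fps_up = (fps / base_fps - 1) * 100
    print(f"{name:12s} +{cost_up:3.0f}% cost -> +{fps_up:3.0f}% frame rate")
```

Past the 1060 3GB, every extra percentage point of cost buys less than a point of frame rate, which is exactly the drop-off described above.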
In fact, did you know that you have to get to the 1060 (6GB) before you could compare the GTX line to a PS4 Pro? Not to mention that at $250, the price of a 1060 (6GB) you could get an entire PS4 Slim bundle, or that you have to get to the 1070 before you beat the Xbox One X.
On another note, let’s look at a PS4 Slim…
  • 1.84 TFLOP
  • 800 MHz base clock
  • 8 GB shared GDDR5 (system and video memory)
  • $300 retail
…Versus a PS4 Pro.
  • 4.2 TFLOP
  • 911 MHz base clock
  • 8 GB shared GDDR5 (system and video memory)
  • $400 retail
A 128% increase in floating point speed and a 14% increase in clock speed, for a 33% increase in cost. Unfortunately there is no Battlefield 4 comparison to make, but in BF1, the frame rate is doubled (30 fps to 60) and the textures are taken to 11. For what that looks like, I'll leave it up to this bloke. Not to even mention that you can even get the texture buffs in 4K. Just like how you get a decent increase in performance per dollar with the lower-cost GPUs, the same applies here.
It’s even worse when you look at the CPU for a gaming PC. The more money you spend, again, the less of a benefit you get per dollar. Hardware Unboxed covers this in a video comparing different levels of Intel CPUs. One thing to note is that the highest i7 option (6700K) in this video was almost always within 10 FPS (though for a few games, 15 FPS) of a certain CPU in that list for just about all of the games.
…That CPU was the lowest i3 (6100) option. The lowest i3 was $117 and the highest i7 was $339, a 189% price difference for what was, on average, a 30% or less difference in frame rate. Even the lowest Pentium option (G4400, $63) was often able to keep up with the i7.
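Same scaling check on the CPU side, using the prices above and the ~30% average frame-rate gap from that video:

```python
# CPU version of the scaling check: i3-6100 vs. i7-6700K,
# using the prices quoted above and the rough ~30% average fps gap.
i3_price, i7_price = 117, 339
fps_gap = 30  # percent, the rough average from the Hardware Unboxed video

print(f"Price increase: +{(i7_price / i3_price - 1) * 100:.0f}%")
print(f"Frame-rate increase: roughly +{fps_gap}% at best")
```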
The CPU and GPU are usually the most expensive and power-consuming parts of a build, which is why I focused on them (other than the fact that they’re the two most important parts of a gaming PC, outside of RAM). With both, this “pay more to get much more performance” idea is pretty much the inverse of the truth.

“The console giants are bad for game developers, Steam doesn't treat developers as bad as Microsoft or especially Sony.”

Now one thing you might've heard is that the PS3 was incredibly difficult for developers to make games for, which for some fueled the idea that console hardware is difficult to develop for compared to PC… but this ignores a very basic idea that we've already touched on: if the devs don't want to make the game compatible with a system, they don't have to. In fact, this is why Left 4 Dead and other Valve games aren't on PS3: they didn't want to work with its hardware, calling it "too complex." This didn't stop the game from selling well over 10 million units worldwide. If anything, this was a problem for the PS3, not the dev team.
This also ignores that games like LittleBigPlanet, Grand Theft Auto IV, and Metal Gear Solid 4 all came out in the same year as Left 4 Dead (2008) on PS3. Apparently, plenty of other dev teams didn’t have much of a problem with the PS3’s hardware, or at the very least, they got used to it soon enough.
On top of that, when developing the 8th gen consoles, both Sony and Microsoft sought to use CPUs that were easier for developers to work with, which included decisions that considered the consoles' use for more than gaming. Besides, using their single-chip proprietary CPUs is cheaper and more energy efficient than buying pre-made CPUs and boards, which is a far better reason for using them than some conspiracy about Sony and MS trying to make devs' lives harder.
Now, console exclusives are apparently a point of contention: it's often said that exclusivity can cause developers to go bankrupt. However, exclusivity doesn't have to be a bad thing for the developer. For example, when Media Molecule had to pitch their game to a publisher (Sony, coincidentally), they didn't end up being tied into something detrimental to them.
Their initial funding lasted for 6 months. From then, Sony offered additional funding, in exchange for Console Exclusivity. This may sound concerning to some, but the game ended up going on to sell almost 6 million units worldwide and launched Media Molecule into the gaming limelight. Sony later bought the development studio, but 1: this was in 2010, two years after LittleBigPlanet’s release, and 2: Media Molecule seem pretty happy about it to this day. If anything, signing up with Sony was one of the best things they could’ve done, in their opinion.
Does this sound like a company that has it out for developers? There are plenty of examples that people will use to put Valve in a good light, but even Sony is comparatively good to developers.

“There are more PC gamers.”

The total number of active PC gamers on Steam has surpassed 120 million, which is impressive, especially considering that this number is double that of 2013’s figure (65 million). But the number of monthly active users on Xbox Live and PSN? About 120 million (1, 2) total. EDIT: You could argue that this isn't an apples-to-apples comparison, sure, so if you want to, say, compare the monthly number of Steam users to console? Steam has about half of what consoles do, at 67 million.
Now, back to the 65 million total user figure for Steam: the best reference I could find for PlayStation's number was an article giving the number of registered PSN accounts in 2013, 150 million. In a similar 4-year period (2009 - 2013), the number of registered PSN accounts didn't double, it sextupled, or increased six-fold. Considering that the PS4 has already reached 2/3 of the PS3's lifetime sales despite having had several years less time on the market, I'm sure this trend is at least generally consistent.
For example, let’s look at DOOM 2016, an awesome faced-paced shooting title with graphics galore… Of course, on a single platform, it sold best on PC/Steam. 2.36 million Steam sales, 2.05 million PS4 sales, 1.01 million Xbox One sales.
But keep in mind… when you add the console sales together, you get over 3 million sales on the 8th gen systems. Meaning: this game was best sold on console. In fact, the Steam sales have only recently surpassed the PS4 sales. By the way, VGChartz only shows sales for physical copies of the games, so the number of PS4 and Xbox sales, when digital sales are included, is even higher than 3 million.
This isn’t uncommon, by the way.
Even with the games where the PC sales are higher than either of the consoles, there generally are more console sales in total. But, to be fair, this isn't anything new. PC gamers have never dominated the market; the proportions have always been about this. PC can end up being the largest single platform for a game, but consoles usually sell more copies in total.
EDIT: There were other examples but... Reddit has a 40,000-character limit.

"Modding is only on PC."

Xbox One is already working on it, and Bethesda is helping with that.
PS4 isn't far behind either. You could argue that these are what would be the beta stages of modding, but that just means modding on consoles will only grow.

What’s the Point?

This isn’t to say that there’s anything wrong with PC gaming, and this isn’t to exalt consoles. I’m not here to be the hipster defending the little guy, nor to be the one to try to put down someone/thing out of spite. This is about showing that PCs and consoles are overall pretty similar because there isn’t much dividing them, and that there isn’t anything wrong with being a console gamer. There isn’t some chasm separating consoles and PCs, at the end of the day they’re both computers that are (generally) designed for gaming. This about unity as gamers, to try to show that there shouldn’t be a massive divide just because of the computer system you game on. I want gamers to be in an environment where specs don't separate us; whether you got a $250 PS4 Slim or just built a $2,500 gaming PC, we’re here to game and should be able to have healthy interactions regardless of your platform.
I’m well aware that this isn’t going to fix… much, but this needs to be said: there isn’t a huge divide between the PC and consoles, they’re far more similar than people think. There are upsides and downsides that one has that the other doesn’t on both sides. There’s so much more I could touch on, like how you could use SSDs or 3.5 inch hard drives with both, or that even though PC part prices go down over time, so do consoles, but I just wanted to touch on the main points people try to use to needlessly separate the two kinds of systems (looking at you PCMR) and correct them, to get the point across.
I thank anyone who takes the time to read all of this, and especially anyone who doesn't take what I say out of context. I also want to note that, again, this isn't "anti-PC gamer." If it were up to me, everyone would be a hybrid gamer.
Cheers.
submitted by WhyyyCantWeBeFriends to unpopularopinion [link] [comments]

Telling anyone to not be upset at Vega's pricing is ignoring the real problem here

TL;DR: There is no MSRP for Vega. It officially doesn't exist. There is only SEP, and it just muddies the waters further.
I know there will be some people who pull out the "free market" excuse, saying that AMD could do whatever they liked with pricing.
Yes, of course they can.
But let's look at why this is such a problem. First, they skirted their way out of any legal issues by using SEP instead of MSRP. MSRP has some legal basis behind it in US law, while SEP is literally just a made-up term, based on studies suggesting that future e-tail pricing may have to become dynamic to keep the economy healthy and profitable.
Then, they didn't correct reviewers when they pushed out their price/performance metrics for Vega 64 and 56 at $499 and $399 respectively. It looks better for AMD with that pricing attached, and very few reviewers are going to go back on this. So, on paper, Vega 56 starts looking like a real winner.
With only SEP pricing out in the wild, no one can really say what Vega's MSRP should have been. AMD has left things open for them to bump the price up or down as needed to spur sales or profit from mining runs. At $599 for Vega 64, they're earning enough money to be profitable. $499 would arguably not leave them enough headroom, with every other company also needing to take a slice out of the pie.
In a technical sense, this would not reflect badly on AMD given current GPU shortages that have resulted in price gouging anyway. Finding high-end cards in my country is already difficult with miners sucking things dry thanks to Bitcoin's recent surge. If AMD is selling out of Vega with the mining craze, and making bank, it shouldn't, in theory, look bad to their investors, because investors shouldn't care where the sales come from. The games bundle doesn't have "free games", it's just games tacked on resulting in a price increase. They're not losing money from that.
But AMD's target market isn't miners. Lisa Su said it herself that mining isn't a growth driver for them. Yet they still target it, leaving gamers without a competitive option at the high end, and without any mid-range cards to buy because they're not going to ramp things up in case the bubble bursts suddenly. When it does burst, they won't see increased sales because of the second-hand market flood.
If they choose to respond to that, they're damned if they do, and damned if they don't. So they're doing nothing to bring up Polaris stock. And now they're legally wriggling out of anything that can be complained about when it comes to Vega's pricing, because they're going to sell out no matter what, and more money is better than feel-good marketing campaigns with "real gamers" in the "rebellion" and ecosystem sales like a FreeSync monitor.
Legally it's a grey area. Morally, it's downright wrong. But since they're a company, and a company doesn't necessarily need or have ethics or morals, there's no downside. The only loser here is the gamers, because they have nothing to buy and only NVIDIA to support.
submitted by CataclysmZA to realAMD [link] [comments]

A quick guide to GPUs and cryptocurrency.

There seems to be a lot of misinformation going on about cryptocurrency and how it ties into the new and used GPU markets. I want to try to help alleviate some of the false rumors as well as put together a quick resource to answer any questions someone may have. This is not to sway anyone to or away from crypto or anything like that. I just keep seeing threads that go something like this:
OP: GPU prices are so high right now
C1: Oh it's because of bitcoin
OP: elaborate
C1: bitcoin is mined 24/7 using huge GPU farms. The miners snap up any and every new GPU they find.
OP: well I'll just buy used!
C1: bad choice jabroni. The miners run these cards so hot and heavy that they'll instabreak as soon as you buy them.
OP: oh well when will it stop?
C1: bitcoin just crashed so we should see some new GPUs in stock soon
OP: thanks dude!
This is an exaggerated exchange, but it is based on some of the more common things I see sometimes and it gets the point across. The fact is, a lot of people that comment are trying to be as helpful as possible, and that is absolutely a good thing. The problem comes when trying to water down the information to make it easier to comprehend for a beginner. That’s where I’m hoping this guide comes in. I’ll start off by explaining the process of mining as concisely as possible while still keeping all of the relevant information. After that, I’ll try to debunk/explain some of the myths surrounding mining and GPUs. Let’s hop in.
Mining in a nutshell:
Cryptocurrency mining is the act of using a CPU, GPU, or ASIC (a specially designed device) to solve mathematical problems that verify transactions. That's a very general definition, but we can work with it. Within the general aspect of mining, there are different algorithms used for different currencies. Some currencies use the same algorithm, so you end up with more currencies than algos. Some algorithms are "ASIC-resistant," so the best way to mine them is with a GPU.
Now how do you actually get paid by mining? That's simple. Generally speaking, any time any cryptocurrency is sent anywhere, it must go through a verification process and will incur transaction fees. Who gets those fees (along with the block reward of newly minted coins)? The same people that verify the transactions: miners. That simplifies the process a good bit, but it is still truthful.
That’s a good bit of background information. Let’s traverse a bit deeper and cover how somebody gets into mining. I’ll start by saying this: Hardly anybody mines Bitcoin. A little more emphasis for the people in the back. HARDLY ANYBODY MINES BITCOIN. Why is that? Because Bitcoin is not profitable unless you are using a high dollar ASIC. That is beyond the scope of this guide, so I’m not going to go there.
There are two ways for the Everyman to mine: choosing a coin and mining through a pool, or using what I call a “smart miner” like NiceHash. For the sake of simplicity, let’s use NiceHash from here on. NiceHash is mostly on par with the profitability of pool mining, and it is also a lot easier to set up. This simplicity is inviting to beginners and experts alike.
With NiceHash, you never actually mine coins for yourself. You sell your hashing power to those that are buying hashing power. You switch between whichever algorithm is most profitable and mine whatever coins the buyers are paying for, however, you are paid in bitcoin. Confusing? Cool, let’s get simpler. Say you have any amount of GPUs in a rig. In order to mine, download NiceHash, set up your account, and press the start button. Now you’re mining with power! In order to make this profitable enough, you’re going to want a rig with multiple GPUs. A common goal is 6-8 GPUs per motherboard w/ multiple motherboards.
So we’ve got all the background we need for now. Let’s tie in some market information. Currently almost all crypto currencies are down significantly, but that doesn’t change the fact that it’s incredibly volatile and could skyrocket in the next week. We have to deal with the fact that yes, crypto is down, but most people will continue mining because it will inevitably go back up. We may potentially see a drop off in miners buying cards until the return of investment goes back up. It’s kind of dismal right now.
Myth Madness:
M: Bitcoin is causing GPU prices to skyrocket.
E: Let’s replace a few words here. This is more accurate: “cryptocurrency is a big factor in the increase in GPU prices.” This is better, but still doesn’t tell the full story. While cryptocurrency is jacking up the demand for GPUs, a RAM shortage as well as the fact that manufacturers just can’t keep up is keeping supply down.
M: Miners snap up every available new card.
E: This one is pretty true, however, most miners aren’t just snapping up these cards at MSRP, they’re paying the same price everybody else would pay. In the market’s current state, the higher prices of GPUs hurt the return of investment especially when the dollar value of cryptocurrency is down.
M: Miners are only interested in AMD!/Nvidia!
E: This one is a common misconception. Some cards do outperform others, but that does not mean 6 RX 580s will outperform 6 GTX 1080s. Generally speaking, the most powerful card for gaming will be the best performer for mining. A 1080 Ti will earn around 6 USD per day while a 580 may earn 3 USD per day. Return of investment is actually pretty similar if you go by MSRPs. You pay more, you earn more, to an extent. Most miners that are in it purely for profit are going for the 1060 6GB and up, or the 570 8GB and up.
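To see why return of investment is the metric that actually matters here, consider a hedged sketch. The $6 and $3 per-day figures are the rough numbers above; the card prices, wattages, and $0.10/kWh electricity rate are my own assumptions, and real earnings swing wildly with coin prices and difficulty:

```python
# Rough return-of-investment sketch for mining cards.
# Card prices, wattages, and the kWh rate are assumptions - plug in your own.
def days_to_break_even(card_cost, gross_per_day, watts, kwh_rate=0.10):
    power_cost = watts / 1000 * 24 * kwh_rate  # dollars of electricity per day
    net_per_day = gross_per_day - power_cost
    return card_cost / net_per_day if net_per_day > 0 else float("inf")

print(f"GTX 1080 Ti: {days_to_break_even(700, 6.00, 220):.0f} days to break even")
print(f"RX 580:      {days_to_break_even(350, 3.00, 150):.0f} days to break even")
```

Similar break-even on both ends of the price range, which is the point: you pay more, you earn more, and the ratio stays roughly flat.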
M: Mining cards are run at full power 24/7 and are useless after they’re done with them.
E: The 24/7 part is true. Most rigs run constantly, but almost all miners undervolt their cards. Undervolting barely affects the hashrate, while running at stock power draws more electricity than the card actually needs, which cuts into overall profit. In addition to this, heat is still an enemy, as it is with a high-performance gaming rig. To combat this, most rigs are in a cold room with as much airflow as possible in the form of multiple case fans, external box fans, ceiling fans, even portable AC units pointed directly at the cards. Miners are all about protecting their investments. The only wear that could be a problem is fan wear, which can be an easy fix.
M: This will pass when bitcoin crashes.
E: Cryptocurrency is in it for the long haul. 2017 was honestly the first year of widespread acceptance of crypto. It’s best to look at return of investment when talking about crypto and GPUs. With crypto down, GPUs won’t be bought as much. GPU prices go down, return of investment gets back to what it was. More GPUs are bought. It’s a huge ebb and flow type of thing.
Let’s go back to the original exchange from the beginning and see how someone could potentially answer the OP.
OP: GPU prices are so high right now.
C1: Unfortunately, we're in a phase where manufacturers can't keep up with the demand for them.
OP: Why's the demand so high?
C1: A big reason is cryptocurrency. Crypto miners are able to make a big profit off of mid-high end cards, so they buy them as fast as possible. Lower end cards are affected as well, because gamers that can't get the mid-high range are settling for low range to fit in their budget.
OP: Well how about used?
C1: If you're getting a current gen card, you should ask plenty of questions regarding the condition it was kept in. Last gen cards like a 970 or 980 are probably alright and will rival the performance of some 10 series cards.
OP: What if I wait it out?
C1: I wouldn't advise waiting it out. It could take several months for this phase to pass. I would buy a 970 or 980 to hold you over until prices even out.
That’s a lot more informed as well as being a lot more helpful to the OP. The best advice I can give if you want a high end card is to be vigilant about deals. Some places are offering GPUs at MSRP if you buy other components with it. That’s a great option if it’s your first build. All in all, we’re in a pretty troublesome time for PC building with the price of ram going up as well as the price of GPUs.
I don’t claim to be an expert so if there are any corrections or additions that need to be addressed, please let me know and I’ll fit them in.
submitted by maybemao to pcmasterrace [link] [comments]

I had like 3 friends ask me how to build a PC in the past week so I made this to help them. Feel free to use or send me an e-mail if you want the txt file

(Reddit Edit: Help me improve the document with productive comments on what I got wrong or messed up! I'm only human lol
Also a lot of this is supposed to be kinda humorous. I didn't think I had to say that but, hey, its the internet.
I appreciate the positive and productive comments! )
Beginners basic guide to building your own PC as of early 2018
(EDIT: Sorry for being a MSI/Corsair Fanboy)
Here's a collection of thoughts to consider when building your own personal PC.
As always, I'd personally use PCPartPicker.com to configure your parts and for further thoughts on compatibility.
First off, building a computer is 100% based around what you plan to use the computer for.
Here are a few uses and generic ideas of what to go for.

Audio Editing: Lots of small tasks that need to be completed quickly without lag.
- Fast processor (>4GHz)
- Fast RAM (MHz) - at least 16 gigs!
- Fast storage, SSD mandatory - M.2 or PCIe for best performance.
- Shitty graphics card; it's only there to keep the CPU from handling display tasks while you work. Can be a few generations or years old.
- Many screens, for lots of plug-in windows to be open.

Video Editing: Lots of large files to render and read.
- Multi-core processor, the more the merrier.
- SSD for fast read/write of large video files.
- Insane graphics card; AMD cards are debatably better, but the Nvidia Quadro series is built specifically for video rendering.

Gaming: No more than 4 cores, intense graphics card.
- 92% of games are not coded for more than 4 cores, so why spend the extra money for them.
- SSD for quick load screens.
- Nvidia 10 series cards; the higher the number, the better. Titan cards for MAXIMUM OVERDRIVE!

Coding: Quick processor for lots of small tasks. Ergonomic peripherals?
- Dear god, please don't use a mechanical keyboard, so that your coworkers don't kill you.

Home Office: Everything can be a few gens behind so you can get the best power per dollar spent.
- Sorry that Gateway doesn't exist anymore. I guess try Dell...
Parts (Expensive Legos)
CPU (tells things to go places and outputs data) Basically three main routes to go for: Intel, AMD, or ASIC. Intel - Gaming, Data center, Hackintosh Pros: Cooler, Faster speed (GHZ), short small tasks faster Cons: $$$$, less cores AMD - Gaming, Personal Computing, Large task processing Pros: Lots of cores, better price per performance, faster processing of large tasks Cons: Hot chips, large chips?, compatibility issues with MacOS. ASIC - "Application-specific integrated circuit" Pros: Does the task that they are made to do insanely efficently, great for mining. Cons: Literally does nothing else. Holy hell these are expensive, very hot (fans will get loud) CPU Cooler (Im a big fan) Most come with an in box cooler that are ok but please buy aftermarket. In Box - the free shitty cooler that comes with the processor. Pros: Free. Cons: Ugly, makes chip run hot, hard to clean Air cooler - oldest type of cooler but new designs are highly efficent. Pros: Only cooler that has the possibility of being 100% quiet, most likely cheaper Cons: large, if cooler isnt large enough for the chips thermal output the fans will be loud. Liquid - Custom pipes are beautiful, AIO is easy to install and offers similare performance. Pros: Looks cool, great temperatures, "quiet" Cons: Water pump has possibility of being loud, possible spills Phase Change - uses the technology of refridgerators to cool the chip Pros: Can overclock until the chip breaks. (whats colder than cold? ICE COLD!) Cons: Loud (compressor noise), Large pipes, just why.... Motherboard (the convienacnce store of computer parts) Really just about what type of I/O you want. - MAKE SURE FORM FACTOR FITS YOUR CASE! (or vice versa) - Look for PCI lanes for expansion. - How many graphic cards do you have? - PCI based interfaces? - PCI SSD? - PCI DAC? - PCI WIFI? - USbs? Network? Audio? - How many lanes of RAM? - DOES IT FIT YOUR PROCESSOR!?! (really tho) - M.2? - How many sata interaces? Good Brands: MSI, ASUS, Gigabyte Bad Brands: AS(s)Rock, Dell Memory (Dory) - The more the merrier - No less than 8gb for a functional windows machine (16 gb to never have a problem) - Use all the lanes your computer has to offer! the more lanes to access the faster the data can travel! -Imagine drinking a milkshake. If the straw is wider you can drink more of the milkshake than a skinny straw. - Faster MHZ for faster data access but give minimal performance differances - Please get ram with heat spreadders unles youre building a server with high airflow. - Make sure the type (DDR3 or DDR4) of RAM matches what your processomotherboard call for. Good Brands: Corsair, G.Skill, Ballistix Storage (Grandpa that remembers everythign about how things used to be but takes forever to learn a new tasK) Speed or massive storage? slower is cheaper. Golden ratio of speed/storage/price is 250-500 gb SSD and a 1+ tb disk drive. *Max speeds listed are for a single drive not RAID* Hard Disk Drives (HDD) - Cheapest and slowest - read/write speeds of < 0.5gb/s - 7200+ RPM or GTFO - Higher Speed drives can access data faster. - Do not move while powered up. physical parts will break. 
- Larger cache = faster read/write speeds
 - Pros: Cheap, holds massive amounts of data
 - Cons: Slower than molasses in a freezer
 - Reputable brands: Seagate, WD
Solid State Drives (SSD) - a necessity for quick boots and fast load screens (can only be re-written so many times)
- SATA based (2.5 inch) - read/write speeds capped @ 6 Gb/s
 - Pros: Most economical; form factor fits with old computers
 - Cons: "Slow" compared to other SSDs (but still 12 times faster than an HDD)
- M.2 based - read/write speeds capped @ 10 Gb/s
 - Pros: The size of a stick of gum! High end, but not too expensive to be out of reach.
 - Cons: Expensive for any size over 500 GB
- PCIe based - read/write speeds capped @ 20 Gb/s for PCIe 3.0 x4
 - Pros: HOLY BANDWIDTH BATMAN! Faster than that little creepy ghost that's always in the corner of your eye
 - Cons: You might have to take out a loan to buy one. *takes up an x4 PCIe lane*
- Reputable brands: Samsung! Corsair, Plextor, Intel, Kingston, Crucial

Video Card (that one kid that has thick glasses and is really good at math)
- A regular old PCIe card that handles all of the video rendering and output for your computer - essentially an ASIC on a PCIe card.
- The PCBs and chips are patented by two main companies, but the differences come from each company's lineup and the manufacturers' varying cooling designs.
- The more memory the better.
- NVIDIA (Team Green) - great for gaming; has specific card series for intensive rendering. Lazy driver updates.
 - Gaming:
  - 900 series - cheap, low performance; can play any video game made before 2010 on max settings
  - 1000 (ten) series - expensive (thanks bitcoin miners...), great for VR!
 - Video rendering: Quadro series
 - Gaming and rendering:
  - Titan X - Maxwell-based chip, same as the 900 series cards
  - Titan Xp - Pascal-based chip, same as the 10 series cards
- AMD (Team Red) - the underdog; does the same thing slightly worse and cheaper (except video rendering).
 - Gaming:
  - RX 400 series - cheap, hot
  - RX 500 series - cheap; OK at VR and decent gaming frame rates. Not bad, but not particularly great either.
 - Video rendering: FirePro series
 - Gaming and rendering: Vega series - good luck finding one to buy lmao

Case (fancy clothing for your parts!)
- Similar to human clothing: you want it to do a few main things really well, with compromises at each extreme.
- Durability:
 - Steel - incredibly durable, creates a Faraday cage for components, heavy af, magnets (just magnets....), rusts over time
 - Aluminium - light, easy to bend for modding or "physical maintenance", less likely to rust, huzzah for Faraday cages!
 - Plastic - just don't: no electrical ground, no Faraday cage, but light AF!
- Breathing (airflow):
 - Positive internal airflow!
 - Larger fans push the same amount of air with less speed/noise.
- Looks: window? RGB? Cool paint?
- Fits all your parts:
 - Graphics card length/clearance
 - Support for liquid cooling radiators?
 - How many spots for HDDs/SSDs?
 - Motherboard format
 - Cable management!

Power Supply (FIGHT MILK)
- Rule of thumb: buy a power supply that outputs 1.5 times the wattage you need (see the quick sizing sketch after this parts list).
- You can walk further than you can run: a PSU can casually output 50-75% of its rated power for much longer than it can sustain 90-100% (without failure).
- If you never demand enough wattage for it to get hot, the fan doesn't have to turn on, making it quieter.
- Modular means you can remove/replace the cables from the PSU.
- Reputable brands: Corsair, EVGA

Optical Drive (motorized cup holder)
- You can download most things today, so I'd suggest against one unless you really NEED to watch/write DVDs/CDs.

Operating System (software that makes everything work)
- Windows (always updates)
 - Compatible with just about everything
 - Easy to learn to code on!
 - POS initial browser
 - Likely to get viruses
- Linux (penguins are cute)
 - Unique; takes less resources to run
 - Barebones
 - Incredibly personalizable!
 - Compatibility issues with just about everything
- MacOS (Linux but more annoying)
 - It is legal!
 - Great for art and your grandma that doesn't know how to use computers!
 - User friendly
 - Compatibility issues with various hardware
 - Confusing/limiting coding structure

Peripherals (cables everywhere!)
- Keyboard (a higher polling rate is better)
 - Mechanical: the key registers at an exact stroke length, every time
- Mouse (a higher polling rate is better)
 - More buttons = better?
 - DPI (Dots Per Inch): in theory, if a mouse has 1600 DPI, then if you move your mouse one inch (2.54 cm), the cursor will move 1600 pixels. The higher the DPI, the faster your cursor can be moved.
- Monitor
 - In theory the human eye can't see faster than 60 frames per second.
 - Keep pixel density in mind! A 4K screen that is 22 inches will have more pixels in a square inch than a 4K screen that is 28 inches.
 - Interface?
  - DVI (digital, with analog pin support) - thumbscrews.....; dual-link versions support very high resolutions
  - VGA (analog) - thumbscrews...; max resolution is 1440p
  - DisplayPort (digital) - nice button clip; supports 4K
  - HDMI (digital) - 1.2 or higher supports 4K
- DAC/Speakers/Headphones - don't even get me started
- Microphone - don't get me started, pt. 2

Other (other)
- UPS (uninterruptible power supply) - just a battery that gives your computer some time if the power ever goes out, so that you have time to save your work.
- Cable organization materials! Zipties, velcro
- LED LIGHTING! - mandatory
- Extra/better fans - more pressure, less woosh
- iFixit Pro Tech Toolkit - because who buys just one Torx wrench.
- Cute kitten mousepad - yes, it has to be a cat. Don't argue.
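As promised in the Power Supply section, here's the 1.5x rule of thumb as a quick sketch. The component draws below are rough guesses for a mid-range build, not measurements; look up the TDP of your actual parts:

```python
# The 1.5x PSU sizing rule of thumb. Wattages below are rough guesses
# for a mid-range build - check the TDP of your actual parts.
draws = {
    "CPU": 95,
    "GPU": 180,
    "Motherboard": 50,
    "RAM and drives": 30,
    "Fans and misc": 20,
}

peak = sum(draws.values())
print(f"Estimated peak draw: {peak} W")
print(f"Suggested PSU rating: {peak * 1.5:.0f} W or more")
# Sized this way, the unit loafs along at 50-75% load even under stress,
# stays efficient, and rarely needs to spin its fan up.
```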
This is a very general entry into building computers and what you should buy/look for. If you have any questions/comments send me an e-mail!
-Zac Holley-
submitted by Zac_Attack13 to buildapc [link] [comments]

[Discussion] My own personal guide to used hardware alternatives.

Hi there. My name is Jeff. I've been building systems for the better part of 15 years and try my best to contribute here actively. After being involved in this little community for a few years now, I'm noticing a serious lack of discussion about buying used components, and I feel like it's time to shed a little light on the subject for those looking to build on a (seriously) tight budget.
As Linus said in his scrapyard wars video, buying new on $300 isn't practical, and if you posed the challenge to him on a random day, buying used is almost certainly the path he'd choose. As someone who's been "scrapyarding" as a hobby for the better part of 10 years, I figured I'd take some time to share some of what I've learned for the modern audience.
Let's begin with a simple rundown of modern "budget" choices, and I'll tell you what I'd do instead.
CPU
The G3258 and Athlon 860k are the sub-$100 CPUs of choice, and both work just fine. I have built with both in the past, and each carries their own set of advantages.
Used Alternatives: You can go in a couple of directions here; if you happen to have an LGA 1366 motherboard lying around, you can get an i7 920 or better for under $50, and they still hold up reasonably well. Being that LGA 1366 boards are not typically cheap when purchased used, my favourite option is the Phenom II x4 Black Edition series, each of which compare favourably to modern budget options, and will even overclock on some incredibly dated, dirt cheap AM2+ boards. In my experience, eBay prices on these get a little too high for my taste, but I've been able to nab several on Kijiji locally in Toronto for under $50 as well.
GPU
The R7 260x and GTX 750 ti are often cited as budget options for most builders, with the latter serving a very specific role in systems where power draw might be a concern. While there exists no option that can complete with the low consumption of the 750 ti (or even the single 6-pin connector goodness of the 260x), its performance can easily be matched (and exceeded) for less money.
Used Alternatives: The bitcoin mining craze from a few years back led to the Radeon 7950 and 7970 being blacklisted on the used market, and I think the fears about burned-out cards are a little overblown. Here in Toronto, you can easily grab a 7950 for the price of a 260x, but I don't pay anywhere near that for my builds. At most, a Windforce will cost me $125, as where I recently picked up some non-boost edition PowerColor versions for a mere $83 each (bought 3 for $250).
EDIT: Forgot to mention something important - avoid the reference 7950 and 7970. They were employed to a far greater degree in mining rigs because of their rear-only exhaust, and if you see a bunch of them from the same seller listed at once, they're likely old mining cards. Only pick them up if they're incredibly cheap.
Want to go even cheaper? The Radeon 6950 (with the shader unlock, preferably) or even the 6970 will rival the performance of the 260x, and shouldn't cost Canadians more than $50-$60. I personally have 2 in my possession right now, and have gone through at least a dozen in the last 6 months.
In general, one should always avoid Nvidia when buying used, because they are far too popular and overvalued for their performance as they age. I still see GTX 660s selling for $150, which is absolutely absurd.
Motherboards
Motherboards on the used market are weird, and this can largely be attributed to the fact that they're hard to transport and don't handle well over time. As such, people don't really sell boards on their own that often, and you'll likely have more luck finding a combo of some kind (or even a ready-to-go tin-can with no graphics card) for less per part than you will finding a given board on its own.
Used Alternatives: The boards I'd recommend depend entirely on the CPU you've chosen. Being that I'm a fan of the Phenom II x4 series, AM2+ boards are going to be dirt cheap, but DDR2 RAM is actually fucking expensive, so you'd likely be better off going with AM3. I've even seen some used AM3+ boards (The 970 ASRock Extreme3, in particular) for as low as $40, so it wouldn't hurt to look.
On the Intel side, you're actually at a significant disadvantage. Much like Nvidia cards, Intel boards (and CPUs) actually retain their value and don't often come cheap. For me, LGA 1156 is the price/performance sweet spot, granted I can find an i7 8XX to go with it. Even still, they're going to run you a fair bit more than an AMD board, and likely aren't worth it by comparison.
RAM
RAM is RAM. DDR2 is pricey as fuck due to an obvious market shortage of the stuff, so the AM2+ board option might not be best by comparison. DDR3 RAM, however, is ubiquitous, and I always die a little inside when people building on a "budget" choose to buy new at all. If I'm being honest, I can get DDR3 RAM from e-waste recycling companies for as low as $10 per 4GB stick at 1333MHz, and not once have I ever had a bad stick of the stuff. Even for people going the route of the G3258 (which only supports 1333MHz), this is the clear winner.
Is value RAM ugly as sin? Sure it is. Is it just as good as that fancy Ripjaws shit you've got in your current build? You betcha.
Storage
Hard Drives are actually a tricky game, as they are the single most volatile component in any budget build, easily succumbing to wear and tear from age and daily use. As such (and some might find this hard to believe) I actively avoid HDDs when building value systems for people and opt for cheap SSDs instead. As always, check the date on a drive if you're really insistent on buying one, and considering how cheap a WD blue is new, don't pull the trigger on one unless it's for less than $30/TB.
SSDs are, akin to RAM, highly resilient and are nearly guaranteed to work when purchased used. The average SSD pulled from an old laptop or an off-lease office desktop will have no more than 100GB of writes on it, which leaves 99% of its life for you to exploit. While there exists no specific recommendation for which brand to buy, just be sure you're getting a relatively good drive with SATA III capability. 120/128GB variants of these sorts should cost you no more than $50 in my native Canada, and I've even gotten lucky on some larger sizes too. Recently I picked up 4 256GB Samsung 840 Pros for $75 each (I came), just days after I bought a Crucial MX100 of the same size for $85.
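If you want to sanity-check a used drive before handing over cash, the arithmetic is dead simple: most consumer SSDs publish a rated TBW (terabytes written), and SMART will tell you how much has actually been written. A minimal sketch of the math, assuming you've pulled the raw numbers out of something like `smartctl -A` yourself; the attribute values and the 72 TBW rating below are made-up illustrations, not specs for any particular drive:

```python
# Used-SSD health check: compare SMART write totals against rated endurance.
# The figures below are illustrative placeholders; read the real ones from
# `smartctl -A /dev/sdX` (Total_LBAs_Written or a vendor equivalent).

def ssd_life_used(lbas_written: int, sector_bytes: int, rated_tbw: float) -> float:
    """Fraction (0.0-1.0) of the drive's rated endurance already consumed."""
    tb_written = lbas_written * sector_bytes / 1e12
    return tb_written / rated_tbw

# Example: a 128GB office pull reporting ~9.8 billion LBAs written,
# 512-byte sectors, against a hypothetical 72 TBW rating.
used = ssd_life_used(lbas_written=9_800_000_000, sector_bytes=512, rated_tbw=72.0)
print(f"~{used:.1%} of rated endurance consumed")  # ~7.0% -- barely broken in
```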
Monitors
Monitors are fun to buy, because the recent shifts in display technology have rendered a lot of recent-but-obsolete models nearly valueless. For example, remember when 16:10 was a thing? I actually still like 1680x1050 monitors, but the rest of the world seems to disagree, so I've been able to pick up 23" variants for as little as $40. Since the slightly lower resolution actually eases the strain on your VRAM a bit, it's a nice fit for a lot of budget cards that might not have a full 2GB available, like some variants of the 6950. 1600x900 monitors are often just as cheap and come with the same inherent benefit of being obsolete despite being almost as good as their bigger 1080p cousins. (The arithmetic behind that claim is sketched below.)
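Here's the quick pixel math; nothing fancy, just resolution arithmetic:

```python
# How much less work the "obsolete" resolutions actually ask of a GPU per frame.
base = 1920 * 1080  # 1080p pixel count

for name, (w, h) in {"1680x1050": (1680, 1050), "1600x900": (1600, 900)}.items():
    px = w * h
    print(f"{name}: {px:,} pixels ({px / base:.0%} of 1080p)")
# 1680x1050 shades ~85% of the pixels of 1080p; 1600x900 only ~69%.
```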
Keyboards and Mice
If you're on a budget, we can't even have this discussion. As much as I like mechanical keyboards and high-precision gaming mice, people building used $300 systems aren't going to allot any of their budget buying them. That said, wired USB keyboards and mice are virtually free (search your local goodwill or value village for some), and if you have to pay money, buy a wireless combo for $20 new from some little shit store in a suburb somewhere.
Cases
Cases on their own sell for about half of their original retail price, give or take based on the condition. I normally just get them as a part of a tin-can bundle and make use of them if they aren't too dirty, but when building for someone else, I'd often just prefer to buy a new budget case in the $40 range.
PSUs
I saved this topic for last, because it's by far the most difficult category to master. First off, you really need to do your research and understand how PSUs work before delving into these guys, as the cost associated is almost entirely dependent on how resilient the underlying platform has been proven to be. Generally speaking, reading reviews on JonnyGuru and HardOCP is a great start, but none of them account for units that are several years old.
As a general rule of thumb, I use the EVGA 500W W1 as a reference point, and build my value tree around that. In other words, if a new EVGA 500W (a passable, proven budget unit) is cheaper than a used 500W variant of a better brand, why would I bother buying used? Sure, that 520W Seasonic S12II puts the EVGA to shame in terms of voltage regulation and ripple suppression, but can I really make the same claims of a unit that's 5 years into its life? Wouldn't I just be safer buying new? These are all factors you have to consider.
For me, the threshold lies around 50% in terms of cost savings vs. risk. In other words, if you can find a used quality unit for less than half the price of the cheapest quality unit available at a given time, buy it.
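Put as a sketch, with prices invented purely for illustration:

```python
# The 50% threshold rule from above, written out.
def buy_used_psu(used_price: float, cheapest_quality_new: float) -> bool:
    """Worth the risk only if the used unit is under half the new price."""
    return used_price < 0.5 * cheapest_quality_new

# e.g. a used Seasonic S12II at $35 vs. a new EVGA 500 W1 at $45:
print(buy_used_psu(35, 45))  # False -- just buy new
print(buy_used_psu(20, 45))  # True  -- now the risk is priced in
```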
Anyhow I think that covers everything. And as a closing note, remember to be safe. Meet potential sellers (and buyers) in public, well-lit places, and try your best to avoid entering someone's home without some protections in place. Also, the more info you get about the person (address, phone number, etc) the less likely it is that a person will be trying to scam you. People who purposely conceal their identity do so for a reason.
Also, feel free to ask me anything about my own experiences buying and selling used. I've been doing it as a hobby for a long, long time and have sold many value builds to people who can't otherwise afford PCs. I'm happy to impart any wisdom I might've gained over the years.
Edit: CPU Coolers! Forgot those. Air coolers are a safe bet. They're slabs of copper and aluminum with fans strapped to them. Buy with confidence, and seek one out for $10-$15 if you plan to overclock. AIO water cooling is not so safe. Those things are typically only good for 2-3 years, and you have no idea how much longer a pump has before it gives. Budget builders likely aren't water-cooling anyhow, right?
Edit 2: Just to be clear, when I said I'd been doing this for a long time, I should clarify that a) I once owned a game store and sold systems out of there and b) I currently resell systems out of my house to raise money for charity builds. I really don't want people to get the impression I'm trying to sell anything.
submitted by Oafah to buildapc [link] [comments]

Advice? Going a bit insane.

I have a simple set-up I've been trying to get running stable for about a week now. I have 2x NVIDIA GTX 980 and 2x Radeon R9 280X cards that I want to use.
I have previously used just the GTX 980s and they were stable for weeks. I recently acquired the 280s, and while I can't be certain, I think the cards are not at issue.
I also had two of the 980s and one 280 running in my desktop stable for quite some time, just as a tester. But for the actual mining rig, I was using old AMD AM3-based hardware. And I've had nothing but problems. Murphy's law.
First, I had a motherboard go out (ASUS M4A87TD EVO). Luckily I had a spare. But the ASUS board had two native PCI-E x16 slots (one running at x1 speed), so I had been running the 980s both in the motherboard, and was using this riser extender to run the other cards, with SATA->molex powered risers (I don't have enough native 4-pin connectors to run without SATA-to-molex adapters).
The "new" board (which is actually chronologically older and older tech) is a Gigabyte GA-MA785GM-US2H. It's also socket AM3 but only supports DDR2 memory; luckily I had old DDR2 available. This board also only has one x16 slot, so I am running 3 GPUs outside of the chassis with x16 risers.
It turns out after some nightmare of troubleshooting that one of the DDR2 DIMMs is probably bad. I don't have a replacement at the moment, but I've tried running everything with one DIMM.
I also have Virtual Memory (swap/page file) set to 16GB. I don't know why this is necessary, but other people mention it and NiceHash wouldn't let me run 4 cards without it.
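(The usual explanation I've seen is that the mining software commits virtual address space roughly in proportion to the total VRAM across all cards, so Windows wants a page file at least that big even though it's rarely touched. A rough sketch of the sizing rule; the 2GB headroom is a rule of thumb, not an official NiceHash figure:)

```python
# Rule-of-thumb page file sizing for a multi-GPU Windows mining rig:
# at least the sum of VRAM across all cards, plus a little OS headroom.
CARDS_VRAM_GB = [4, 4, 3, 3]  # 2x GTX 980 (4GB each) + 2x R9 280X (3GB each)

def suggested_pagefile_gb(vram_gb, headroom_gb=2):
    # headroom_gb is a guess, not a documented NiceHash requirement
    return sum(vram_gb) + headroom_gb

print(suggested_pagefile_gb(CARDS_VRAM_GB))  # 16 -- matches the 16GB that worked
```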
Anyway, all works "fine;" I have the 280s set to 850MHz core clock and +20 power with 100% fans, and 100% fans and native clock on the 980s.
But I keep getting crashes. Before, I had freezes, so I also changed out power supplies several times and tried everything you can think of. I don't think the issue is down to power any more, but I'm open to correction if anyone has a suggestion or tip.
For power, I'm using a 3 to 1 ATX-24 pin adapter, with a SeaSonic 620W as the master, and a Corsair 550W as the "slave." The SeaSonic powers the motherboard and the ATX12V for the CPU. It also powers the 3 SATA to molex adapters for the x16 risers, and the 8+6 pin PCI-E power connectors for the 280s.
The Corsair powers the HDD (this I have also suspected of causing problems, but I have swapped HDDs and had the same issues, and I really need to get an SSD for this thing), two 120mm intake fans, two 140mm exhaust fans, and the 6-pin PCI-E connectors for the 980s.
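For what it's worth, a worst-case budget of that split (board-power figures are ballpark TDPs: ~165W per GTX 980, ~250W per R9 280X, up to 75W of each card's draw arriving via the slot/riser; the CPU number is a guess for an AM3 chip) suggests the load is very lopsided:

```python
# Worst-case rail budget for the dual-PSU split described above.
CPU_AND_BOARD = 175   # ~125W AM3 CPU + ~50W board/RAM (guesses)
SLOT_SHARE = 75       # worst-case slot/riser draw per card

# SeaSonic 620W: board + CPU, slot power for all four cards (via the
# motherboard and the SATA->molex-fed risers), plus both 280X PCIe plugs.
seasonic_load = CPU_AND_BOARD + 4 * SLOT_SHARE + 2 * (250 - SLOT_SHARE)

# Corsair 550W: both 980s' PCIe plugs, plus drive and case fans.
corsair_load = 2 * (165 - SLOT_SHARE) + 40

print(f"SeaSonic 620W: ~{seasonic_load}W worst case")  # ~825W -- over spec
print(f"Corsair 550W:  ~{corsair_load}W worst case")   # ~220W -- barely loaded
```

Real mining load sits well below those worst-case numbers, but the asymmetry is the point: nearly everything lands on the SeaSonic, so shifting the 280X plugs (or the riser power) over to the Corsair would be a cheap experiment before blaming drivers.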
It doesn't seem to freeze anymore. It will mine just fine for a while but eventually I will get a BSOD, usually saying that the device driver has gone into an infinite loop. I believe this is the Crimson driver (I am using version 15.12.)
I've swapped CPUs and memory and HDDs and motherboards and I'm just almost at my wits' end about to tear my hair out trying to get this thing to run stably. I most likely just need to get all new mainboard/CPU/memory, but I can't do that at the moment, so I am looking for any advice/help I can get.
I would assume the driver is going into an infinite loop because it's polling hardware and not getting the response it expects or something like that, which could indicate a power or heat issue, but I don't know. And I've searched for the BSOD error, and 90% of the results for "device driver went into an infinite loop" (paraphrasing; the actual stop code is most likely THREAD_STUCK_IN_DEVICE_DRIVER) were related to AMD Crimson drivers. So that's been where my suspicion has been so far, and I've had other problems with getting the AMD stuff to work when I was initially setting it up. (And I've had problems with AMD/ATi drivers going back 20 years, ugh.)
So.. Any advice?
TL;DR: Running 2x GTX 980 and 2x R9 280X; keep getting BSOD about device driver going into an infinite loop after some time mining. Want to stab myself.
submitted by mutilatedrabbit to EtherMining [link] [comments]

[Build Help]Need a good PCIe Riser cable for a gaming graphics card

Hello, I was wondering if anyone knows a GOOD and RELIABLE PCIe x16 riser cable? Most, if not all, of the cables I've seen on Amazon have only up to like 3 1/2 stars, or have 4-5 stars but barely any reviews. The reviews are all mixed; they each say either that it works perfectly fine, or that SOMETHING burnt out.
My case is a slim-line case (pre-built), so the new PSU that I will buy will not fit in it, nor will my graphics card. I don't want to get a new CPU/Motherboard YET, so I'm holding off getting a completely new case until later, that's why I want a pcie x16 riser card, so that my graphics card will sit outside my case.
Another thing: if my motherboard has a PCIe x16 3.0 slot, and my graphics card is PCIe x16 3.0, but my riser is PCIe x16 2.0, it will run at 2.0 speeds, right? (Or no?) (Although I've heard that at the moment it doesn't matter, because most 3.0 cards haven't even saturated the 2.0 bandwidth cap.) I just wanted to clarify that because none of the cables I've seen specify "1.0 or 2.0 or 3.0", so I don't know what I'm getting. What if the cable is actually a 1.0? That would be bad news, or would it?
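(For reference, my understanding is that the link trains at the highest generation every piece in the chain can handle, so a 2.0-rated riser caps a 3.0 card at 2.0 signaling. The theoretical ceilings, as a quick sketch using the standard published per-lane rates:)

```python
# Theoretical PCIe bandwidth per direction for an x16 link.
# Effective per-lane rates: 1.0 ~250 MB/s, 2.0 ~500 MB/s (8b/10b encoding),
# 3.0 ~985 MB/s (128b/130b encoding).
PER_LANE_MB_S = {"1.0": 250, "2.0": 500, "3.0": 985}

for gen, per_lane in PER_LANE_MB_S.items():
    print(f"PCIe {gen} x16: ~{16 * per_lane / 1000:.1f} GB/s")
# 1.0 x16: ~4.0 GB/s, 2.0 x16: ~8.0 GB/s, 3.0 x16: ~15.8 GB/s
```

A GTX 750 doesn't come close to saturating even the 1.0 figure in games, which is why riser generation rarely shows up in benchmarks as more than a few percent.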
I intend to use the cable on an NVIDIA GTX 750 for gaming, not Bitcoin/Litecoin mining (mentioned because EVERYONE uses risers for mining, where I'm aware the cable does NOT impact performance).
Thank you for taking your time to read this, and I await your reply.
submitted by KingTarrion to buildapc [link] [comments]

Low framerates while GPU usage is low

Hello Tech support,
I have already sent a ticket to Sapphire, as well as asked on their forums, but no replies yet, and I would also like answers from people not in the Sapphire/AMD community.
I will copy/paste every bit of information I have given them (prepare for a wall of text haha), and I will also answer all questions you have honestly :) Please help me, as I have tried a lot and gained nothing.
Here it goes.
Since I got my GPU about 9 months ago, its performance has been pretty bad, but I didn't notice it too much because I wasn't playing many recent games. Now, however, I've started playing more graphically intensive games, and the performance of this graphics card is lacking, almost as if there is a bottleneck in my PC (which should be impossible considering the parts in my PC).

For example, in Borderlands 2 the framerate gets as low as 40 frames per second, or even lower, while MSI Afterburner (or similar programs) tells me my GPU usage is under 50%. An even more extreme case of this is TERA Online, in which I sometimes get only 20 or even 10 FPS, all while my GPU usage is under 30%. (My laptop gets around the same FPS in the same game at the same settings, but my laptop has a 6770m which gets about 100% usage.)

My PSU should suffice, my CPU is fast enough, my RAM is fine, and my temperatures are low (the highest ever recorded is 65 degrees Celsius). When running a performance testing program like 3DMark, the GPU suddenly gets 100% usage, as it does when mining bitcoins (which I only tested to see how the GPU performs; I normally don't mine bitcoins). Also, in NONE of my tests does my CPU have high usage, nor are there heavy background programs running. The ICT people I showed this to made sure of that as well. (A per-core CPU check is sketched below, after the spec lists.)

I would like to know why I have this problem and how I can solve it. I have let ICT people take a look at it and they find nothing wrong. It can't be unoptimized games, because my laptop gets high usage in the same games, and a friend of mine who has a PC with the same CPU but an Nvidia GPU gets full usage as well. I am clueless about this problem and would like your support. Included with this ticket is a screenshot of me playing TERA, with Task Manager, MSI Afterburner and GPU-Z open. I hope you can help me.
Included with this ticket is the following picture: http://i.imgur.com/Ac7QvW7.jpg
Also, here is a list of my hardware, as I'm supposed to give (also in the support ticket as extra information)
Main Hardware Information:
CPU: AMD FX-8350 Black Edition
RAM: 2 sticks, each 4GB Corsair Vengeance 1600MHz
Motherboard: Gigabyte 990FXA-UD3 (revision 4.0) (on the latest BIOS)
Video Card: Sapphire R9-280X Vapor-X, running at stock speeds (only changed power use to **50% to see if that made a difference; it does not)
PSU: Corsair CX750M
Other Hardware Information:
Monitor: Iiyama 1920x1080 monitor, standard 1280x1024 monitor as secondary (not always activated)
HDD/SSDs: SanDisk 64GB SSD (boot); Samsung 840 EVO 120GB (games); WD Black 1TB (games and data)
Other PCI/PCI-X Cards: None
Optical Drives: Standard CD/DVD drive (Super WriteMaster SpeedPlus)
Software Information:
Operating System: Windows 8.1, latest updates (also had this problem on Windows 7)
Video Card Drivers in Use (and tried): Currently 14.7 (GPU drivers always kept up to date)
Motherboard Drivers Used: Latest ones downloaded from the website
Apps/Games causing issues (and if they've been patched): Generally most of them
All major components check 100% after thorough testing (e.g. memory & Memtest86): Yes, every part in my PC is fine
System clean of viruses/spyware etc: Yes, PC checked regularly using Avast
Other things tried: Overclocking the CPU (will try an Nvidia 770 soon, when a friend of mine can spare it, to see how the PC reacts)
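One more data point worth capturing: an overall CPU usage number can hide a single pegged core, and one maxed thread will cap GPU usage in exactly this way (TERA in particular is notoriously single-threaded). A quick sketch using the psutil library (`pip install psutil`):

```python
# Log per-core CPU load while the game runs; a single core pinned near 100%
# with low overall usage is the classic single-thread bottleneck signature.
import psutil

for _ in range(15):  # ~15 seconds of samples
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    hottest = max(per_core)
    print(f"hottest core: {hottest:5.1f}%  all cores: {per_core}")
    if hottest > 95:
        print("-> one core is maxed; the GPU is waiting on the CPU")
```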
submitted by astraeasan to techsupport [link] [comments]

Merry Christmas /r/buildapc! Got $500. I want to upgrade my GPU to something hardcore but I can't decide between nVidia or ATI. Help me choose please!

I'm an oldschool builder. I've had cards of both types over the years, but right now I just can't decide! I've had good cards and bad cards from both manufacturers.
Understanding this is quite a budget for a GPU, I want something beefy. I am going to build a new rig this coming spring, and I plan on putting this GPU upgrade in it when the time comes, but for the moment I wanted something to get me by until then. My current mobo is an EVGA P55LE with an i5 750 @ 2.67GHz. I don't overclock, but I am worried I might get some bottlenecking. If that happens, there's nothing to be done but build the new rig when I can.
Main goal: Replace my GTX 560 Ti. It's only got 1.25GB of VRAM, and I'm realizing you need at least 2GB these days, ideally 3 or more if you're going to play with all the graphics maxed at 1080p at a decent framerate. That's why I'm willing to shell out more dough for a nicer GPU replacement. What can I say? I like me some good graphics.
Here are the pros and cons as far as I see them:
nVidia: Looking at something in the 700 series.
PROS: Stable and established. Less expensive (at the moment). Good numbers as far as benchmarking is concerned. After RMAing ( -_- ), my GTX 560 has been good to me; it just can't keep up with the latest and greatest software anymore. I admit the need to RMA left a bad taste in my mouth, but that shit happens, and this card has been quite decent, if a little power hungry.
CONS: Apparently the new ATI R9 series is more bang for your buck... or at least it was until litecoin mining stepped in and fucked all that up. We'll come back to that. I've heard rumors nVidia is about to release something new though, so maybe waiting might be worth it, if anything just for the price of the 700 series to drop.
ATI: Looking at the new R9 series.
PROS: Across the board these cards seem to deliver more power for less money. Or at least they did until litecoin and bitcoin mining sold them out everywhere and jacked all the prices up (lookin' at you, Newegg... shame on you). For the new rig I'm still going to stick with an Intel CPU, but I was thinking about switching to a Gigabyte mobo, so getting a GPU that plays nice is important to me.
CONS: Fucking litecoin. Seriously. Also, it looks like a lot of the R9s are having technical problems. Lots of RMAs from what I've been reading and plenty of unhappy people. Could it be that this new architecture was rushed a little, perhaps? Might be worth waiting for the next gen to be released.
Anyways, that's where I'm at with this. I really want a new GPU since I've acquired a lot of new software over the holiday season, but I'm still stuck on whether or not to wait, and which route I should go this time.
Thanks in advance to anyone who shares opinions/wisdom! Happy Holidays /buildapc!
submitted by OkamiKnuX to buildapc [link] [comments]

[Table] IamA splat, editor/moderator/reviewer on overclockers.com and sysadmin at a cancer research organization. AMA!

Verified? (This bot cannot verify AMAs just yet)
Date: 2013-08-13
Link to submission (Has self-text)
Questions & Answers
This just got cross-posted to /sysadmin; as a fellow research-field-oriented sysadmin, it gets worse... I too started in the Quake/HL/CS/TF timeframe, but got my degree in CompSci. Have you ever dealt with mice (the mammal kind; I've got worse stories)? Certs: just got my RHCSA this year. I've got the RHCE scheduled for October, and I'm studying for the CCNA, though I use HP switches.
How do you backup desktops / servers? Backups: Luckily, I don't do desktop support. We have another IT group that does that; I'm completely independent from them and I only have to take care of servers (and my own desktop). The physical servers are backed up to tape with Bacula. Our virtual servers are backed up with Veeam. My own desktop is backed up to my NAS share using SyncToy (yes, I use Windows on my desktop).
How much disk space do you have in one server? One-off systems: As in physical servers built by hand? 0. I'm pretty much a Fujitsu shop with a few Dells. I definitely don't have time to be piecing servers together. Disk space: only a few TB per server. I think the better answer would be that we have an Isilon X200 cluster that is 140 TB.
One-off systems: As in physical servers built by hand? More as in unique software, such as "this computer runs the HPLC." I guess in that case I only manage a handful of physical servers and a few VMs that are made for running one special piece of software or analyzing data from one piece of scientific equipment. We have many other scientific devices that are attached to PCs that are "community" devices, but I don't have to manage them. And we've got a microscopy group that is separate from me too, with their own machines and devices.
If you are moving to 1gbs are you looking to increase the MTU? I was working on that but had some issues with firewalls for my Windows/PuTTY users. First, just to clarify, we're going to 10G from the 1G we have right now. I'm not our main network guy, so I'm not entirely sure, but I doubt we'll change the MTU, simply because we don't have a remote site, so the majority of our traffic is regular internet traffic.
As for our backend network, I do use jumbo frames on a couple VLANs for our storage.
The most important question for any sysadmin... vi or emacs? Vi improved.
Anand Shimpi and Dustin Sklavos had an interesting podcast on the merits of Haswell on the desktop. In short, Dustin echoes the enthusiast community's frustration with the overclocking headroom decreases from Sandy Bridge -> Ivy Bridge -> Haswell. It seems like IPC has gone up but maximum frequency has gone down, so the ratio seems almost 1:1. Then there is the issue of the use of TIM and the IHS glue cap that caused some to delid their CPUs (and void their warranties). Question 1: What are your thoughts on the overclocking headroom decreases that we've seen? Question 2: Is Intel doing enough to cater to the enthusiast community? Question 3: How do you feel about the delay in the release of Enthusiast parts by Intel (Sandy Bridge-E & Ivy Bridge-E) versus mainstream parts (Sandy, Ivy, and Haswell)? Intel makes good chips and they do keep pushing technology forward, but they will never do overclockers any favors. They will always be doing whatever they can to make money. AMD will also do the same thing. Intel seems to think enthusiast solely means "deep pockets". At the same time, there always seems to be a lot of "the sky is falling" reporting done by many tech journalists. Intel hasn't completely forgotten about overclockers and I don't think they ever will completely let that group disappear. And really, what incentive does Intel have to completely lock out overclockers? Sure, deny us our warranty; we'll go ahead and buy another chip and give you more money. How could you deny that as a company? As for overclocking headroom decreases, one can only hope that means we've got a whole new architecture coming out soon, something like the transition from Pentium 4 to Core.
Do you have a home lab setup to learn/test on? If so, what does it consist of? At home I've got a 1U Dell PowerEdge sitting in a closet which is my main server. I run bageez.us off it, which was supposed to be my way of giving back to the community, by running a Linux torrent site. Other than that I've got two HTPCs running Debian, a desktop Windows machine for gaming/reviewing hardware, and a file server with 8 TB running Debian and KVM with a few Debian VMs.
Do you still have that site going? I tried your link but it didn't work. Looks like I let the SSL cert expire. I'll fix that tomorrow. It works on my end but I think I want to recode a few things and possibly get it to work with other trackers. Right now the torrents will only work with my local tracker.
Need to monitor that ;D. Yeah it's one of those things where I seem to be the only one visiting the site, so why stress about it. I also set up owncloud, but again, i'm the only one that uses it. :(
Do you get to keep the hardware you review? - Do you prefer the black theme or the white theme? Most of the time, yes.
Black. I don't mind the white theme that much tho. edit: he's asking about the forum default skin at overclockers.com/forums
What is your #1 piece of advice for any Linux sysadmin? That's a tough one. Do you mean someone looking to become a sysadmin or one that is already a sysadmin?
I guess I didn't specify that, did I? I ask the question because I've been doing mostly Windows sysadmin duties for about 2 years and some Linux admin stuff. I'm falling in love with Linux and I would love to have a job dedicated to just *nix. What advice/suggestions would you give someone that is wanting to make the transition? I think what really got me the best knowledge was forcing myself to use a "less polished" distro as my main rig for a few years. Once you are forced to learn, you'll learn quickly. Picking up an RHCSA book will help too, even if you don't plan on taking the exam. Go through it and do the exercises. Install a distro, set it up, then format and do it all over. You can use VirtualBox for the same result without killing your main rig.
Do you still use FreeBSD? If so, what exactly do you use it for now? No, but I wish I did. I stopped using it because the GPU support in Linux was better on my desktop, and now I work mostly with CentOS, and it would be a lot of work to change 100ish servers over to FreeBSD.
What did you use to train yourself in everything? Just break and fix? Pretty much just the experience of using it daily on my desktop for years. Running gentoo and Slackware really gets you used to doing things for yourself.
Configuration management of choice for those 18 servers? I'm just a jack of all trades sysadmin with a strong focus on problem solving. Are you trying to cure cancer with those 18 nodes or mining bitcoins? I started playing around with puppet but haven't really gotten the hang of it. Right now the cluster is running ROCKS with Grid Engine, and I just use the rocks commands to provision/wipe nodes.
What's the hardest part about getting started with puppet? I think it's mostly just finding the time to sit down and immerse myself in it.
700+ centos nodes across a few clusters here and I'm loving ansible. Nice. I've heard that ROCKS becomes a bear at scale, but for now it's pretty simple and quick. My plan is to keep adding another 18 nodes every year (one full blade cluster) every year, as long as I can get funding, so I'm keeping my eyes open for other solutions for provisioning. Bright cluster manager is another one I have on my radar.
Computer didn't work for 5 months (it started, then after I downloaded Skyrim from Steam it shut off, then finally worked last month). Put my new graphics card in, then problems ensued. Here: Link to www.reddit.com. First step I'd do is remove all nonessential parts from the computer. Leave the CPU and 1 stick of RAM. Pull out the graphics card, don't connect any hard drives or CD drives. On the back, connect the monitor to the onboard video and connect the keyboard. Does it power on? Do you get any error messages other than it saying there is no OS? Then power down and connect things one by one until you figure out what part is causing the problem. If you think it's the drivers, you can boot into safe mode (I hope Windows 8 still has that; press F8 while booting), then run Driver Sweeper to remove the graphics drivers. I haven't tried this on Windows 8 so I'm not sure if it will run or not. I don't think you need to do a full format and reinstall.
I'll try this tomorrow after work for sure. Do you reddit enough that i could contact you for more advice for help if i run into anything else? (i did contact nvidia team for help, they just told me to delete old drivers without any other help then those words). I don't blame you if you don't want to say you are able to help me with this situation. Humans be humans. Was there a specific reason to go into a cancer research lab? Or was it just a job that came around? No I don't go into photoshopbattles. I pretty much just do what I need for websites and that's it.
How do you like your baked potatoes? (please get into specific detail). It just happened to be the job I found but I love the environment. Much different than a corporate job.
I'm not a fan of baked potatoes but I do love curly fries if that counts for something.
You should really join us in the BAPC IRC channel. I do hang out in the unofficial Overclockers.com irc channel quite a bit. I'll try to drop by.
Do you do any sort of automation for firmware updates? Firmware automation? Nope, and I don't think I'd ever want such a thing. I've been looking at puppet as a way to automatically update software though.
I saw below you guys have some Dell servers, what models and do you use their Lifecycle Controller? We have a couple r610 servers and an equallogic storage box. I haven't heard of this life cycle controller.
What are the specs of your personal rig? Intel i7 3770k @ 4ghz.
Zalman CNPS9900LED cooler.
Patriot ddr3 2x2gb @ 800mhz cas7 (rated for 1200mhz cas9 but I can't boot at that speed anymore for some reason)
MSI Z77A-G41.
ATI Radeon HD 6870.
OCZ Revodrive X2.
How come you have a 3770k but only 4GB of RAM and a 6870? Seems a little overpowered in the CPU category. For benchmarking, mainly. The 3770k was our standard platform for reviews when I bought it. The rest is leftovers from various reviews. We don't get paid, so basically we work for hardware when we write reviews, more or less.
Wait when you review hardware you get stuff? Yes, hardware vendors provide review samples.
Have you ever had an OEM send you equipment different from the consumer version? (Say a factory overclocked version) and claiming it was the standard. Nope. Even if they did, we'd certainly review it as the hardware is, not as they intended it to be.
What's the worst PC loadout you've ever seen? PC load letter? What the f does that mean?
Folding@home? JK, doesn't work well on a cluster unfortunately. Unless you have any perls of wisdom on how to make it work on a cluster? Well, it would work just as it does on any other group of computers. I'd have to run one client on each computer and they'd all check back to get their own workloads, so it would really take out the "cluster" usage and turn them just into regular blade servers.
How old are you? Young 30s.
Have you gone to college and completed a bachelor's degree, if not, do you regret it? Yes, BS in Mechanical Engineering.
How did you prove yourself to be worthy of that initial Jr. Sys. Admin job? I listed everything I could think of that I've done that was computesysadmin related. I had administered several web servers over the years, and experimented with many different distributions as my daily driver on my main desktop, so I was very comfortable on the command line and with day to day tasks. I was asked a few 'test' questions on the interview but I think they were more to gauge exactly what i did and didn't have experience with, not so much to make or break me.
Lastly, congrats on doing what you love for a living. Cheers to your future. And thanks. I definitely wake up in the morning with a different attitude than I used to, and that makes a big difference.
Configuration Management / Vagrant / Clouds. I have start playing with configuration management, but haven't gotten anything in production yet. I only provision new VMs every once in a while, and once the computer nodes are up they are pretty stable.
What is your scripting language of choice? I use straight up bash for most things, and python for some. I'm trying to learn more python.
How do you feel about some distros moving away from init.d and going to systemd? I like init.d because it's what I know. Systemd is just a different way of doing things, I'm sure I'll like it once I learn it.
As a OCF Member I have to ask, What is the most extreme cooling you have dealt with?(LN2, Phase Change, Water, D-Ice, etc.) LN2, at the benching party in philly last year. We definitely need to get one of those on schedule again. Also, my work has LN2 and D-ice sitting around but I haven't asked if it's ok for me to play with those yet. One day, i'll ask, and it will be awesome if they say yes. fingers crossed.
So, can I have some of your left over gear? Joking, heh heh... Seriously though, got any gear that's collecting dust? Mostly by the time we're ready to part with gear, it's not worth much and is terribly outdated. Or, it's been burned up by pushing too many volts.
What do you do with the old gear? Do you scrap up a functional computer and donate it to a charity, or just proper e-waste recycling? If it's not on my computer or benching station, it's in my closet. And my wife doesn't like the amount of computer stuff in my closet, so I'm sure I'll start looking for some way to recycle stuff soon.
Where does a young grasshopper starts to learn all of these materials wise one? Well, you could get yourself a RHCSA prep book (linked to the one i have and found useful) and go through all of the exercises. The way I learned was basically to set up my own servers, either physical or virtual, at home, and run them. I think FreeBSD, Gentoo, and Slackware were the most beneficial to me in that they don't really make choices for you, so you have to configure things for yourself which forces you to read the documentation and learn. They all have excellent documentation, btw. If you want to go a step further, linux from scratch will really teach you about the operating system from the ground up.
From there, come up with little projects for yourself. Like making a home NAS, setup NFS and Samba shares, install XBMC on a HTPC and hook it up to your tv to stream movies and music. Setup a webserver and owncloud. Stuff like that.
Sorry I'm late but... how old were you when you first starting tinkering with Linux and such? I'd like to be a sysadmin or similar when I finish school so I figured you were the right person to ask. I was 19 when I first made that half life/counterstrike server. I didn't even know what ssh was and it took a good amount of explaining for me to finally understand. The freebsd documentation is amazing and will walk you through just about everything step by step. To get NAT configured I had to use another how to but setting up that server taught me a ton.
Are you an Nvidia or an AMD guy? It's changed several times over the years. I used to be solely Nvidia because of Linux, but AMD has been stepping up their game and getting their drivers usable, so I currently run all AMD.
How much of a PITA is it for you to be HIPAA compliant? It's not really that tough. Luckily there's only a couple projects going on right now that have special needs above and beyond regular security needs.
What do you use for storage? We have a few Jetstor SANs, a couple Promise RAID boxes, and an Equallogic box as our VMWare backend. But our main mass storage is Isilon X200.
Whoops, my bad: meant 1.18, not 1.8; it'd be gone if it was 1.8. Sorry. I am using a Hyper 212 EVO in the standard push configuration. Well, 1.18 is too low for 4.4GHz.
Only 4gigs of ram in your rig ? Yeah...I've got 16 in my work PC for running VMs, and 16 in my VM host at home too. I'll probably buy more soon.
Oh ok, what V would I go to? I was able to initially get 4.4 with 1.18 and 0 whea errors, what V would you recommend? This is my first oc btw. Bump it up one step at a time until you are stable. Be methodical about it. You can check out what values other people are getting on hwbot.org.
Ok Ill do that, thanks man, at what V if the errors dont go away should I stop advancing them? Most likely you will want to stay around 1.6v. I'm not very familiar with that chip specifically so I'd check hwbot to see what other people have posted and go by that. Obviously remember that not all chips are the same, so you can't expect to get exactly what other people get.
1.6, that seems a bit high for my 212 EVO, a few days ago I did have it at 1.18 without any WHEA 20 errors. That's why I'm saying take it slow, one step at a time.
What do you think of this quote by Richard J. Schwartz? "The impact of nanotechnology is expected to exceed the impact the electronics revolution has had on our lives." Sounds good to me. I can't wait to see what comes next.
Actually nodes, or are some of them VMs? Physical blade servers as nodes, with 144 GB of RAM each.
Zfsonlinux in use? No I haven't used zfs at all.
Hey... You're pretty cool. Thanks. You're not too bad yourself.
The answer should be "I wish I could say the same to you." I'm not like that.
Just how big is your HPC? Only 18 nodes :/ but it's more what I do with it...
How'd you get your nickname? Back when I played CS in the dorm freshman year of college, I used to get killed all the time. So I started calling myself "jack splat", as a play on the nursery rhyme (Jack Sprat), then shortened it to 'splat' on most of the websites I signed up for.
Describe a SHTF moment at your work place. I can imagine it must be highly stressful being the sole responsible person to keep all that gear running. I definitely have a few and luckily they aren't that bad. One of my first few months, I decided to connect this wireless ap to the network to test it out one morning. As I was being awesome managing the cable to make it look clean, one of the security guards came into the server room and said they had no internet. I looked at our switches and they were all lit up solid. By hooking up the ap, which had spanning tree turned on, I took down the network of the entire building.
Ouch...that's definitely a SHTF moment. glad you came out unscathed. Luckily, all I had to do was unplug it and everything went back to normal. I then set up a spare switch at my desk and played with it before figuring out that STP needed to be disabled on the AP. Now it's been running for over a year without incident.
Would you rather fight 100 duck sized horses or 1 horse sized duck? I'd go for the horse sized duck. Seems like more of a challenge.
U mad? Nah, I'm feeling pretty good today.
Last updated: 2013-08-18 07:16 UTC
This post was generated by a robot! Send all complaints to epsy.
submitted by tabledresser to tabled [link] [comments]


These cards were best for mining the Ethash algo (Ethereum, Ubiq), but now you can't buy them at a decent price. Prices have been hiked, and to buy one you now have to pay a 50% premium, which kills the deal. The other option available for mining is Nvidia cards: the GTX 1060, GTX 1070, GTX 1080 and GTX 1080 Ti.

They are not, at least not anymore. Bitcoin mining benefits a lot from multiple processing pathways; it's an "embarrassingly parallel" problem, where you just need to divide up the space you want to search, and then every worker can work independ...

Secondly, another difference favoring Bitcoin mining on AMD GPUs over Nvidia's is that the mining algorithm is based on SHA-256, which makes heavy use of the 32-bit integer right-rotate operation. This operation can be implemented as a single hardware instruction on AMD GPUs (BIT_ALIGN_INT), but requires three separate hardware instructions to be emulated on Nvidia GPUs (2 shifts + 1 add ...

The first reason AMD cards outperform their Nvidia counterparts in BTC mining (and the current Bitcoin entry does cover this) is that the SHA-256 algorithm utilizes a 32-bit integer right-rotate operation...

Does Bitcoin mining damage your GPU? Plugging in a smaller-gauge wire can actually cause an overdraw, which can potentially lead to the GPU burning out. Even though this scenario is highly unlikely, there have been some cases when thermal throttling simply didn't work. If you do not want your card to burn up, we also recommend lowering your power target.
