Bitcoin 11 Years - Achievements, Lies, and Bullshit Claims So Far - Tooootally NOT a SCAM !!!!
That's right folks, it's that time again for the annual review of how Bitcoin is going: all of those claims, predictions, promises ... how many have turned out to be true, and how many are completely bogus??? Please post / link this on r/Bitcoin (I am banned there for speaking the truth, so I cannot do it) ... because it's way past time those poor clueless mushrooms were exposed to the truth. Anyway, without further ado, I give you Bitcoin's Achievements, Lies, and Bullshit Claims So Far. Bitcoin Achievements so far:
It has spawned a cesspool of scams (2000+ shit coin scams, plus 100's of other scams, frauds, cons).
Many 1,000's of hacks, thefts, losses.
Illegal Use Cases: illegal drugs, illegal weapons, tax fraud, money laundering, sex trafficking, child pornography, hit men / murder-for-hire, ransomware, blackmail, extortion, and various other kinds of fraud and illicit activity.
Legal Use Cases: Steam games, Reddit, Expedia, Stripe, Starbucks, "1000's of merchants", cryptocurrency conferences ... ummm????? The few merchants who "accept Bitcoin" immediately convert it into FIAT after the sale, or require you to sell your coins to BitPay or Coinbase for real money, which they then take. Some of the few who actually accept bitcoin haven't seen a customer who needed to pay with bitcoin in the last six months, and their cashiers no longer know how to handle one.
Contributing significantly to Global Warming.
Wastes vast amounts of electricity on useless, do-nothing work.
Exponentially raises electricity prices when big miners move into regions where electricity was cheap.
It’s the first "currency" that is not self-sustainable. It operates at a net loss, and requires continuous outside capital to replace the capital removed by miners to pay their costs. It’s literally a "black hole currency."
It created a new way for people living too far from Vegas to gamble all their life savings away.
Spawned "blockchain technology", a powerful technique that lets incompetent programmers who know almost nothing about databases, finance, programming, or blockchain scam millions out of gullible VC investors, banks, and governments.
Increased China's foreign trade balance by a couple billion dollars per year.
Helped the FBI and other law enforcement agents easily track down hundreds of drug traffickers and drug users.
Wasted thousands if not millions of man-hours of government employees and legislators, in mostly fruitless attempts to understand, legitimize, and regulate the "phenomenon", and to investigate and prosecute its scams.
Rekindled the hopes of anarcho-capitalists and libertarians for a global economic collapse, that would finally bring forth their Mad Max "utopia".
Added another character to Unicode (no, no, not the "poo" 💩 character ... that was my first guess as well 🤣)
Provides an easy way for malware and ransomware criminals to ply their trade and extort hospitals, schools, local councils, businesses, utilities, as well as the general population.
"Bitcoin is striking fear into the hearts of bankers, precisely because Bitcoin eliminates the need for banks.", Mark Yusko, billionaire investor and Founder of Morgan Creek Capital, https://www.bitcoinprice.com/predictions/
"A bitcoin miner in every device and in every hand."
"All the indicators are pointing to a huge year and bigger than anything we have seen before."
"Bitcoin is communism and democracy working hand in hand."
"Bitcoin is freedom, and we will soon be free."
"Bitcoin isn't calculated risk, you're right. It's downright and painfully obvious that it will consume global finance."
"Bitcoin most disruptive technology of last 500 years"
"Bitcoin: So easy, your grandma can use it!"
"Creating a 4th Branch of Government - Bitcoin"
"Future generations will cry laughing reading all the negativity and insanity vomited by these permabears."
"Future us will thank us."
"Give Bitcoin two years"
"HODLING is more like being a dutiful guardian of the most powerful economic force this planet has ever seen and getting to have a say about how that force is unleashed."
"Cut out the middleman"
"full control of your own assets"
"reduction in wealth gap"
"cannot print money out of thin air"
"Why that matters? Because blockchain not only cheaper for them, it'll be cheaper for you and everyone as well."
"If you are in this to get rich in Fiat then no. But if you are in this to protect your wealth once the current monetary system collapse then you are protected and you'll be the new rich."
"Theres the 1% and then theres the 99%. You want to be with the rest thats fine. Being different and brave is far more rewarding. No matter your background or education."
"NO COINERS will believe anything they are fed by fake news and paid media."
"I know that feeling (like people looking at you as in seeing a celebrity and then asking things they don't believe until their impressed)."
"I literally walk round everyday looking at other people wondering why they even bother to live if they don't have Bitcoin in their lives."
"I think bitcoin may very well be the best form of money we’ve ever seen in the history of civilization."
"I think Bitcoin will do for mankind what the sun did for life on earth."
"I think the constant scams and illegal activities only show the viability of bitcoin."
"I think we're sitting on the verge of exponential interest in the currency."
"I'm not using hyperbole when I say Satoshi found the elusive key to World Peace."
"If Jesus ever comes back you know he's gonna be using Bitcoin"
"If this idea was implemented with The Blockchain™, it would be completely flawless! Flawless I tell you!"
"If you're the minimum wage guy type, now is a great time to skip food and go full ramadan in order to buy bitcoin instead."
"In a world slipping more and more into chaos and uncertainty, Bitcoin seems to me like the last solid rock defeating all the attacks."
"In this moment, I am euphoric. Not because of any filthy statist's blessing, but because I am enlightened by own intelligence."
"Is Bitcoin at this point, with all the potential that opens up, the most undervalued asset ever?"
"It won't be long until bitcoin is an everyday household term."
"It's the USD that is volatile. Bitcoin is the real neutral currency."
"Just like the early Internet!"
"Just like the Trojan Horse of old, Bitcoin will reveal its full power and nature"
"Ladies if your man doesnt have some bitcoin then he cant handle anything and has no danger sex appeal. He isnt edgy"
"let me be the first to say if you dont have bitcoin you are a pussy and cant really purchase anything worldwide. You have no global reach"
"My conclusion is that I see this a a very good thing for bitcoin and for users"
"No one would do such a thing; it'd be against their self interests."
"Ooh lala, good job on bashing Bitcoin. How to disrespect a great innovation."
"Realistically I think Bitcoin will replace the dollar in the next 10-15 years."
"Seperation of money and state -> states become obsolete -> world peace."
"Some striking similarities between Bitcoin and God"
"THANK YOU. Better for this child to be strangled in its crib as a true weapon for crypto-anarchists than for it to be wielded by toxic individuals who distort the technology and surrender it to government and corporate powers."
"The Blockchain is more encompassing than the internet and is the next phase in human evolution. To avoid its significance is complete ignorance."
"The bull run should begin any day now."
"The free market doesn't permit fraud and theft."
"The free market will clear away the bad actors."
"The only regulation we need is the blockchain."
"We are not your slaves! We are free bodies who will swallow you and puke you out in disgust. Welcome to liberty land or as that genius called it: Bitcoin."
"We do not need the bankers for Satoshi is our saviour!"
"We have never seen something so perfect"
"We must bring freedom and crypto to the masses, to the common man who does not know how to fight for himself."
"We verified that against the blockchain."
"we will see a Rennaisnce over the next few decades, all thanks to Bitcoin."
"Well, since 2006, there has been a infinite% increase in price, so..."
"What doesn't kill cryptocurrency makes it stronger."
"When Bitcoin awake in normally people (real people) ... you will have this result : No War. No Tax. No QE. No Bank."
"When I see news that the price of bitcoin has tanked (and thus the market, more or less) I actually, for-real, have the gut reaction "oh that’s cool, I’ll be buying cheap this week". I never knew I could be so rational."
"Where is your sense of adventure? Bitcoin is the future. Set aside your fears and leave easier at the doorstep."
"Yes Bitcoin will cause the greatest redistribution of wealth this planet has ever seen. FACT from the future."
"You are the true Bitcoin pioneers and with your help we have imprinted Bitcoin in the Canadian conscience."
"You ever try LSD? Perhaps it would help you break free from the box of state-formed thinking you have limited yourself..."
"Your phone or refrigerator might be on the blockchain one day."
The banks can print money whenever they want, out of thin air, so why can't crypto do the same ???
Central Banks can print money whenever they want, out of thin air, without any consequences or accounting, so why can't crypto do the same ???
It's impossible to hide illegal, unsavory material on the blockchain
It's impossible to hide child pornography on the blockchain
All Bitcoins are the same, 100% identical, one Bitcoin cannot be distinguished from any other Bitcoin.
The price of Bitcoin can only go up because of scarcity / the 21 million coin limit. (Bitcoin is open source, anyone can create their own copy, and there are more than 2,000 Bitcoin copies / clones out there already).
immune to government regulation
"a world-changing technology"
"a long-term store of value, like gold or silver"
"Too Complex to Be Audited."
"Old Auditing rules do not apply to Blockchain."
"Old Auditing rules do not apply to Cryptocurrency."
"Bitcoin now at $16,600.00. Those of you in the old school who believe this is a bubble simply have not understood the new mathematics of the Blockchain, or you did not cared enough to try. Bubbles are mathematically impossible in this new paradigm. So are corrections and all else", John McAfee, 7 Dec 2017 @ 5:09 PM, https://mobile.twitter.com/officialmcafee/status/938938539282190337
2013-11-27: ""What is a Citadel?" you might wonder. Well, by the time Bitcoin became worth 1,000 dollar [27-Nov-2013], services began to emerge for the "Bitcoin rich" to protect themselves as well as their wealth. It started with expensive safes, then began to include bodyguards, and today, "earlies" (our term for early adapters), as well as those rich whose wealth survived the "transition" live in isolated gated cities called Citadels, where most work is automated. Most such Citadels are born out of the fortification used to protect places where Bitcoin mining machines are located. The company known as ASICminer to you is known to me as a city where Mr. Friedman rules as a king.", u/Luka_Magnotta, aka time traveler from the future, 31-Aug-2013, https://www.reddit.com/Bitcoin/comments/1lfobc/i_am_a_timetraveler_from_the_future_here_to_beg/
2018-12: Listen up you giggling cunts... who wants some?... you? you want some?... huh? Do ya? Here's the deal you fuckin Nerds - Butts are gonna be at 30 grand or more by next Christmas - If they aren't I will publicly administer an electronic dick sucking to every shill on this site and disappear forever - Until then, no more bans or shadow bans - Do we have a deal? If Butts are over 50 grand me and Lammy get to be mods. Deal? Your ole pal - "Skully", u/10GDeathBoner, 3-Feb-2018, https://www.reddit.com/Buttcoin/comments/7ut1ut/listen_up_you_giggling_cunts_who_wants_someyou/
2018-12: "Bitcoin could be at $40,000 by the end of 2018, it really easily could", Mike Novogratz, a former Goldman Sachs Group Inc. partner, ex-hedge fund manager of the Fortress Investment Group and a longstanding advocate of cryptocurrency, 21-Sep-2018, https://www.youtube.com/watch?v=6lC1anDg2KU
2018-12: "Bitcoin will end 2018 at the price point of $50,000", Ran Neuner, host of CNBC's show Cryptotrader and the 28th most influential Blockchain insider according to Richtopia, https://www.bitcoinprice.com/predictions/
The date was June 10, 2018. The sun was shining, the grass was growing, and the birds were singing. At least, that's what I assumed. Being a video game and tech obsessed teenager, I was indoors, my eyes glued to my computer monitor like a starving lion spying on a plump gazelle. I was watching the E3 (Electronic Entertainment Expo) 2018 broadcast on Twitch, a popular streaming website. Video game developers use E3 as an annual opportunity to showcase upcoming projects to the public. So far, the turnout had been disappointing: for two straight hours, multiple developers failed to unveil anything of actual substance. A graphical update here, a bug fix there; issues that should have been fixed at each game's initial launch, not months after release. Feeling hopeless, I averted my eyes from my monitor to check Reddit (a social media website) for any forum posts I had yet to see. But then, I heard it. The sound of music composer Mick Gordon's take on the original "DooM" theme, the awesome combination of metal and electronic music. I looked up at my screen and gasped. Bethesda Softworks and id Software had just announced "DOOM: Eternal", the fifth installment in the "DooM" video game series. "DOOM: Eternal" creative director Hugo Martin promised that the game would feel more powerful than its 2016 predecessor, that there would be twice as many enemy types, and that the DooM community would finally get to see "hell on earth". (Martin) As a fan of "DOOM (2016)", I was ecstatic. The original "DooM" popularized the "First Person Shooter" (FPS) genre, and I couldn't wait to experience the most recent entry in the series. "DOOM (1993)" was a graphical landmark when it originally released, yet nowadays it looks extremely dated, especially compared to "DOOM: Eternal". What advancements in computer technology enabled this graphical change?
Computers became faster, digital storage increased, and computer peripherals became able to display higher resolutions and refresh rates. "DooM" (1993) graphics example: [screenshot] (Doom | Doom Wiki). "DOOM: Eternal" graphics example: [screenshot] (Bailey). In their video "Evolution Of DOOM", the video game YouTube channel "gameranx" says that on December 10, 1993, a file titled "DOOM1_0.zip" was uploaded to the File Transfer Protocol (FTP) server of the University of Wisconsin. This file, two megabytes in size, contained the video game "DooM", created by the game development group "id Software". (Evolution of DOOM) While not the first game in the "First Person Shooter" (FPS) genre, "DooM" popularized it, to the point that any other FPS game was referred to as a "Doom Clone" until the late 1990s. (Doom clones | Doom Wiki) The graphics of the original "DooM" are definitely a major downgrade compared to today's graphical standards, but keep in mind that the minimum system requirements of "DooM", according to the article "Doom System Requirements" on gamesystemrequirements.com, were eight megabytes of RAM, an Intel Pentium or AMD (Advanced Micro Devices) Athlon 486 processor cycling at sixty-six megahertz or more, and an operating system of Windows 95 or above. (Doom System Requirements) In case you don't speak the language of technology (although I hope you learn a thing or two by the end of this essay), that speed and storage capacity are laughable compared to the specifications of today. By 1993, the microprocessor, or CPU (Central Processing Unit), had been around for twenty-two years, since its introduction in 1971, thanks in part to Robert Noyce and Gordon Moore, who were also the founders of CPU manufacturer "Intel". Gordon Moore also formulated "Moore's law", which states: "The number of transistors incorporated in a chip will approximately double every 24 months".
(Moore) Sadly, according to writer and computer builder Steve Blank in his article "The End of More - The Death of Moore's Law", this law would break down around 2005, thanks to the basic laws of physics. (Blank) 1993 also marked an important landmark for Intel, which had just released the first "Pentium" processor, capable of a base clock of 60 MHz (megahertz). The term "base clock" refers to the default speed of a CPU; this speed can be adjusted to the user's specifications, and "MHz" refers to one million cycles per second. A cycle is one tick of the processor's internal clock, during which the CPU carries out part or all of an instruction; the more cycles per second the CPU runs at, the more work gets done. Intel would continue upgrading its "Pentium" lineup until January 4, 2000, when it released a "Celeron" processor with a base clock of 533 MHz. Soon after, on June 19, 2000, rival CPU company AMD released its "Duron" processor, which had a base clock of 600 MHz, with a maximum clock of 1.8 GHz (gigahertz). One GHz is equal to 1,000 MHz. Intel and AMD had established themselves as the two major CPU companies in the 1970s in Silicon Valley, and the two have been bitter rivals ever since, trading figurative blows in the form of competitive releases, discounts, and one-upmanship to this day. Moving on to April 21, 2005, when AMD released the first dual-core CPU, the "Athlon 64 X2 3800+". The notable feature of this CPU, besides its 2.0 GHz base clock (the "3800+" is a performance rating, not a clock speed), was that it was the first CPU to have two cores. A CPU core is an individual processor within the CPU; the more cores a CPU has, the more tasks it can perform per cycle, thus maximizing its efficiency. Intel wouldn't respond until January 9, 2006, when it released its dual-core processor, the "Core 2 Duo Processor E6320", with a base clock of 1.86 GHz.
(Computer Processor History) According to tech entrepreneur Linus Sebastian in his YouTube videos "10 Years of Gaming PCs: 2009 - 2014 (Part 1)" and "10 Years of Gaming PCs: 2015 - 2019 (Part 2)", AMD had the upper hand over Intel until 2011, when Intel released the "Sandy Bridge" CPU microarchitecture, which was faster than, and around the same price as, AMD's competing products. (Sebastian) The article "What is Microarchitecture?" on the website Computer Hope defines microarchitecture as "a hardware implementation of an ISA (instruction set architecture). An ISA is a structure of commands and operations used by software to communicate with hardware. A microarchitecture is the hardware circuitry that implements one particular ISA". (What is Microarchitecture?) A CPU's microarchitecture is also referred to as its generation. Intel would continue to dominate the high-end CPU market until 2019, when AMD "dethroned" Intel with its third-generation "Ryzen" CPU lineup, the most notable of which was the "Ryzen 3950X", with a total of sixteen cores, thirty-two threads, a base clock of 3.5 GHz, and a maximum clock of 4.7 GHz. (Sebastian) The term "thread" refers to splitting one core into virtual cores via a process known as "simultaneous multithreading", which allows one core to perform two tasks at once. What CPU your computer has is extremely influential in how fast it can run, but for video games and other graphics there is a special processor designed specifically for the task of "rendering" (displaying) and generating graphics: the graphics processing unit, or "GPU". The term "GPU" wasn't used until around 1999, when video cards started to evolve beyond generating two-dimensional graphics and into generating three-dimensional graphics.
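The CPU terminology in the last few paragraphs reduces to simple arithmetic. Here is a rough Python sketch (the figures are the ones quoted in the essay, and the doubling model is just the plain reading of Moore's quoted law, not a precise fit to real transistor counts):

```python
# Rough sketch of the CPU arithmetic discussed above.

MHZ = 1_000_000          # one megahertz = one million cycles per second
GHZ = 1_000 * MHZ        # one gigahertz = 1,000 MHz

def transistors_after(years, start_count, doubling_period_years=2):
    """Project a transistor count forward under Moore's law
    (doubling roughly every 24 months)."""
    return start_count * 2 ** (years / doubling_period_years)

# 22 years of doubling (1971 -> 1993) multiplies the count by 2**11 = 2048.
growth_factor = transistors_after(22, 1)

# The 1993 Pentium's 60 MHz base clock, in cycles per second:
pentium_cycles = 60 * MHZ

# The Ryzen 3950X: 16 physical cores, each split into two virtual cores
# ("threads") by simultaneous multithreading.
physical_cores = 16
threads_per_core = 2
logical_processors = physical_cores * threads_per_core

print(growth_factor)        # 2048.0
print(pentium_cycles)       # 60000000
print(logical_processors)   # 32
```

The point of the sketch is only the units: a "3.5 GHz" chip ticks 3.5 billion times a second, and a "16-core, 32-thread" chip exposes 32 logical processors to the operating system.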
According to user "Olena" in their article "A Brief History of GPU", the first GPU was the "GeForce 256", created by GPU company "Nvidia" in 1999. Nvidia promoted the GeForce 256 as "a single-chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines that is capable of processing a minimum of 10 million polygons per second". (Olena) Unlike the evolution of CPUs, the history of GPUs is more one-sided, with AMD playing a game of catch-up ever since Nvidia overtook it in the high-end GPU market in 2013. (Sebastian) Fun fact: GPUs aren't used only for gaming! In 2010, Nvidia collaborated with Audi to power the dashboards and improve the entertainment and navigation systems in Audi's cars! (Olena) Much to my dismay (and that of many other tech enthusiasts), GPUs would increase dramatically in price thanks to the "bitcoin mania" around 2017. According to senior editor Tom Warren in his article "Bitcoin Mania is Hurting PC Gamers By Pushing Up GPU Prices" on theverge.com, this amounted to around an 80% price increase for the same GPU due to stock shortages. (Warren) Just for context, Nvidia's "flagship" GPU in 2017 was the 1080 Ti, the finest card of the "Pascal" microarchitecture. Fun fact: I have this card. The 1080 Ti launched for $699, with a base clock of 1,481 MHz, a maximum clock of 1,582 MHz, and 11 gigabytes of GDDR5X VRAM (memory that is exclusive to the GPU), according to the box it came in. Compare this to Nvidia's most recent flagship GPU, the 2080 Ti of Nvidia's follow-up "Turing" microarchitecture, another card I have. This GPU launched in 2018 for $1,199. The 2080 Ti's specifications, according to the box it came in, included a base clock of 1,350 MHz, a maximum clock of 1,545 MHz, and 11 gigabytes of GDDR6 VRAM.
A major reason why "DooM" was so popular and ingenious was how id Software developer John Carmack managed to "fake" three-dimensional graphics without taking up too much processing power, hard-drive space, or "RAM" (Random Access Memory), a specific type of digital storage. According to the article "RAM (Random Access Memory) Definition" on the website TechTerms, RAM is known as "volatile" memory because, unlike normal storage (which at the time took the form of hard-drive space), it only holds data while the computer is turned on; it is also much faster than normal storage. A commonly used analogy is that RAM is the computer's short-term memory, storing temporary files to be used by programs, while hard-drive storage is the computer's long-term memory. (RAM (Random Access Memory) Definition) As I stated earlier, in 1993 "DooM" required 8 megabytes of RAM to run. For some context, as of 2020 "DOOM: Eternal" requires a minimum of 8 gigabytes of DDR4 (more on this later) RAM to run, with most gaming machines possessing 16 gigabytes of DDR4 RAM. According to tech journalist Scott Thornton in his article "What is DDR (Double Data Rate) Memory and SDRAM Memory", in 1993 the popular format of RAM was "SDRAM", which stands for "Synchronous Dynamic Random Access Memory". SDRAM differs from its predecessor, "DRAM" (Dynamic Random Access Memory), by being synchronized with the clock speed of the CPU. DRAM was asynchronous (not synchronized by any external influence), which "posted a problem in organizing data as it comes in so it can be queued for the process it's associated with". SDRAM was able to transfer data one time per clock cycle, and its replacement in the early 2000s, "DDR SDRAM" (Double Data Rate Synchronous Dynamic Random Access Memory), was able to transfer data two times per clock cycle. This evolution of RAM continues to this day. In 2003, DDR2 SDRAM was released, able to transfer four pieces of data per clock cycle.
In 2007, DDR3 SDRAM became able to transfer eight pieces of data per clock cycle. In 2014, DDR4 SDRAM was still able to transfer eight pieces of data per cycle, but the clock speed had increased by 600 MHz, and the overall power consumption had been reduced from 3.3 volts for the original SDRAM to 1.2 volts for DDR4. (Thornton) The digital size of each "RAM stick" (a physical stick of RAM that you insert into your computer) had also increased, from around two megabytes per stick to up to 128 gigabytes per stick in 2020 (although that particular option will cost you around $1,000 per stick, depending on the manufacturer); the average stick size is 8 gigabytes. The average computer nowadays accepts up to four RAM sticks, although more high-end systems can accept up to sixteen or even thirty-two! Rewind back to 1993, when the original "DooM" took up two megabytes of storage, not to be confused with RAM. According to tech enthusiast Rex Farrance in their article "Timeline: 50 Years of Hard Drives", the average computer at this time had around two gigabytes of storage. Storage took the form of magneto-optical discs, a combination of the earlier magnetic discs and optical discs. (Farrance) This format of storage is still in use today, although mainly for large amounts of rarely used data, while data that is commonly used by programs (including the operating system) is put on solid-state drives, or SSDs. According to tech journalist Keith Foote in their article "A Brief History of Data Storage", SSDs differ from HDDs by being much faster and smaller, storing data on a flash memory chip, not unlike a USB thumb drive. While SSDs had been used as far back as 1950, they wouldn't find their way into the average gaming machine until the early 2010s. (Foote) A way to think about an SSD is as your common knowledge: it doesn't contain every piece of information you know, just what you use on a daily basis.
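The RAM generations just described all follow one formula: effective transfers per second = clock rate × transfers per clock cycle. A minimal Python sketch (the per-cycle figures are the ones given above from Thornton; the 100 MHz example clock is illustrative, not from the source):

```python
# Transfers per clock cycle for each RAM generation named above.
TRANSFERS_PER_CYCLE = {
    "SDRAM": 1,   # one transfer per cycle
    "DDR":   2,   # "double data rate": two per cycle
    "DDR2":  4,
    "DDR3":  8,
    "DDR4":  8,   # same per-cycle rate, but higher clocks and lower voltage
}

def transfers_per_second(generation: str, clock_hz: int) -> int:
    """Effective transfer rate = clock rate x transfers per clock cycle."""
    return clock_hz * TRANSFERS_PER_CYCLE[generation]

# At the same illustrative 100 MHz clock, DDR moves twice as much data
# per second as SDRAM, and DDR3/DDR4 move eight times as much.
clock = 100_000_000
print(transfers_per_second("SDRAM", clock))  # 100000000
print(transfers_per_second("DDR", clock))    # 200000000
print(transfers_per_second("DDR4", clock))   # 800000000
```

This is why DDR4's gains came from raising the clock and lowering the voltage rather than from more transfers per cycle, as the paragraph above notes.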
For example, my computer has around 750 gigabytes of storage in SSDs and around two terabytes of internal HDD storage. On my SSDs I have my operating system, my favorite programs and games, and any files I use frequently; on my HDD I have everything else that I don't use on a regular basis. "DOOM: Eternal" would release on March 20, 2020, four months after its original release date of November 22, 2019. And let me tell you, I was excited. The second my clock turned from 11:59 P.M. to 12:00 A.M., I repeatedly clicked my refresh button, desperately waiting to see the words "Coming March 20" transform into the ever so beautiful and elegant phrase: "Download Now". At this point in time, I had a monitor capable of displaying roughly two million pixels spread out over its 27-inch display panel, at a rate of 240 times a second. Speaking of monitors and displays: according to the article "The Evolution of the Monitor" on the website PCR, at the time of the original "DooM" release, the average monitor was either a CRT (cathode ray tube) monitor or the newer (and more expensive) LCD (liquid crystal display) monitor. The CRT was first unveiled in 1897 by the German physicist Karl Ferdinand Braun. CRT monitors functioned by using cathode ray tubes to generate an image on a phosphorescent screen. These monitors would have an average resolution of 800 by 600 pixels and a refresh rate of around 30 frames per second. CRT monitors would eventually be replaced by LCD monitors in the late 2000s. LCD monitors function by using two pieces of polarized glass with liquid crystal between them: a backlight shines through the first piece of polarized glass (also known as a substrate), and electrical currents cause the liquid crystals to adjust how much light passes through to the second substrate, which creates the images that are displayed.
(The Evolution of the Monitor) The average resolution would increase to 1920 by 1080 pixels and the refresh rate to 60 frames a second around 2010. Nowadays, there are high-end monitors capable of displaying up to 7,680 by 4,320 pixels, and monitors capable of displaying up to 360 frames per second, assuming you have around $1,000 lying around. At long last, it had finished. My 40.02-gigabyte download of "DOOM: Eternal" had finally completed, and oh boy, I was ready to experience this. I ran over to my computer, my beautiful creation sporting 32 gigabytes of DDR4 RAM, an AMD Ryzen 7 "3800X" with a base clock of 3.8 GHz, an Nvidia 2080 Ti, 750 gigabytes of SSD storage, and two terabytes of HDD storage. Finally, after two years of waiting, I grabbed my mouse and moved my cursor over that gorgeous button titled "Launch DOOM: Eternal". Thanks to multiple advancements in the speed of CPUs, the size of RAM and storage, and display resolution and refresh rate, "DooM" has evolved from an archaic, pixelated video game in 1993 into the beautiful, realistic, and smooth video game it is today. And personally, I can't wait to see what the future has in store for us.
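As a closing aside, the display figures woven through this essay (roughly two million pixels at 240 times a second, 1920 by 1080 at 60 frames, 7,680 by 4,320 panels) all come from one multiplication: pixels per second = width × height × refresh rate. A quick sketch, using only the resolutions and refresh rates quoted above:

```python
# Pixels-per-second arithmetic behind the monitor specs quoted above.

def pixels_per_frame(width: int, height: int) -> int:
    return width * height

def pixels_per_second(width: int, height: int, refresh_hz: int) -> int:
    return pixels_per_frame(width, height) * refresh_hz

# A 1920x1080 panel holds 2,073,600 pixels -- the "roughly two million"
# figure -- and at 240 Hz it redraws nearly half a billion pixels a second.
print(pixels_per_frame(1920, 1080))        # 2073600
print(pixels_per_second(1920, 1080, 240))  # 497664000

# The 7,680 by 4,320 ("8K") high-end panel mentioned above:
print(pixels_per_frame(7680, 4320))        # 33177600
```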
Georgia Tombstones (Part 2) by Jayge 8^J "Project Blue Beam is a conspiracy theory that claims that NASA is attempting to implement a New Age religion with the Antichrist at its head and start a New World Order, via a technologically-simulated Second Coming. The allegations were presented in 1994 by Quebecois journalist and conspiracy theorist Serge Monast, and later published in his book Project Blue Beam (NASA). Proponents of the theory allege that Monast and another unnamed journalist, who both died of heart attacks in 1996, were in fact assassinated, and that the Canadian government kidnapped Monast's daughter in an effort to dissuade him from investigating Project Blue Beam. The project was apparently supposed to be implemented in 1983, but it didn't happen. It was then set for implementation in 1995 and then 1996. Monast thought Project Blue Beam would be brought to fruition by the year 2000, really, definitely, for sure. The theory is widely popular (for a conspiracy theory) on the Internet, with many web pages dedicated to the subject, and countless YouTube videos explaining it. The actual source material, however, is very thin indeed. Monast lectured on the theory in the mid-1990s (a transcript of one such lecture is widely available), before writing and publishing his book, which has not been reissued by his current publisher and is all but unobtainable. The currently available pages and videos all appear to trace back to four documents: A transcript of the 1994 lecture by Monast, translated into English. A GeoCities page written by David Openheimer and which appears to draw on the original book. A page on educate-yourself.org, compiled in 2005, which appears to include a translation of the book from the French. Monast's page in French Wikipedia. The French Wikipedia article is largely sourced from two books on conspiracy theories and extremism by Pierre-André Taguieff, a mainstream academic expert on racist and extremist groups. 
From these few texts have come a flood of green ink, in text and video form, in several languages. Even the French language material typically does not cite the original book but the English language pages on educate-yourself.org. However, conspiracy theorists seem to use quantity as a measure of substance (much as alternative medicine uses appeal to tradition) and never mind the extremely few sources it all traces back to. Proponents of the theory have extrapolated it to embrace HAARP, 9/11, the Norwegian Spiral, chemtrails, FEMA concentration camps and Tupac Shakur. Everything is part of Project Blue Beam. It's well on its way to becoming the Unified Conspiracy Theory. Behold A Pale Horse, William Cooper's 1991 green ink magnum opus, has lately been considered a prior claim of, hence supporting evidence for, Blue Beam by advocates. The book is where a vast quantity of now-common conspiracy memes actually came from, so retrospectively claiming it as prior evidence is somewhere between cherrypicking and the Texas sharpshooter fallacy. However, the following quotes, from pages 180-181, intersect slightly with the specific themes of Blue Beam: It is true that without the population or the bomb problem the elect would use some other excuse to bring about the New World Order. They have plans to bring about things like earthquakes, war, the Messiah, an extra-terrestrial landing, and economic collapse. They might bring about all of these things just to make damn sure that it does work. They will do whatever is necessary to succeed. The Illuminati has all the bases covered and you are going to have to be on your toes to make it through the coming years. 
Can you imagine what will happen if Los Angeles is hit with a 9.0 quake, New York City is destroyed by a terrorist-planted atomic bomb, World War III breaks out in the Middle East, the banks and the stock markets collapse, Extraterrestrials land on the White House lawn, food disappears from the markets, some people disappear, the Messiah presents himself to the world, and all in a very short period of time? Can you imagine? The world power structure can, and will if necessary, make some or all of those things happen to bring about the New World Order. “Without a universal belief in the new age religion, the success of the new world order will be impossible!” The alleged purpose of Project Blue Beam is to bring about a global New Age religion, which is seen as a core requirement for the New World Order's dictatorship to be realised. There's nothing new in thinking of religion as a form of control, but the existence of multiple religions, spin-off cults, competing sects and atheists suggest that controlling the population entirely through a single religion isn't particularly easy. Past attempts have required mechanisms of totalitarianism such as the Inquisition. Monast's theory, however, suggests using sufficiently advanced technology to trick people into believing. Of course, the plan would have to assume that people could never fathom the trick at all — something contested by anyone sane enough not to swallow this particular conspiracy. The primary claimed perpetrator of Project Blue Beam is NASA, presented as a large and mostly faceless organization that can readily absorb such frankly odd accusations, aided by the United Nations, another old-time boogeyman of conspiracy theorists. According to Monast, the project has four steps: Step One requires the breakdown of all archaeological knowledge. This will apparently be accomplished by faking earthquakes at precise locations around the planet. 
Fake "new discoveries" at these locations "will finally explain to all people the error of all fundamental religious doctrines", specifically Christian and Muslim doctrines. This makes some degree of sense — if you want to usurp a current way of thinking you need to completely destroy it before putting forward your own. However, religious belief is notoriously resilient to things like facts. The Shroud of Turin is a famous example that is still believed by many to be a genuine shroud of Jesus as opposed to the medieval forgery that it has been conclusively shown to be. Prayer studies, too, show how difficult it is to shift religious conviction with mere observational fact — indeed, many theologians avoid making falsifiable claims or place belief somewhere specifically beyond observation to aid this. So what finds could possibly fundamentally destroy both Christianity and Islam, almost overnight, and universally all over the globe? Probably nothing. Yet, this is only step one of an increasingly ludicrous set of events that Project Blue Beam predicts will occur. Step Two involves a gigantic "space show" wherein three-dimensional holographic laser projections will be beamed all over the planet — and this is where Blue Beam really takes off. The projections will take the shape of whatever deity is most predominant, and will speak in all languages. At the end of this light show, the gods will all merge into one god, the Antichrist. This is a rather baffling plan as it seems to assume people will think this is actually their god, rather than the more natural twenty-first century assumption that it is a particularly opaque Coca Cola advertisement. Evidence commonly advanced for this is a supposed plan to project the face of Allah, despite its contradiction with Muslim belief of God's uniqueness, over Baghdad in 1991 to tell the Iraqis to overthrow Saddam Hussein. 
Someone, somewhere, must have thought those primitive, ignorant non-Western savages wouldn't have had television or advertising, and would never guess it was being done with mirrors. In general, pretty much anything that either a) involves light or b) has been seen in the sky has been put forward as evidence that Project Blue Beam is real, and such things are "tests" of the technology — namely unidentified flying objects. Existing display technology such as 3D projection mapping and holograms are put forward as foreshadowing the great light show in the sky. This stage will apparently be accomplished with the aid of a Soviet computer that will be fed "with the minute physio-psychological particulars based on their studies of the anatomy and electro-mechanical composition of the human body, and the studies of the electrical, chemical and biological properties of the human brain", and every human has been allocated a unique radio wavelength. The computers are also capable of inducing suicidal thoughts. The Soviets are (not "were") the "New World Order" people. Why NASA would use a Soviet computer when the USSR had to import or copy much of its computer technology from the West is not detailed. The second part of Step Two happens when the holograms result in the dissolution of social and religious order, "setting loose millions of programmed religious fanatics through demonic possession on a scale never witnessed before." The United Nations plans to use Beethoven's "Ode to Joy" as the anthem for the introduction of the new age one world religion. There is relatively little to debunk in this, the most widely remembered section of the Project Blue Beam conspiracy, as the idea is so infeasible. Citing actual existing communication technology is odd if the point is for the end product to appear magical, rather than just as cheap laser projections onto clouds. This hasn't stopped some very strange conspiracy theories about such things popping up. 
Indeed, the notion of gods being projected into the sky was floated in 1991 by conspiracy theorist Betty J. Mills. And US general (and CIA shyster extraordinaire), Edward Lansdale, actually floated a plan to fake a Second Coming over Cuba to get rid of Castro. Step Three is "Telepathic Electronic Two-Way Communication." It involves making people think their god is speaking to them through telepathy, projected into the head of each person individually using extreme low frequency radio waves. (Atheists will presumably hear an absence of Richard Dawkins.) The book goes to some lengths to describe how this would be feasible, including a claim that ELF thought projection caused the depressive illness of Michael Dukakis' wife Kitty. Step Four has three parts: Making humanity think an alien invasion is about to occur at every major city; Making the Christians think the Rapture is about to happen; A mixture of electronic and supernatural forces, allowing the supernatural forces to travel through fiber optics, coax, power and telephone lines to penetrate all electronic equipment and appliances, that will by then all have a special microchip installed. Then chaos will break out, and people will finally be willing — perhaps even desperate — to accept the New World Order. "The techniques used in the fourth step is exactly the same used in the past in the USSR to force the people to accept Communism." A device has apparently already been perfected that will lift enormous numbers of people, as in a Rapture. UFO abductions are tests of this device. Project Blue Beam proponents believe psychological preparations have already been made, Monast having claimed that 2001: A Space Odyssey, Star Wars and the Star Trek series all involve an invasion from space and all nations coming together (the first two don't, the third is peaceful contact) and that Jurassic Park propagandises evolution in order to make people think God's words are lies. The book detailed the theory. 
In the 1994 lecture, Monast detailed what would happen afterwards. All people will be required to take an oath to Lucifer with a ritual initiation to enter the New World Order. Resisters will be categorised as follows: Christian children will be kept for human sacrifice or sexual slaves. Prisoners to be used in medical experiments. Prisoners to be used as living organ banks. Healthy workers in slave labour camps. Uncertain prisoners in the international re-education center, thence to repent on television and learn to glorify the New World Order. The international execution centre. An as yet unknown seventh classification. Joel Engel's book Gene Roddenberry: The Myth and the Man Behind Star Trek was released in 1994, shortly before Monast's lecture on Project Blue Beam: “In May 1975, Gene Roddenberry accepted an offer from Paramount to develop Star Trek into a feature film, and moved back into his old office on the Paramount lot. His proposed story told of a flying saucer, hovering above Earth, that was programmed to send down people who looked like prophets, including Jesus Christ.” All the steps of the conspiracy theory were in the unmade mid-'70s Star Trek film script by Roddenberry, which were recycled for the Star Trek: The Next Generation episode Devil's Due, broadcast in 1991. There is no evidence of deliberate fraud on Monast's part; given his head was quite thoroughly full of squirrels and confetti by this time, it's entirely plausible that he thought this was the revelation of secret information in a guise safe for propagation. However, the actual source was so obvious that even other conspiracy theorists noticed. They confidently state it was obvious that Monast had been fed deceptive information by the CIA. Of course!" -- rationalwiki.org "Serge Monast was a Québécois investigative journalist, poet, essayist and conspiracy theorist. He is known to English-speaking readers mainly for Project Blue Beam and associated conspiracy tropes. 
His works on Masonic conspiracy theories and the New World Order also remain popular with French-speaking conspiracy theorists and enthusiasts." -- Wikipedia "A human microchip implant is typically an identifying integrated circuit device or RFID transponder encased in silicate glass and implanted in the body of a human being. This type of subdermal implant usually contains a unique ID number that can be linked to information contained in an external database, such as personal identification, law enforcement, medical history, medications, allergies, and contact information. The first experiments with an RFID implant were carried out in 1998 by the British scientist Kevin Warwick. His implant was used to open doors, switch on lights, and cause verbal output within a building. After nine days the implant was removed and has since been held in the Science Museum (London). On 16 March 2009 British scientist Mark Gasson had an advanced glass capsule RFID device surgically implanted into his left hand. In April 2010 Gasson's team demonstrated how a computer virus could wirelessly infect his implant and then be transmitted on to other systems. Gasson reasoned that with implanted technology the separation between man and machine can become theoretical because the technology can be perceived by the human as being a part of their body. Because of this development in our understanding of what constitutes our body and its boundaries he became credited as being the first human infected by a computer virus. He has no plans to remove his implant. Several hobbyists have placed RFID microchip implants into their hands or had them inserted by others. Amal Graafstra, author of the book RFID Toys, asked doctors to place implants in his hands in March 2005. A cosmetic surgeon used a scalpel to place a microchip in his left hand, and his family doctor injected a chip into his right hand using a veterinary Avid injector kit. 
Graafstra uses the implants to access his home, open car doors, and to log on to his computer. With public interest growing, in 2013 he launched biohacking company Dangerous Things and crowdfunded the world's first implantable NFC transponder in 2014. He has also spoken at various events and promotional gigs including TEDx, and built a smartgun that only fires after reading his implant. Alejandro Hernandez CEO of Futura is known to be the first in Central America to have Dangerous Things' transponder installed in his left hand by Federico Cortes in November 2017. Mikey Sklar had a chip implanted into his left hand and filmed the procedure. Jonathan Oxer self-implanted an RFID chip in his arm using a veterinary implantation tool. Martijn Wismeijer, Dutch marketing manager for Bitcoin ATM manufacturer General Bytes, placed RFID chips in both of his hands to store his Bitcoin private keys and business card. Patric Lanhed sent a “bio-payment” of one euro worth of Bitcoin using a chip embedded in his hand. Marcel Varallo had an NXP chip coated in Bioglass 8625 inserted into his hand between his forefinger and thumb allowing him to open secure elevators and doors at work, print from secure printers, unlock his mobile phone and home, and store his digital business card for transfer to mobile phones enabled for NFC. Biohacker Hannes Sjöblad has been experimenting with NFC (Near Field Communication) chip implants since 2015. During his talk at Echappée Voléé 2016 in Paris, Sjöblad disclosed that he has also implanted himself between his forefinger and thumb and uses it to unlock doors, make payments, and unlock his phone (essentially replacing anything you can put in your pockets). Additionally, Sjöblad has hosted several "implant parties," where interested individuals can also be implanted with the chip. 
Researchers have examined microchip implants in humans in the medical field, finding both potential benefits and risks; for example, an implant could be beneficial for noncompliant patients but still poses great risks for potential misuse of the device. Destron Fearing, a subsidiary of Digital Angel, initially developed the technology for the VeriChip. In 2004, the VeriChip implanted device and reader were classified as Class II: General controls with special controls by the FDA; that year the FDA also published a draft guidance describing the special controls required to market such devices. About the size of a grain of rice, the device was typically implanted between the shoulder and elbow area of an individual’s right arm. Once scanned at the proper frequency, the chip responded with a unique 16-digit number which could then be linked with information about the user held on a database for identity verification, medical records access and other uses. The insertion procedure was performed under local anesthetic in a physician's office. Privacy advocates raised concerns regarding potential abuse of the chip, with some warning that adoption by governments as a compulsory identification program could lead to erosion of civil liberties, as well as identity theft if the device should be hacked. Another ethical dilemma posed by the technology is that people with dementia could possibly benefit the most from an implanted device that contained their medical records, but issues of informed consent are the most difficult in precisely such people. 
In June 2007, the American Medical Association declared that "implantable radio frequency identification (RFID) devices may help to identify patients, thereby improving the safety and efficiency of patient care, and may be used to enable secure access to patient clinical information", but in the same year, news reports linking similar devices to cancer caused in laboratory animals had a devastating impact on the company's stock price and sales. In 2010, the company, by then called "PositiveID", withdrew the product from the market due to poor sales. In January 2012, PositiveID sold the chip assets to a company called VeriTeQ that was owned by Scott Silverman, the former CEO of PositiveID. In 2016, JAMM Technologies acquired the chip assets from VeriTeQ; JAMM's business plan was to partner with companies selling implanted medical devices and use the RFID tags to monitor and identify the devices. JAMM Technologies is co-located in the same Plymouth, Minnesota building as Geissler Corporation, with Randolph K. Geissler and Donald R. Brattain listed as its principals. The website also claims that Geissler was CEO of PositiveID Corporation, Destron Fearing Corporation, and Digital Angel Corporation. In 2018, a Danish firm called BiChip released a new generation of microchip implant intended to be readable from a distance and connected to the Internet. The company released an update for its microchip implant to associate it with the Ripple cryptocurrency to allow payments to be made using the implanted microchip. In February 2006, CityWatcher, Inc. of Cincinnati, OH became the first company in the world to implant microchips into their employees as part of their building access control and security system. The workers needed the implants to access the company's secure video tape room, as documented in USA Today. The project was initiated and implemented by Six Sigma Security, Inc. 
The VeriChip Corporation had originally marketed the implant as a way to restrict access to secure facilities such as power plants. A major drawback for such systems is the relative ease with which the 16-digit ID number contained in a chip implant can be obtained and cloned using a hand-held device, a problem that has been demonstrated publicly by security researcher Jonathan Westhues and documented in the May 2006 issue of Wired magazine, among other places. The Baja Beach Club, a nightclub in Rotterdam, the Netherlands, once used VeriChip implants for identifying VIP guests. The Epicenter in Stockholm, Sweden is using RFID implants for employees to operate security doors, copiers, and pay for lunch. In 2017 Mike Miller, chief executive of the World Olympians Association, was widely reported as suggesting the use of such implants in athletes in an attempt to reduce problems in sport due to drug taking. Theoretically, a GPS-enabled chip could one day make it possible for individuals to be physically located by latitude, longitude, altitude, speed, and direction of movement. Such implantable GPS devices are not technically feasible at this time. However, if widely deployed at some future point, implantable GPS devices could conceivably allow authorities to locate missing persons and/or fugitives and those who fled from a crime scene. Critics contend, however, that the technology could lead to political repression as governments could use implants to track and persecute human rights activists, labor activists, civil dissidents, and political opponents; criminals and domestic abusers could use them to stalk and harass their victims; and child abusers could use them to locate and abduct children. Another suggested application for a tracking implant, discussed in 2008 by the legislature of Indonesia's Irian Jaya, would be to monitor the activities of persons infected with HIV, aimed at reducing their chances of infecting other people. 
The microchipping section was not, however, included into the final version of the provincial HIV/AIDS Handling bylaw passed by the legislature in December 2008. With current technology, this would not be workable anyway, since there is no implantable device on the market with GPS tracking capability. Since modern payment methods rely upon RFID/NFC, it is thought that implantable microchips, if they were to ever become popular in use, would form a part of the cashless society. Verichip implants have already been used in nightclubs such as the Baja club for such a purpose, allowing patrons to purchase drinks with their implantable microchip. In a self-published report anti-RFID advocate Katherine Albrecht, who refers to RFID devices as "spy chips", cites veterinary and toxicological studies carried out from 1996 to 2006 which found lab rodents injected with microchips as an incidental part of unrelated experiments and dogs implanted with identification microchips sometimes developed cancerous tumors at the injection site (subcutaneous sarcomas) as evidence of a human implantation risk. However, the link between foreign-body tumorigenesis in lab animals and implantation in humans has been publicly refuted as erroneous and misleading and the report's author has been criticized over the use of "provocative" language "not based in scientific fact". Notably, none of the studies cited specifically set out to investigate the cancer risk of implanted microchips and so none of the studies had a control group of animals that did not get implanted. While the issue is considered worthy of further investigation, one of the studies cited cautioned "Blind leaps from the detection of tumors to the prediction of human health risk should be avoided". 
The Council on Ethical and Judicial Affairs (CEJA) of the American Medical Association published a report in 2007 alleging that RFID implanted chips may compromise privacy because there is no assurance that the information contained in the chip can be properly protected. Following Wisconsin and North Dakota, California issued Senate Bill 362 in 2007, which makes it illegal to force a person to have a microchip implanted and provides for an assessment of civil penalties against violators of the bill. In 2008, Oklahoma passed 63 OK Stat § 63-1-1430 (2008 S.B. 47), which bans involuntary microchip implants in humans. On April 5, 2010, the Georgia Senate passed Senate Bill 235, which prohibits forced microchip implants in humans and would make it a misdemeanor for anyone to require them, including employers. The bill would allow voluntary microchip implants, as long as they are performed by a physician and regulated by the Georgia Composite Medical Board. The state's House of Representatives did not take up the measure. On February 10, 2010, Virginia's House of Delegates also passed a bill that forbids companies from forcing their employees to be implanted with tracking devices. Washington State House Bill 1142-2009-10 orders a study using implanted radio frequency identification or other similar technology to electronically monitor sex offenders and other felons. The general public are most familiar with microchips in the context of tracking their pets. In the U.S., some Christian activists, including conspiracy theorist Mark Dice, the author of a book titled The Resistance Manifesto, make a link between the PositiveID and the Biblical Mark of the Beast, prophesied to be a future requirement for buying and selling, and a key element of the Book of Revelation. Gary Wohlscheid, president of These Last Days Ministries, has argued that "Out of all the technologies with potential to be the mark of the beast, VeriChip has got the best possibility right now"." 
-- Wikipedia "In this latest book Joseph P Farrell examines the subject of mind control, but from a very unusual perspective, showing that its basic underlying philosophy, and goal, is not only cosmological in nature, but that the cosmology in view is very ancient, and that mind control of any sort, from the arts to hypnosis, remote electromagnetic technologies and “electroencephalographic dictionaries” has cosmological implications." -- Microcosm and Medium: The Cosmic Implications and Agenda of Mind Control Technologies publisher's description
December 18, 2032. It's lunchtime, and the front page of the e-newspaper has a big story about SpaceX landing on Mars again, but the article at the bottom of the page catches your eye. In bold letters it declares BITCOIN IS DEAD (for the 2,437th time), and you can’t help but roll your eyes as you take a sip of your coffee. Why is bitcoin dead? Climate change! The author claims that bitcoin is incentivizing dangerous practices that will destroy our planet! You can’t help but laugh, considering the fact that bitcoin pretty much single-handedly dismantled the oil and gas industry in the mid-2020s. For the past decade, it has been considered a champion of green energy! So why the current change of heart? Allow me to explain: Since the earliest days, bitcoin mining has been competitive. The first blocks were mined on CPUs, but soon GPUs were hacked to hash SHA256, and then in 2013, the first ASIC miners hit the market. At the same time, the bitcoin price exploded, and the world began to pay attention a little more seriously. Money started to trickle in, and the race to build the most efficient ASIC miner possible intensified. The mining industry exploded! The best ASICs available were being produced as quickly as possible, and people all around the world were plugging them in, hoping to get lucky. Soon, the network was using an incredible amount of energy, and people started to worry: how much power is too much? However, at the same time, the bitcoin miners were still in stiff competition to be the most efficient. The industry was bigger and more competitive than ever, and since ASIC chips were pushing the bleeding edge of manufacturing technique, miners were forced to look for other ways to innovate in order to gain an advantage. A lot of different schemes were hatched, but the miners that chose to invest in aggressively reducing their energy costs were the ones that survived. 
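For readers unfamiliar with what all those CPUs, GPUs, and ASICs are actually racing to compute, here is a minimal, illustrative Python toy. It is not real bitcoin mining (the header layout, difficulty encoding, and nonce handling are all simplified assumptions), but it shows the core loop: find a nonce whose double SHA256 hash falls below a target.

```python
import hashlib

def mine(header: bytes, difficulty_bits: int) -> int:
    """Toy proof-of-work: try nonces until the double-SHA256 of
    (header + nonce), read as a 256-bit integer, falls below a
    target. More difficulty bits = smaller target = more work."""
    target = 2 ** (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(
            hashlib.sha256(header + nonce.to_bytes(8, "little")).digest()
        ).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

# With 16 difficulty bits this takes ~65,000 hashes on average;
# real bitcoin difficulty is dozens of orders of magnitude harder.
nonce = mine(b"example block header", difficulty_bits=16)
print(nonce)
```

An ASIC wins this race simply by evaluating the loop body trillions of times per second at a lower energy cost per hash, which is why efficiency, not cleverness, decides which miners survive.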
As the bitcoin price soared to new heights, the incentive to innovate became extreme, and solar power quickly became the cheapest energy source the world had ever known. Bitcoin was a hero! With the sun burning brightly, humanity could now easily tap into a vast supply of solar energy. Soon, massive solar farms were established in ideal locations around the world, collecting every photon they could. The oil and gas giants of the 20th century lost their dominance of the energy market at an unprecedented rate, as advancements in solar cell tech pushed the cost of electricity down an order of magnitude lower than fossil fuels could ever hope to achieve. The final stake in the fossil fuel grave came when a YouTube video was released, showing how to easily mod your vehicle’s engine to run on solar power, complete with printable 3D parts files. There was even an optional add-on to install an ASIC miner in the trunk, to take advantage of any excess solar energy your car would collect. Very cool! Greenhouse gas emissions dropped to levels a well-meaning politician could only dream of achieving, and it was all thanks to bitcoin! Prices skyrocketed to levels even the most hardened hodlers had trouble not being surprised by, while at the same time the shitcoin market was a sea of red tears for months on end. It was an incredible thing to witness, no doubt. So then why all the fuss? Why is bitcoin dead, once more? Well, after years of aggressive expansion, miners have now covered approximately 37% of the Earth’s land mass with solar panels, and because of this, the earth’s climate has cooled down considerably, causing violent and unpredictable weather in some areas. Solar energy that would normally heat the earth's atmosphere is now being used to compute rounds of SHA256. 
The ASIC miners eventually dissipate the stored energy as heat, but since the advent of underground mining practices (to help protect advanced ASIC chips from cosmic ray degradation), this heat is absorbed by the bedrock instead of the air, and the effects have been quite noticeable. Beyond the land, there are even rumours of huge and hostile solar-powered mining farms floating off the coast of Africa. The so-called bitcoin pirates of the high seas! What a time to be alive. But what now? Will bitcoin die? What solutions are possible? There is one group of miners that are battling back by running outdated hardware from the mid-2010s. They claim that the old ASIC machines run hotter and less efficiently, so they’re helping warm the earth more per hash… but another article called them out as being “energy-wasting, idealist, crypto-hippies”, so maybe that isn’t the best solution after all. You look up the e-news page, and see the SpaceX article staring back at you. Wait! Suddenly it hits you: if the solar panels were in space, humanity’s problem would be solved! You pull out your phone and head straight to twitter to hit up the man himself directly: “@ElonMusk you should build a solar mining farm in space! That would be great. Thx” You can’t help but smile as you put your phone back in your pocket. Long ago you learned that bitcoin isn’t dead, and the faithful hodler has nothing to worry about. And besides, Elon is a smart dude, chances are he’s already two steps ahead of you on this one. Now then, time to check coinmarketcap just once more before you get back to work… [Edit: fixed a typo or two. Edit 2: updated the story to be more thermodynamically correct. Shoutout to the physicists in the comments for keeping things in check :D ]
[Article] Debunking the theory that a "deflationary" currency cannot be the basis of a functioning economy
Many economists argue that a low level of inflation (approximately 1-2%) is required in order to maintain a productive and functioning economy. This is evidenced by the fact that most central banks have low-level inflation as a target of their monetary policy objectives: the European Central Bank, the Bank of England, and the Federal Reserve, to name a few. As a result, detractors of bitcoin say that it can never become a currency as it is deflationary in nature. That is, there will only ever be 21 million bitcoin in existence. This means that once all of these coins are in circulation, there will be no new supply of bitcoin, and so any increase in demand will result in a price increase. Currently there is around 4.3% annual inflation of Bitcoin's supply, and by 2028 that is projected to fall below 1%. Furthermore, if the anonymous 'Satoshi' has truly vanished then there are another 1M coins out of circulation; and some studies suggest the total number of lost bitcoin is nearing 3M coins, a number that can only increase over time. Due to these 'missing' bitcoins, the supply of Bitcoin will become increasingly scarce, and so their value is expected to rise given a constant or increasing level of demand. This means that the prices of goods and services will fall relative to their bitcoin valuation, resulting in deflation (deflation = the price level of goods & services in an economy decreasing). The traditional argument then goes as follows: because goods & services become cheaper over time, saving is incentivized. After all, why would one buy a car for 1000 bits when it can be purchased for 998 bits tomorrow? A common example people point to as evidence for this is the infamous 10,000 BTC pizza purchase in 2010, which at today's valuation costs 100M USD. However, this argument against bitcoin as a currency is flawed on two levels. 
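The supply figures cited above (the 21 million cap, issuance falling below 1% inflation) follow directly from bitcoin's halving schedule and can be checked with a few lines of Python. This is a simplified sketch that ignores the satoshi-level integer rounding the real protocol performs:

```python
def total_supply(halvings: int) -> float:
    """Cumulative BTC issued after a number of completed halving eras.
    Each era lasts 210,000 blocks; the block subsidy starts at 50 BTC
    and halves every era, forming a geometric series."""
    subsidy = 50.0
    supply = 0.0
    for _ in range(halvings):
        supply += 210_000 * subsidy
        subsidy /= 2
    return supply

print(total_supply(1))   # 10500000.0 — half the cap after the first era alone
print(total_supply(33))  # just under 21,000,000 — the cap is approached, never reached
```

The series sums toward, but never quite reaches, 21 million, which is why new supply effectively dries up long before the last halving.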
(1) When pointing to examples such as the pizza purchase, or the rapid increase in bitcoin's value, people are misattributing the cause of the deflation by assuming it is to do with bitcoin's supply. In fact, in the years since the pizza purchase, the total supply of bitcoin has increased from 3 million BTC to 16 million BTC. This is a more inflationary supply increase than even the USD saw over the same period of time. The real causes of bitcoin's price increase (and thus deflationary properties) in this period can be attributed to the parabolic nature of adoption that bitcoin has seen since its creation. When looking at the practical nature of bitcoin as a world currency, and then drawing stats from the coin in its infancy, you are committing the fallacy of false equivalence, as the evidence presented is from a period of increasing adoption while a global currency would imply full or near-full adoption. At the 'early adopters' stage we will see major +/- % fluctuations regularly; however, if worldwide adoption were achieved then these value changes would be far smaller and much less significant. For example, the dollar, the world reserve currency, fluctuates on average by 92 pips in a day (1 pip = 0.0001 USD). Apply this same level of stability to a mass-adopted bitcoin and we see that the price fluctuations would become far smaller and less significant the greater the capitalization of the currency. Thus, in order to assess the viability of bitcoin as a world currency, one must start with a situation where bitcoin is a world currency in the first place. (2) The second flaw of this argument is to assume that deflation will always lead to a deflationary spiral and thus the collapse of the economy. By this same logic, one could argue that inflation will always lead to an inflationary spiral and thus economic collapse, as people see price levels rising and are incentivized to spend their money NOW before prices increase any further. 
This then pushes prices up further and the effect spirals out of control. Clearly, though, inflation does not always lead to an inflationary spiral, since all Western economies operate on an inflationary model. Using this empirically flawed logic as an argument against deflation is therefore self-defeating: levels of inflation will not always lead to inflationary spirals, and levels of deflation will not always lead to deflationary spirals. It is an excessive quantity of inflation or deflation that produces a spiral, not the attribute of inflation or deflation in isolation. In the same way that 1-2% inflation per year is small enough not to trigger an inflationary panic, a small amount of deflation each year would not trigger a deflationary spiral. In fact, we have evidence to support this claim. In the UK over the period 1983-2006, interest rates were higher than the rate of inflation. This means consumers were incentivized to save instead of spend, since saved money would have greater purchasing power in the future (i.e. there was deflationary pressure), yet we did not see an economic meltdown during this time. What we actually saw over this period was a DECREASE in the savings ratio of the average UK household, from around 15% of income to just under 10%, despite the fact that any money saved would have compounded roughly 5% more inflation-adjusted purchasing power per annum. At first this might seem irrational, but there are some speculative explanations. One theory suggests that consumers do not notice inflationary or deflationary pressures in small quantities and thus do not base economic decisions on them. Another holds that despite deflationary pressures, some purchases are necessary and cannot be delayed: i.e.
the supermarket shop might be a small percentage cheaper in a year's time, but it is necessary to do it now in order to survive. Finally, it can also be argued that as deflationary pressures make consumers feel wealthier, they are more inclined to go out and spend that wealth, thus decreasing their savings rate. The arguments presented above suggest that Keynesian economic thinking may be too narrow, and that an economy can run on a currency with deflationary pressures: in the right quantity, these pressures will not result in a deflationary spiral, and they have the advantage of not eroding the wealth of the population in a way that benefits the wealthy and hinders the poor (see: threshold effects of inflation for more on this matter). While this article has argued that a deflationary currency can run an economy, which model of the economy is preferable is a topic for a future article. Till next time, Logical Crypto

Sources:
https://en.wikipedia.org/wiki/Inflation_targeting#Summary
https://www.theatlantic.com/business/archive/2013/12/why-bitcoin-will-never-be-a-currency-in-2-charts/282364/
https://charts.bitcoin.com/chart/inflation#lf
https://cointelegraph.com/storage/uploads/view/1d067f3721f10f0a76439de9860a4e54.png
https://qz.com/1107843/bitcoins-btc-new-record-price-of-6000-means-satoshi-nakamoto-is-worth-5-9-billion/
http://uk.businessinsider.com/nearly-4-million-bitcoins-have-been-lost-forever-study-says-2017-11
https://en.bitcoin.it/wiki/Laszlo_Hanyecz
https://upload.wikimedia.org/wikipedia/en/5/58/MB%2C_M1_and_M2_aggregates_from_1981_to_2012.png
https://blockchain.info/charts/n-transactions-total?timespan=all
https://en.wikipedia.org/wiki/False_equivalence
https://www.economicshelp.org/wp-content/uploads/2012/01/inflation-interest-rates-1945-2011.png
https://tradingeconomics.com/united-kingdom/personal-savings
Console gaming is hardly different from PC gaming, and much of what people say about PC gaming to put it above console gaming is often wrong.
I’m not sure about you, but for the past few years, I’ve been hearing people go on and on about the PC's "superiority" over the console market. People cite various reasons why they believe gaming on a PC is "objectively" better than console gaming, often related to power, cost, ease of use, and freedom. …Only problem: much of what they say is wrong. There are many misconceptions thrown around in the PC vs console debate that I believe need to be addressed. This isn't about "PC gamers being wrong" or "consoles being the best," absolutely not. I just want to cut through some of the claims people use to put down console gaming, and show that console gaming is incredibly similar to PC gaming. Yes, this is coming from someone who mainly games on console, but I'm also getting a new PC that I will game on, not to mention the 30 PC games I already own and play. I'm not particularly partial to one over the other. I will mainly focus on the PlayStation side of the consoles, because I know it best, but much of what I say applies to Xbox as well. Just because I don't point out many specific Xbox examples doesn't mean they aren't out there.
“PCs can use TVs and monitors.”
This one isn't so much a misconception as it is the implication of one, and overall just… confusing. It appears in some articles and in the pcmasterrace "why choose a PC" section, where they're practically implying that consoles can't do this. Yes, as long as your PC's ports match up with your screen's inputs, you can plug a PC into either… but you can do the same with a console, again, as long as the ports match up. I'm guessing the idea here is that gaming monitors often use DisplayPort, as do most dedicated GPUs, while consoles are generally restricted to HDMI… But even so, monitors often have HDMI ports. In fact, PC Magazine just released their list of the best gaming monitors of 2017, and every single one of them has an HDMI port. A PS4 can be plugged into these just as easily as a GTX 1080. Even if the monitor/TV doesn't have HDMI or AV to connect with your console, just use an adapter. If you have a PC with ports that don't match your monitor/TV… use an adapter. I don't know what the point of this argument is, but it's made a worrying amount of times.
“On PC, you have a wide range of controller options, but on console you’re stuck with the standard controller."
Are you on PlayStation and wish you could use a specific type of controller that suits your favorite kind of gameplay? Despite what some may believe, you have just as many options as PC. Want to play fighting games with a classic arcade-style board, featuring the buttons and joystick? Here you go! Want to get serious about racing and get something more accurate and immersive than a controller? Got you covered. Absolutely crazy about flying games and, like the racers, want something better than a controller? Enjoy! Want Wii-style motion controls? They've been around since the PS3. Prefer the form factor of the Xbox One controller but own a PS4? Hori's got you covered. And of course, if keyboard and mouse is what keeps you on PC, there's a PlayStation-compatible solution for that. Want to use the keyboard and mouse you already own? Where there's a will, there's a way. Of course, these aren't isolated examples; there are plenty of options for each of these kinds of controllers. You don't have to be on PC to enjoy alternate controllers.
“On PC you could use Steam Link to play anywhere in your house and share games with others.”
PS4 Remote play app on PC/Mac, PSTV, and PS Vita. PS Family Sharing. Using the same PSN account on multiple PS4s/Xbox Ones and PS3s/360s, or using multiple accounts on the same console. In fact, if multiple users are on the same PS4, only one has to buy the game for both users to play it on that one PS4. On top of that, only one of them has to have PS Plus for both to play online (if the one with PS Plus registers the PS4 as their main system). PS4 Share Play; if two people on separate PS4s want to play a game together that only one of them owns, they can join a Party and the owner of the game can have their friend play with them in the game. Need I say more?
“Gaming is more expensive on console.”
Part 1: the Software. This is one that I find… genuinely surprising. There have been a few times I've mentioned that part of the reason I chose a PS4 is for budget gaming, only to be told that "games are cheaper on Steam." To be fair, there are a few games on PSN/XBL that are more expensive than they are on Steam, so I can see how someone could believe this… but apparently they forgot about discs. Dirt Rally, a hardcore racing sim, is… still $60 on all 3 platforms digitally… even though its successor is out.
See my point? Oftentimes the game is cheaper on console because of the disc alternative that's available for practically every console-available game, even when the game is brand new. Dirt 4 - remember that Dirt Rally successor I mentioned?
Yes, you could either buy this relatively new game digitally for $60, or just pick up the disc for a discounted price. And again, this is for a game that came out 2 months ago, while its predecessor's digital cost is locked at $60. Of course, I'm not going to ignore the fact that Dirt 4 is currently (as of writing this) discounted on Steam, but on PSN it also happens to be discounted by about the same amount. Part 2: the Subscription. Now… let's not ignore the elephant in the room: PS Plus and Xbox Gold. These would be ignorable if they weren't required for online play (on the PlayStation side, it's only required for PS4, but still). So yes, it's something that will be included in the cost of your PS4 or Xbox One/360, assuming you play online. Bummer, right? Here's the thing: although you have to factor this $60 cost in with your console, you can make it balance out at worst, and make it work for you as a budget gamer at best. As nice as it would be to not deal with the price at all, it's not a problem if you use it correctly. Imagine going to a new restaurant. This restaurant has some meals you can't get anywhere else, and fair prices compared to competitors. Only problem: you have to pay a membership fee to have the sides. You can have the main course, sit down and enjoy your steak or pasta, but if you want a side to make a full meal, you have to pay an annual fee. Sounds shitty, right? But here's the thing: not only does this membership allow you to have sides with your meal, it also lets you eat two meals for free every month, and gives you exclusive discounts on other meals, drinks, and desserts. Let's look at PS Plus for a minute: for $60 per year, you get:
2 free PS4 games, every month
2 free PS3 games, every month
1 PS4/PS3 and Vita compatible game, and 1 Vita-only game, every month
Exclusive/Extended discounts, especially during the weekly/seasonal sales (though you don’t need PS Plus to get sales, PS Plus members get to enjoy the best sales)
access to online multiplayer
So yes, you're paying extra because of that membership, but what you get with that deal pays for it and then some. In fact, let's ignore the discounts for a minute: you get 24 free PS4 games, 24 free PS3 games, and 12 Vita-only plus 12 Vita-compatible games, up to 72 free games every year. Even if you only own one of these consoles, that's still 24 free games a year. Sure, maybe you get games one month that you don't like; then just wait until next month. In fact, let's look at Just Cause 3 again. It was free for PS Plus members in August, which is a pretty big deal. Why is this significant? Because it's, again, a $60 digital game. That means with this one download, you've balanced out your $60 annual fee. Meaning? Every free game after that is money saved; every discount after that is money saved. And this is a trend: every year, PS Plus will release a game that balances out the entire service cost, then another 23 more that only add icing to that budget cake. (Though you could just count games toward paying off PS Plus until you hit $60 in savings.) All in all, PS Plus, and Xbox Gold which offers similar options, saves you money. On top of that, again, you don't need these memberships to get discounts, but with them, you get more discounts. Now, I've seen a few Steam games go up for free for a week, but what about being free for an entire month? And even if you want to talk about Steam summer sales, what about the PSN summer sale, or again, disc discounts? A lot of research and math would be needed to see whether every console gamer would save money compared to every Steam gamer for the same games, but at the very least? The costs balance out, at worst. Part 3: the Systems
Xbox and PS2: $299
Xbox 360 and PS3: $299 and $499, respectively
Xbox One and PS4: $499 and $399, respectively.
Rounded up a few dollars, that's $1,000 - $1,300 in day-one consoles, just to keep up with the games! Crazy, right? So-called budget systems, such a rip-off. Well, keep in mind that the generations here aren't short. The 6th generation, from the launch of the PS2 to the launch of the next-generation consoles, lasted 5 years, or 6 based on the launch of the PS3 (though you could say it was 9 or 14, since the Xbox wasn't discontinued until 2009, and the PS2 was supported all the way to 2014, a year after the PS4 was released). The 7th gen lasted 7 - 8 years, again depending on whether you count from the launch of the Xbox 360 or the PS3. The 8th gen so far has lasted 4 years. That's 17 years that the console money is spread over. If you had a Netflix subscription at its original $8 monthly plan for that amount of time, that would be over $1,600 total. And let's be fair here: just as you could upgrade your PC hardware whenever you wanted, you didn't have to get a console at launch. Let's look at PlayStation again. In 2002, only two years after its release, the PS2's retail price was cut from $300 to $200. The PS3 Slim, released 3 years after the original, was $300, $100-$200 lower than the original retail cost. The PS4? You could've gotten either the Uncharted bundle for $350, or one of the PS4 Slim bundles for $250. This all brings it down to $750 - $850, which again, is spread over a decade and a half. This isn't even counting used consoles, sales, or the further price cuts that I didn't mention. Even if that still sounds like a lot of money to you, even if you're laughing at the thought of buying new systems every several years because your PC "is never obsolete," tell me: how many parts have you changed out in your PC over the years? How many GPUs have you been through? CPUs? Motherboards? RAM sticks, monitors, keyboards, mice, CPU coolers, hard drives? It adds up. You don't need to replace your entire system to spend a lot of money on hardware.
Even if you weren't upgrading for the sake of upgrading, I'd be amazed if hardware you've been pushing with gaming lasted even a third of that 17-year period. Computer parts aren't designed to last forever, and they really won't when you're pushing them with intensive gaming for hours upon hours. Generally speaking, your components might last you 6-8 years, if you've got the high-end stuff. But let's assume you bought a system 17 years ago that was a beast for its time, something so powerful that even as its parts degraded over time, it's still going strong. Problem is: you will have to upgrade something eventually. Even if you've managed to get this far into the gaming realm with the same 17-year-old hardware, I'm betting you didn't do it with a 17-year-old operating system. How much did Windows 7 cost you? Or 8.1? Or 10? Oh, and don't think you can skirt the cost by getting a pre-built system; the cost of Windows is embedded in the cost of the machine (why else would Microsoft allow their OS to go on so many machines?). Sure, Windows 10 was a free upgrade for a year, but that's only half of its lifetime: you can't get it for free now, and couldn't for the past year. On top of that, the free period was an upgrade; you had to pay for 7 or 8 first anyway. Point is, as much as one would like to say they didn't need to buy a new system every so often for the sake of gaming, that doesn't mean they haven't been paying for hardware, and even if they've only been PC gaming recently, they'll be spending money on hardware soon enough.
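The back-of-envelope console math above is easy to check. A quick sketch using the day-one launch prices and the 17-year span from the text (the Netflix figure assumes its original $8/month plan):

```python
# Back-of-envelope: day-one console costs across the 6th-8th
# generations (17 years per the text), versus a hypothetical
# $8/month subscription held for the same period.
launch_costs = {
    "PS2": 299, "Xbox": 299,
    "PS3": 499, "Xbox 360": 299,
    "PS4": 399, "Xbox One": 499,
}
years = 17

total = sum(launch_costs.values())  # buying every single one
one_brand = launch_costs["PS2"] + launch_costs["PS3"] + launch_costs["PS4"]
netflix = 8 * 12 * years

print(f"One PlayStation per gen: ${one_brand} (~${one_brand / years:.0f}/year)")
print(f"All six day-one consoles: ${total}")
print(f"Netflix at $8/month for {years} years: ${netflix}")
```

Sticking to one brand lands inside the article's $1,000 - $1,300 range, and the subscription comparison comes out to the "over $1,600" it mentions.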
“PC is leading the VR—“
Let me stop you right there. If you added together the total number of Oculus Rifts and HTC Vives sold to this day, and threw in another 100,000 just for the sake of it, that number would still be under the number of PSVR headsets sold. Why could this possibly be? Well, for a simple reason: affordability. The systems needed to run the PC headsets cost $800+, and the headsets are $500 - $600, when discounted. PSVR, on the other hand, costs $450 for the full bundle (headset, camera, and Move controllers, with a demo disc thrown in), and can be played on either a $250 - $300 console or a $400 console, the latter recommended. Even if you want to say the Vive and Rift are more refined, a full PSVR set, system and all, could cost just over $100 more than a Vive headset alone. If anything, PC isn't leading the VR gaming market; the PS4 is. It's the system bringing VR to the most consumers, showing them what the future of gaming could look like. Not to mention that as the PlayStation line grows more powerful (4.2 TFLOP PS4 Pro, 10 TFLOP "PS5…"), it won't be long until the PlayStation line can run the same VR games as PC. Either way, this shows that there is a console equivalent to the PC VR options. Sure, there are some games you'd only be able to play on PC, but there are also some games you'd only be able to play on PSVR. …Though to be fair, if we're talking about VR in general, these headsets don't even hold a candle to, surprisingly, Gear VR.
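The affordability claim above works out like this; a quick sketch using the article's approximate 2017 prices (taking the cheaper $250 PS4 Slim and the $600 discounted Vive headset-only price as the comparison points):

```python
# Price check: a full PSVR setup (headset bundle plus a console)
# versus a Vive headset alone, using the rough prices from the text.
psvr_bundle = 450    # headset, camera, Move controllers, demo disc
ps4_slim = 250       # lower end of the $250-$300 range (assumption)
vive_headset = 600   # discounted headset-only price from the text

full_psvr_setup = psvr_bundle + ps4_slim
difference = full_psvr_setup - vive_headset
print(f"Full PSVR setup: ${full_psvr_setup}, "
      f"${difference} more than a Vive headset alone")
```

With the pricier Slim bundle or the Pro, the gap grows, but the point stands: a complete PSVR rig costs in the region of a bare PC headset.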
“If it wasn’t for consoles holding devs back, then they would be able to make higher quality games.”
This one is based on the idea that consoles are so "low spec" that when a developer has to take them into account, they can't design the game to be nearly as good as it would be otherwise. I mean, have you ever seen the minimum specs for games on Steam? GTA V
Actually, bump up all the memory requirements to 8 GBs, and those are some decent specs, relatively speaking. And keep in mind these are the minimum specs to even open the games. It’s almost as if the devs didn’t worry about console specs when making a PC version of the game, because this version of the game isn’t on console. Or maybe even that the consoles aren’t holding the games back that much because they’re not that weak. Just a hypothesis. But I mean, the devs are still ooobviously having to take weak consoles into mind right? They could make their games sooo much more powerful if they were PC only, right? Right? No. Not even close. iRacing
CPU: Intel Core i3, i5, i7 or better or AMD Bulldozer or better
Memory: 8 GB RAM
GPU: NVidia GeForce 2xx series or better, 1GB+ dedicated video memory / AMD 5xxx series or better, 1GB+ dedicated video memory
These are PC only games. That’s right, no consoles to hold them back, they don’t have to worry about whether an Xbox One could handle it. Yet, they don’t require anything more than the Multiplatform games. Subnautica
So what’s the deal? Theoretically, if developers don’t have to worry about console specs, then why aren’t they going all-out and making games that no console could even dream of supporting? Low-end PCs. What, did you think people only game on Steam if they spent at least $500 on gaming hardware? Not all PC gamers have gaming-PC specs, and if devs close their games out to players who don’t have the strongest of PCs, then they’d be losing out on a pretty sizable chunk of their potential buyers. Saying “devs having to deal with consoles is holding gaming back” is like saying “racing teams having to deal with Ford is holding GT racing back.” A: racing teams don’t have to deal with Ford if they don’t want to, which is probably why many of them don’t, and B: even though Ford doesn’t make the fastest cars overall, they still manage to make cars that are awesome on their own, they don’t even need to be compared to anything else to know that they make good cars. I want to go back to that previous point though, developers having to deal with low-end PCs, because it’s integral to the next point:
“PCs are more powerful, gaming on PC provides a better experience.”
This one isn't so much a misconception as it is… misleading. Did you know that according to the Steam Hardware & Software Survey (July 2017), the percentage of Steam gamers who use a GPU less powerful than a PS4 Slim's GPU is well over 50%? Things get dismal when compared to the PS4 Pro (or Xbox One X). On top of that, the percentage of PC gamers who own an Nvidia 10-series card is about 20% (about 15% for 1060, 1070, and 1080 owners). Now, to be fair, the large majority of gamers have CPUs with considerably high clock speeds, which is the main factor in CPU gaming performance. But the number of Steam gamers with as much RAM as a PS4 or Xbox One, or more, is less than 50%, which can really bottleneck what those CPUs can handle. These numbers are hardly better than they were in 2013, all things considered. Sure, a PS3/360 weeps in the face of even a $400 PC, but in this day and age, consoles have definitely caught up. We could mention that even 1% of Steam accounts represents over 1 million accounts, but that doesn't hold up against the tens of millions of 8th-gen consoles sold; looking at it that way, sure, the number of Nvidia 10-series owners is over 20 million, but over 5 times more 8th-gen consoles have been sold than that. Basically, even though PCs run on a spectrum, saying they're more powerful "on average" is actually wrong. Sure, they have the potential to be more powerful, but most of the time, people aren't willing to pay the premium to reach those extra bits of performance. Now why is this important? What matters are the people who paid the premium cost for premium parts, right?
Because of the previous point: PCs don’t have some ubiquitous quality over the consoles, developers will always have to keep low-end PCs in mind, because not even half of all PC players can afford the good stuff, and you have to look at the top quarter of Steam players before you get to PS4-Pro-level specs. If every Steam player were to get a PS4 Pro, it would be an upgrade for over 60% of them, and 70% of them would be getting an upgrade with the Xbox One X. Sure, you could still make the argument that when you pay more for PC parts, you get a better experience than you could with a console. We can argue all day about budget PCs, but a console can’t match up to a $1,000 PC build. It’s the same as paying more for car parts, in the end you get a better car. However, there is a certain problem with that…
“You pay a little more for a PC, you get much more quality.”
The idea here is that the more you pay for PC parts, the performance increases at a faster rate than the price does. Problem: that’s not how technology works. Paying twice as much doesn’t get you twice the quality the majority of the time. For example, let’s look at graphics cards, specifically the GeForce 10 series cards, starting with the GTX 1050.
1.35 GHz base clock
2 GB VRAM
This is our reference, our basis of comparison. Any percentages will be based on the 1050’s specs. Now let’s look at the GTX 1050 Ti, the 1050’s older brother.
1.29 GHz base clock
4 GB VRAM
This is pretty good. You only increase the price by about 27%, and you get an 11% increase in floating point speed and a 100% increase (double) in VRAM. Sure, you get a slightly lower base clock, but the rest definitely makes up for it. In fact, according to GPU Boss, the Ti managed 66 fps, a 22% increase in frame rate for Battlefield 4, and a 54% increase in mHash/second in bitcoin mining. The cost increase is worth it, for the most part. But let's get to the real meat of it: what happens when we double our budget? Surely we should see a massive increase in performance; I bet some of you expect that twice the cost means more than twice the performance. The closest price comparison for double the cost is the GTX 1060 (3 GB), so let's take a look at that.
1.5 GHz base clock
3 GB VRAM
Well… not substantial, I'd say. About a 50% increase in floating point speed, an 11% increase in base clock speed, and a 1 GB decrease in VRAM. For [almost] doubling the price, you don't get much. Well, surely raw specs don't tell the full story, right? Let's look at some real-world comparisons. Once again, according to GPU Boss, there's a 138% increase in hashes/second for bitcoin mining, and at 99 fps, an 83% frame rate increase in Battlefield 4. Well then, raw specs do not tell the whole story! Here's another one, the 1060's big brother… or, well, slightly-more-developed twin.
1.5 GHz base clock
6 GB VRAM
Seems reasonable, another $50 for a decent jump in power and double the memory! But, as we’ve learned, we shouldn’t look at the specs for the full story. I did do a GPU Boss comparison, but for the BF4 frame rate, I had to look at Tom’s Hardware (sorry miners, GPU boss didn’t cover the mHash/sec spec either). What’s the verdict? Well, pretty good, I’d say. With 97 FPS, a 79% increase over the 1050— wait. 97? That seems too low… I mean, the 3GB version got 99. Well, let’s see what Tech Power Up has to say... 94.3 fps. 74% increase. Huh. Alright alright, maybe that was just a dud. We can gloss over that I guess. Ok, one more, but let’s go for the big fish: the GTX 1080.
1.6 GHz base clock
8 GB VRAM
That jump in floating point speed definitely has to be something, and 4 times the VRAM? Sure, it's 5 times the price, but as we saw, raw power doesn't always tell the full story. GPU Boss returns to give us the rundown: how do these cards compare in the real world? Well… a 222% (over three-fold) increase in mHash speed, and a 218% increase in FPS for Battlefield 4. That's right: for 5 times the cost, you get 3 times the performance. Truly, the raw specs don't tell the full story. Increase the cost by 27%, and you increase the frame rate in our example game by 22%. Increase the cost by 83%, and you increase the frame rate by 83%. Sounds good, but increase the cost by 129% and you only get a 79% increase in frame rate (a 50-point cost/performance gap). Increase it by 358%, and you increase the frame rate by 218% (a 140-point gap). That's not paying "more for much more power"; that's a steep drop-off after the third-cheapest option. In fact, did you know you have to get to the 1060 (6GB) before you can compare the GTX line to a PS4 Pro? Not to mention that at $250, the price of a 1060 (6GB), you could get an entire PS4 Slim bundle, or that you have to get to the 1070 before you beat the Xbox One X. On another note, let's look at a PS4 Slim…
800 MHz base clock
8 GB VRAM
…Versus a PS4 Pro.
911 MHz base clock
8 GB VRAM
A 128% increase in floating point speed and a 13% increase in clock speed, for a 25% difference in cost. Unfortunately there is no Battlefield 4 comparison to make, but in BF1, the frame rate is doubled (30 fps to 60) and the textures are taken to 11. For what that looks like, I'll leave it up to this bloke. Not to even mention that you can get the texture buffs in 4K. Just as you get a decent increase in performance per dollar with the lower-cost GPUs, the same applies here. It's even worse when you look at the CPU for a gaming PC. The more money you spend, again, the less of a benefit you get per dollar. Hardware Unboxed covers this in a video comparing different levels of Intel CPUs. One thing to note is that the highest i7 option (6700K) in this video was almost always within 10 FPS (15 FPS for a few games) of a certain other CPU on that list. …That CPU was the lowest i3 option (6100). The lowest i3 was $117 and the highest i7 was $339, a 189% price difference for what was, on average, a 30% or smaller difference in frame rate. Even the lowest Pentium option (G4400, $63) was often able to keep up with the i7. The CPU and GPU are usually the most expensive and power-hungry parts of a build, which is why I focused on them (besides being the two most important parts of a gaming PC, outside of RAM). With both, this "pay more to get much more performance" idea is pretty much the inverse of the truth.
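The diminishing-returns pattern above can be tabulated. A sketch under stated assumptions: the dollar figures are approximate 2017 street prices reconstructed from the article's percentage claims (not quoted by it directly), the frame rates are the GPU Boss / Tom's Hardware Battlefield 4 numbers it cites, and the GTX 1050 baseline of 54 fps is implied by those percentages:

```python
# Cost vs performance for the parts discussed above, relative to the
# GTX 1050 baseline. Prices are assumptions reconstructed from the
# article's percentages; fps are the Battlefield 4 figures it cites.
gpus = {                 # name: (price_usd, bf4_fps)
    "GTX 1050":       (109,  54),
    "GTX 1050 Ti":    (139,  66),
    "GTX 1060 (3GB)": (199,  99),
    "GTX 1060 (6GB)": (250,  97),
    "GTX 1080":       (499, 172),
}

base_price, base_fps = gpus["GTX 1050"]
for name, (price, fps) in gpus.items():
    d_cost = 100 * (price / base_price - 1)
    d_fps = 100 * (fps / base_fps - 1)
    print(f"{name:14s} +{d_cost:4.0f}% cost -> +{d_fps:4.0f}% fps")

# Same exercise for the CPU example: i3-6100 vs i7-6700K, with the
# ~30% average frame-rate gap reported by Hardware Unboxed.
i3_price, i7_price = 117, 339
cpu_cost_gap = 100 * (i7_price / i3_price - 1)
print(f"i7 over i3: +{cpu_cost_gap:.0f}% cost for roughly +30% fps")
```

The output makes the drop-off obvious: cost percentage outruns frame-rate percentage from the 1060 (6GB) upward, and the CPU gap is even starker.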
“The console giants are bad for game developers, Steam doesn't treat developers as bad as Microsoft or especially Sony.”
Now, one thing you might've heard is that the PS3 was incredibly difficult for developers to make games for, which for some fueled the idea that console hardware is difficult to develop for compared to PC… but this ignores a very basic idea we've already touched on: if the devs don't want to make the game compatible with a system, they don't have to. In fact, this is why Left 4 Dead and other Valve games aren't on PS3: they didn't want to work with its hardware, calling it "too complex." This didn't stop the game from selling well over 10 million units worldwide. If anything, this was a problem for the PS3, not the dev team. It also ignores that games like LittleBigPlanet, Grand Theft Auto IV, and Metal Gear Solid 4 all came out the same year as Left 4 Dead (2008) on PS3. Apparently, plenty of other dev teams didn't have much of a problem with the PS3's hardware, or at the very least, they got used to it soon enough. On top of that, when developing the 8th-gen consoles, both Sony and Microsoft sought CPUs that were easier for developers, which included making decisions that considered the consoles' use for more than gaming. Also, using their single-chip proprietary CPUs is cheaper and more energy efficient than buying pre-made CPUs and boards, which is a far better reason for using them than some conspiracy about Sony and MS trying to make devs' lives harder. Now, console exclusives are apparently a point of contention: it's often said that exclusivity can cause developers to go bankrupt. However, exclusivity doesn't have to be a bad thing for the developer. For example, when Media Molecule pitched their game to a publisher (Sony, coincidentally), they didn't end up tied into something detrimental to them. Their initial funding lasted for 6 months. From then, Sony offered additional funding in exchange for console exclusivity.
This may sound concerning to some, but the game went on to sell almost 6 million units worldwide and launched Media Molecule into the gaming limelight. Sony later bought the development studio, but 1: this was in 2010, two years after LittleBigPlanet’s release, and 2: Media Molecule seem pretty happy about it to this day. If anything, signing up with Sony was one of the best things they could’ve done, in their opinion. Does this sound like a company that has it out for developers? There are plenty of examples people will use to put Valve in a good light, but even Sony is comparatively good to developers.
“There are more PC gamers.”
The total number of active PC gamers on Steam has surpassed 120 million, which is impressive, especially considering that this number is double 2013’s figure (65 million). But the number of monthly active users on Xbox Live and PSN? About 120 million (1, 2) total. EDIT: You could argue that this isn't an apples-to-apples comparison, sure, so if you want to compare the monthly number of Steam users to consoles instead, Steam has about half of what consoles do, at 67 million. Now, back to the 65 million total user figure for Steam: the best reference I could find for PlayStation's number was an article giving the number of registered PSN accounts in 2013, 150 million. Over a similar four-year period (2009–2013), the number of registered PSN accounts didn’t double, it sextupled (a six-fold increase). Considering the PS4 has already reached 2/3 of the PS3’s lifetime sales despite currently being 3 years younger than its predecessor, I’m sure this trend is at least generally consistent.

For example, let’s look at DOOM 2016, an awesome fast-paced shooting title with graphics galore… Of course, on a single platform, it sold best on PC/Steam: 2.36 million Steam sales, 2.05 million PS4 sales, 1.01 million Xbox One sales. But keep in mind… when you add the console sales together, you get over 3 million sales on the 8th-gen systems. Meaning: this game sold best on console. In fact, the Steam sales have only recently surpassed the PS4 sales. By the way, VGChartz only shows sales of physical copies, so the combined PS4 and Xbox figure, once digital sales are included, is even higher than 3 million. This isn’t uncommon, by the way. Even with games where the PC sales are higher than either console’s, there are generally more console sales in total. But, to be fair, this isn’t anything new. PC gamers haven’t come to dominate the market; the percentages have always been about this much.
PC can end up being the largest single platform for games, but consoles usually sell more copies total. EDIT: There were other examples but... Reddit has a 40,000-character limit.
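The DOOM 2016 comparison above is just arithmetic, and it can be checked in a few lines. A quick back-of-the-envelope tally (the figures are the VGChartz physical-sales numbers quoted above; digital sales would push the console totals higher):

```python
# Physical-sales figures (millions) for DOOM (2016), as cited above.
sales_millions = {
    "Steam": 2.36,
    "PS4": 2.05,
    "Xbox One": 1.01,
}

# Largest single platform: Steam wins on a per-platform basis.
largest = max(sales_millions, key=sales_millions.get)
print(largest)  # Steam

# But the two consoles combined out-sell the PC version.
console_total = sales_millions["PS4"] + sales_millions["Xbox One"]
pc_total = sales_millions["Steam"]
print(round(console_total, 2))   # 3.06 million
print(console_total > pc_total)  # True
```

So both claims hold at once: "best-selling single platform" goes to Steam, while "best-selling family of platforms" goes to the consoles.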
This isn’t to say that there’s anything wrong with PC gaming, and this isn’t to exalt consoles. I’m not here to be the hipster defending the little guy, nor to put down someone or something out of spite. This is about showing that PCs and consoles are overall pretty similar, because there isn’t much dividing them, and that there isn’t anything wrong with being a console gamer. There isn’t some chasm separating consoles and PCs; at the end of the day they’re both computers that are (generally) designed for gaming.

This is about unity as gamers, to show that there shouldn’t be a massive divide just because of the computer system you game on. I want gamers to be in an environment where specs don't separate us; whether you bought a $250 PS4 Slim or just built a $2,500 gaming PC, we’re here to game and should be able to have healthy interactions regardless of platform. I’m well aware that this isn’t going to fix… much, but this needs to be said: there isn’t a huge divide between PCs and consoles; they’re far more similar than people think. Each side has upsides and downsides the other doesn’t. There’s so much more I could touch on, like how you can use SSDs or 3.5-inch hard drives with both, or that even though PC part prices go down over time, so do console prices, but I just wanted to touch on the main points people use to needlessly separate the two kinds of systems (looking at you, PCMR) and correct them, to get the point across. I thank anyone who takes the time to read all of this, and especially anyone who doesn’t take what I say out of context. I also want to note that, again, this isn’t “anti-PC gamer.” If it were up to me, everyone would be a hybrid gamer. Cheers.