
Building a Gaming PC

Gaming monitors are expensive, but for first-person shooters they make a big difference. Couple a nice gaming monitor with a high-end video card and it's gaming heaven.
 
https://translate.google.com/translate?hl=en&sl=fr&tl=en&u=https://www.cpchardware.com/intel-prepare-la-riposte-a-ryzen/

Intel is preparing a response to Ryzen. If this is true, it sounds like Ryzen might actually be the real deal. I don't remember Intel responding to anything from AMD in like 10-plus years.

"Leaked by Canard PC The article describes the I7-7740k and I5-7640k with higher clock speeds and a TDP of 100w vs the 92w of the current parts.

While an extra 100MHz base isn't a big deal, the article goes on to claim that the 7640K will come with hyperthreading, which traditionally hasn't been available on any i5 SKUs.

If true, this is a massive value increase from Intel on the i5, as the difference between these two levels of SKUs has been around $100."

https://www.reddit.com/r/pcgaming/comments/5sl951/intel_readies_kaby_lake_core_i77740k_and_c
@Jack Brickman
@gourimoko
@Chris
 
I hope that's the case. I've decided I'm doing a big-boy build this spring/summer, as I can assemble stuff on the cheap. Waiting to see AMD's new products though, hoping they push Intel/Nvidia prices down a tad.

Ordered an EVGA 850W PSU for 70 bucks already though :D
 
Intel may use AMD GPUs to challenge Nvidia's rising power

"
AMD already owns the graphics in consoles, and the next place it could plant its flag could be the last anyone expected: Intel.

While rumors of a possible deal have circulated all year, something firmer arose Monday night when Kyle Bennett, longtime editor of enthusiast hardware site HardOCP.com, posted that the ink on the deal was already dry. “The licensing deal between AMD and Intel is signed and done for putting AMD GPU tech into Intel’s iGPU,” Bennett wrote on his site’s forum.

Officials PCWorld contacted at both companies declined to comment, but reached Wednesday morning, Bennett stood by his comments and added a little more detail.

“To my understanding, Intel has a team of about ~1,000 engineers working on their forward-looking iGPU technology,” Bennett told PCWorld. “Basically, that work will be scrapped and that team and their work will be replaced with AMD teams and technology going forward. There are also Apple implications here as well, and this deal is good for Apple assuredly.”

As bizarre as such a partnership may sound to outsiders, the timing actually makes it more likely. Kevin Krewell, an analyst with Tirias Research, laid out two possible scenarios in a column at Forbes.com that was published Tuesday evening.

First, Intel needs patent protection. Nvidia and Intel began suing each other in 2009 over Nvidia’s nForce chipsets for Intel CPUs. The suits were eventually settled in 2011: Nvidia agreed not to build chipsets for Intel’s Core i7 CPUs, and Intel was free to build graphics cores without getting sued by Nvidia.

The price of Intel’s freedom was high, though: The chip giant agreed to pay Nvidia licensing fees over the next six years totalling $1.5 billion.

After writing the last $200 million check in January 2016, the licensing deal is winding down, which means Intel has to go shopping for patent protection for its graphics cores. As AMD and Nvidia essentially own the lion’s share of graphics patents in the world, developing graphics cores is nigh impossible without licensing deals. Krewell said Intel could just ink a deal and be done with it. The second scenario, however, is far more intriguing, if, as Bennett says, Intel uses Radeon graphics inside of Intel CPUs.

Krewell said the deal would give AMD some much-needed funding. “AMD still has some significant financial headwinds with its debt load and needs cash to fund more R&D,” Krewell said to PCWorld late Tuesday. “The way I’d rationalize AMD’s licensing of Radeon GPU tech to Intel is that Radeon would become the dominant graphics architecture of the PC market and outflank Nvidia in graphics. If Intel then used Radeon GPUs for GPU computing, it would help push back on Nvidia and CUDA.”

Such a deal wouldn’t come cheap, but Intel was already cutting checks of $200 million to $300 million to Nvidia every year. “Intel would have to pony up some significant money to make this deal work,” Krewell told PCWorld. “The amount of extra cash AMD could make on royalties would be very appealing to the shareholders.”

Fans may be concerned that such a deal would all but give up the last advantage AMD’s upcoming Zen-based APUs would have over Intel chips. AMD’s Zen core could equal Intel’s newest cores in x86 performance. Combine that with AMD’s much more powerful graphics cores and you’d have an instant winner.

Financial realities, however, overshadow any moral victories. “Is it better to make a royalty on 80 percent to 90 percent of the PC processor shipments or fight it out for the remaining 10 percent or 20 percent?” Krewell said. AMD can make a lot more money partnering with Intel rather than competing.

For its part, Intel has plenty of reasons to stop sending money to Nvidia. As the GPU maker busily builds market share in self-driving cars, machine learning and more, it’s becoming more of a threat to Intel (which is trying hard to get its own chunk of these businesses). In AMD, Intel would have a partner that offered competitive technology to Nvidia’s—and needed its money. We’ll continue to follow this story and will let you know when we learn more."

http://www.pcworld.com/article/3147...d-gpus-to-challenge-nvidias-rising-power.html
 
Apparently AMD's top-of-the-line 8-core Ryzen chip will only be $450 and is supposedly on par with a $1,000 Intel chip.

This should be interesting.
 
Apparently AMD's top-of-the-line 8-core Ryzen chip will only be $450 and is supposedly on par with a $1,000 Intel chip.

This should be interesting.
Intel is supposedly adding an i5 with hyperthreading to their lineup. There must be something there with Ryzen to get them to react.

Hopefully Intel's next generation moves the i7 to 8 cores and the i5 to 6.
 
Intel is supposedly adding an i5 with hyperthreading to their lineup. There must be something there with Ryzen to get them to react.

Definitely... It's just interesting because if i5s and i7s become the same platform, and the biggest thing separating the two is hyperthreading, then fewer and fewer people will buy the more expensive i7s. So, it's a questionable move by Intel.

If I had to guess, I think they'll use binning and open up the thermal threshold on these chips moving forward to get substantially higher base clocks on the i7s, thus preserving a single-threaded performance gap between the two offerings.

Hopefully Intel's next generation moves the i7 to 8 cores and the i5 to 6.

Well, the reason Intel hasn't done this already (see Skylake, Kaby Lake) is because it's physically difficult to do given the die size of the chip.

The Broadwell-E that Ryzen is designed to compete against doesn't have an iGPU due to space limitations on the die. However, the non-E-series chips do have an iGPU.

For laptops without discrete graphics, this makes more sense; and since more and more mobile chips are being sold, it makes sense to continue the mainline chips with iGPUs. I'm not sure why Intel would be doing a deal with AMD for this, though, since Intel's iGPUs are pretty good, specifically the Iris Pro. But I don't think we'll see 8-core mainline Intel chips for years.

But for gamers... it's not relevant since they'll buy whatever chip they think makes the most sense.

For a time, Skylake made much more sense than, say, Haswell-E; and today, Haswell-E and Skylake both make more sense than Broadwell. These days, there's not much benefit at all to Broadwell-E over Haswell-E, and Kaby Lake, for gaming, is still probably the better bet on price/performance.

So I'm not sure if Intel will rush out 8-core i7s on their mainline series of chips if it means dropping the iGPU, especially given how ubiquitous the Intel driver is these days; many, many computers, including workstations and office machines, don't have discrete graphics because Intel moved graphics into the CPU and off the motherboard.

I just can't see them going backwards.
 
4K G-Sync HDR at 144Hz. The search for the perfect gaming monitor could be over
https://www.pcgamesn.com/nvidia/nvidia-gsync-hdr

"
Sadly Nvidia didn’t announce the new GTX 1080 Ti in their CES press conference yesterday. But they also didn’t announce their new G-Sync HDR tech either, and yet here it is in two monitors from Asus and Acer.

The new tech is still a while off yet, but until then these are our picks for the best gaming monitors around today.



Earlier this week AMD announced their FreeSync 2 technology, along with Samsung's screen manufacturing support, which eliminates the input lag from HDR gaming and Nvidia are now promising the same thing from their G-Sync HDR update."

@Jack Brickman
@gourimoko
 
4K G-Sync HDR at 144Hz. The search for the perfect gaming monitor could be over
https://www.pcgamesn.com/nvidia/nvidia-gsync-hdr

"
Sadly Nvidia didn’t announce the new GTX 1080 Ti in their CES press conference yesterday. But they also didn’t announce their new G-Sync HDR tech either, and yet here it is in two monitors from Asus and Acer.

The new tech is still a while off yet, but until then these are our picks for the best gaming monitors around today.



Earlier this week AMD announced their FreeSync 2 technology, along with Samsung's screen manufacturing support, which eliminates the input lag from HDR gaming and Nvidia are now promising the same thing from their G-Sync HDR update."

@Jack Brickman
@gourimoko

Very interesting...

Want to see how the panel does with color reproduction.. A *lot* of the monitors I looked at before buying mine had tons of issues with accurate colors, and that's a must for my line of work... That was why I went with the 100Hz IPS panel.

Also, I'm not sure if I can go back to a 16:9 working area...

I think 34" and 3440x1440 is good with me.. Get that up to 144hz on an IPS panel and with minimal lag, flicker, and accurate colors and I'm down.
 
Very interesting...

Want to see how the panel does with color reproduction.. A *lot* of the monitors I looked at before buying mine had tons of issues with accurate colors, and that's a must for my line of work... That was why I went with the 100Hz IPS panel.

Also, I'm not sure if I can go back to a 16:9 working area...

I think 34" and 3440x1440 is good with me.. Get that up to 144hz on an IPS panel and with minimal lag, flicker, and accurate colors and I'm down.
It's not an upgrade I'd make until it's down to like 700 dollars in 1+ years. What's cool about 4K is that it's exactly 4x the resolution of 1080p, so it's 1:1 for pixels. That means content doesn't look like smeared ass at that resolution if you really needed to dial back.
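To put numbers on that 1:1 point: 4K is exactly twice 1080p in each dimension, so every 1080p pixel maps onto a clean 2x2 block of 4K pixels. A minimal Python sketch of the arithmetic (nothing here beyond the standard resolutions):

```python
# 4K vs 1080p: the scale factor is a whole number in both dimensions,
# so 1080p content maps onto 2x2 pixel blocks with no interpolation.
w4k, h4k = 3840, 2160
w1080, h1080 = 1920, 1080

assert w4k / w1080 == 2.0 and h4k / h1080 == 2.0  # clean integer scale
assert (w4k * h4k) / (w1080 * h1080) == 4.0       # exactly 4x the pixels

# Contrast 1440p -> 4K: a fractional 1.5x factor, which forces
# interpolation and is why non-integer upscaling looks soft.
print(w4k / 2560, h4k / 1440)  # -> 1.5 1.5
```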
 
It's not an upgrade I'd make until it's down to like 700 dollars in 1+ years. What's cool about 4K is that it's exactly 4x the resolution of 1080p, so it's 1:1 for pixels. That means content doesn't look like smeared ass at that resolution if you really needed to dial back.

Are you upscaling much of your content on your monitor? I only upscale when playing retro games via emulation... Can't think of the last game I played in a non-native resolution.. Any game that doesn't support 1440p I'll either patch or just won't fucking play it.. lol!!

But while 4K is cool, I think 21:9 is just a much better experience all around. For the 33% reduction in pixel density, I'll take the additional panoramic view area, especially on a curved surface.

I never in a million years thought I'd say that... but using these curved ultrawides has been an eye-opening experience... There's simply nothing like it.
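That ~33% density figure roughly checks out if the comparison is a 34-inch 3440x1440 ultrawide against a 27-inch 4K panel (the panel sizes are my assumption, not stated above). A quick diagonal-PPI sketch:

```python
from math import hypot

def ppi(w_px: int, h_px: int, diagonal_in: float) -> float:
    """Pixels per inch along the panel diagonal."""
    return hypot(w_px, h_px) / diagonal_in

ultrawide = ppi(3440, 1440, 34)  # ~110 PPI
uhd27 = ppi(3840, 2160, 27)      # ~163 PPI
print(f"{(1 - ultrawide / uhd27) * 100:.0f}% lower density")  # -> 33%
```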
 
Are you upscaling much of your content on your monitor? I only upscale when playing retro games via emulation... Can't think of the last game I played in a non-native resolution.. Any game that doesn't support 1440p I'll either patch or just won't fucking play it.. lol!!

But while 4K is cool, I think 21:9 is just a much better experience all around. For the 33% reduction in pixel density, I'll take the additional panoramic view area, especially on a curved surface.

I never in a million years thought I'd say that... but using these curved ultrawides has been an eye-opening experience... There's simply nothing like it.
Since my monitor is 1080p I don't upscale anything. It's nice for 4K TV owners since there is barely any 4K content altogether. There isn't a console capable of rendering at a full 4K; broadcasters I think still mostly do 720p/1080i, so that leaves select Netflix shows and a small range of Blu-ray movies, or if you have a Pascal Titan or an SLI 1070/1080 PC.

I could most definitely play older games at 4K with a high framerate (DSR 4x or supersampling). I could also probably play some newer games at that res if I dialed some settings down and went with like a 30-40 frames-per-second cap. 60+ fps is out of the question though.

Let's say you play Overwatch or CS:GO though on a 4K monitor and you require 100+ fps at all times. Dropping down to 1080p for that isn't a bad compromise because of the 1:1 pixel mapping. It would look very similar to the 27-inch 1080p displays that have been on the market.

Ultrawide is cool, but I feel like I'm doing too much panning of my head when I've played on one. The support still isn't that great either. CS:GO's scoreboard is still fucked up, and Overwatch I don't think even supports it because of competitive-advantage reasons.
 
Since my monitor is 1080p I don't upscale anything. It's nice for 4K TV owners since there is barely any 4K content altogether. There isn't a console capable of rendering at a full 4K; broadcasters I think still mostly do 720p/1080i, so that leaves select Netflix shows and a small range of Blu-ray movies, or if you have a Pascal Titan or an SLI 1070/1080 PC.

I could most definitely play older games at 4K with a high framerate (DSR 4x or supersampling). I could also probably play some newer games at that res if I dialed some settings down and went with like a 30-40 frames-per-second cap. 60+ fps is out of the question though.

Right, which is yet another reason that I think 1440p is the sweet spot.. You're getting very high pixel density, the very large screen, and you also get the panoramic wrap-around of 21:9 curved.

4K content looks great, and can sometimes even be qualitatively better on an ultrawide if the original anamorphic content is 21:9 (or close), which many, many movies are...

If we're talking about gaming, then a 1080 can push 3440x1440 on high settings no problem; I do it every day with a pretty reasonable overclock.

Let's say you play Overwatch or CS:GO though on a 4K monitor and you require 100+ fps at all times. Dropping down to 1080p for that isn't a bad compromise because of the 1:1 pixel mapping. It would look very similar to the 27-inch 1080p displays that have been on the market.

Well you could do a few things:

1) Play at 1080p with black bars;
2) Render natively at 1440p (16:9), as in the quick sketch below.
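For option 2, a 16:9 image at the panel's full 1440-pixel height comes out to 2560 pixels wide, leaving 440-pixel bars on each side. A small sketch (the function name is mine):

```python
def pillarbox(panel_w: int, panel_h: int, ar_w: int, ar_h: int):
    """Full-height image width at a given aspect ratio, plus the
    black bar left over on each side of the panel."""
    image_w = panel_h * ar_w // ar_h
    return image_w, (panel_w - image_w) // 2

# 16:9 content at full height on a 3440x1440 ultrawide:
print(pillarbox(3440, 1440, 16, 9))  # -> (2560, 440)
```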

But personally I would just play at 3440x1440 natively.. I mean, that seems like it'd be the best option, no?

Seems like a single 1080 can handle this game as it is at 4K 60 FPS, so 1440p should be okay at 100 FPS.



Ultrawide is cool, but I feel like I'm doing too much panning of my head when I've played on one.

I agree 100%

This is what kept me out of ultrawides for so long...

It wasn't until I paired ultra-wide with the aggressive curve that I felt like I found the perfect match.. Couldn't believe how well that worked!

The support still isn't that great either. CS:GO's scoreboard is still fucked up, and Overwatch I don't think even supports it because of competitive-advantage reasons.

Hmm.. I can't speak to that as I don't play Overwatch, but.. in Fallout 4, for example, while official support wasn't great, it took me all of 5 minutes to get a mod for the monitor that works very well.. Yes, there is an occasional glitch here and there, but I'd rather deal with that than not have the expanded field of view.

In fact, the ultrawide curve is something I don't think I could ever give up.. and last year I'd have laughed at the idea of me using a single curved monitor as my daily driver.
 
@TyGuy

What are the competitive advantages of 16:9? Curious, coming from a competitive Halo background, but not CS..

Is this like where guys are playing 4:3 "stretched"? At some point, doesn't it make more sense to have a wider field of view?
 
@TyGuy

What are the competitive advantages of 16:9? Curious, coming from a competitive Halo background, but not CS..

Is this like where guys are playing 4:3 "stretched"? At some point, doesn't it make more sense to have a wider field of view?
I know some games don't do ultrawide because they feel those people would have a competitive advantage from seeing a wider view. Games like StarCraft come to mind. I just know that Overwatch doesn't support ultrawide and CS is sort of half-assed. The actual game works, but the scoreboard is fucked when you hit Tab.

As far as CS pro players go, yeah, some are weird and use these odd resolutions. Then there are others that use full native 1080p. The only thing that matters is maintaining a high framerate. 60 frames per second isn't good enough for competitive shooters because of noticeably more input lag and being below the tick rate of the server. You would only be sending 60 updates per second while everybody else is sending 120.

Beyond that, 100+ frames per second looks much smoother, as you are getting more information.
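Rough arithmetic behind that, using the update rates quoted above (a sketch, nothing measured):

```python
# Frame interval shrinks quickly as framerate climbs...
for fps in (60, 120, 144):
    print(f"{fps:>3} fps -> a new frame every {1000 / fps:.1f} ms")

# ...and at 60 fps against a 120-tick server you only supply fresh
# input on every other tick; the rest reuse stale data.
print(f"updates per tick at 60 fps: {60 / 120:.1f}")  # -> 0.5
```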

For me, 1440p isn't a good buy. You get a larger display, but that can be mitigated by viewing distance. There is slightly greater pixel density in a 27-inch 1440p monitor over a 24-inch 1080p monitor. Throw in that it's easier to get desirable framerates at 1080p, plus support for Nvidia 3D Vision, and it's just a more compelling package for me. There is not a single pro FPS gamer that I'm aware of who plays on anything but a 144Hz 24-inch display, regardless of what resolution they run it at.

Now, 4K at 27 inches is a HUGE upgrade to visuals because of the massive increase in pixel density.
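The same diagonal-PPI arithmetic as earlier, plugged into the panels named here (a quick sketch):

```python
from math import hypot

panels = {"24in 1080p": (1920, 1080, 24),
          "27in 1440p": (2560, 1440, 27),
          "27in 4K":    (3840, 2160, 27)}

for name, (w, h, diag) in panels.items():
    print(f"{name}: {hypot(w, h) / diag:.0f} PPI")
# -> 24in 1080p: 92, 27in 1440p: 109, 27in 4K: 163
```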

I see why a lot of people like ultrawide, particularly for work-related content. For gaming, though, I feel like 16:9 is my sweet spot. I don't have to pan my head to see all the information. My girlfriend's is curved too.
 
