MojoKid writes Dell's Alienware division recently released a radical redesign of its Area-51 gaming desktop. The triangular-shaped machine grabs your attention right away: its 45-degree angled front and rear face plates direct controls and I/O up toward the user, channel cool airflow in, and exhaust warm air up and away from the rear of the chassis. In testing and benchmarks, the Area-51's new design enables top-end performance with thermal and acoustic profiles that are fairly impressive versus most high-end gaming PCs. The chassis design is also clean, modular, and easily serviceable. Base system pricing isn't too bad, starting at $1699, with the ability to dial things way up to an 8-core Haswell-E chip and triple-GPU graphics from NVIDIA or AMD. The test system reviewed at HotHardware was powered by a six-core Core i7-5930K and three GeForce GTX 980 cards in SLI. As expected, it ripped through the benchmarks, though the price as configured and tested is significantly higher.
An anonymous reader writes: AMD recently presented plans to unify its open-source and Catalyst Linux drivers at the open-source XDC2014 conference in France. NVIDIA's rebuttal presentation focused on supporting Mir and Wayland on Linux, the next-generation display stacks competing to succeed the X.Org Server. NVIDIA is partially refactoring its Linux graphics driver to support EGL outside of X11, proposing new EGL extensions for better driver interoperability with Wayland/Mir, and adding support for the KMS APIs to its driver. NVIDIA's binary driver will support the KMS APIs/ioctls but will use its own implementation of kernel mode-setting. The EGL improvements are expected to land in the closed-source driver this autumn, while the other changes probably won't be seen until next year.
MojoKid (1002251) writes A new interview with Assassin's Creed Unity senior producer Vincent Pontbriand has some gamers seeing red and others crying "told you so," after the developer revealed that the game's 900p resolution and 30 fps target on consoles are the result of weak CPU performance rather than GPU limits. "Technically we're CPU-bound," Pontbriand said. "The GPUs are really powerful, obviously the graphics look pretty good, but it's the CPU that has to process the AI, the number of NPCs we have on screen, all these systems running in parallel. We were quickly bottlenecked by that and it was a bit frustrating, because we thought that this was going to be a tenfold improvement over everything AI-wise..." This has been read by many as a rather damning referendum on the capabilities of the AMD APU under the hood of Sony's and Microsoft's new consoles. To some extent, that's justified; the Jaguar CPU inside both the Sony PS4 and Xbox One is a modest chip with a relatively low clock speed. Both consoles may offer eight CPU threads on paper, but games can't access all that headroom. One thread is reserved for the OS, and a few more cores are used for processing the 3D pipeline. Between the two, Ubisoft may have had only 4-5 cores for AI and other calculations — scarcely more than last gen, and the Xbox 360 and PS3 CPUs were clocked much faster than the 1.6 / 1.73GHz frequencies of their replacements.
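The core budget above is simple enough to sketch with back-of-the-envelope arithmetic; the specific reservation counts below are the summary's estimates, not official console figures:

```python
# Back-of-the-envelope thread budget for the current-gen consoles.
total_threads = 8     # Jaguar CPU threads on both PS4 and Xbox One
os_reserved = 1       # one thread reserved for the OS
render_reserved = 2   # assume 2-3 more threads feed the 3D pipeline

game_threads = total_threads - os_reserved - render_reserved
print(game_threads)   # 5 threads left for AI and other game systems

# Aggregate clock left for game code at the Xbox One's quoted 1.73GHz.
print(round(game_threads * 1.73, 2))  # 8.65 GHz in total, spread thinly
```

With render_reserved set to 3 instead, only four threads remain, matching the 4-5 core range quoted above.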
An anonymous reader writes: AMD is moving forward with its plans to develop a new open-source Linux driver model for its Radeon and FirePro graphics processors, albeit slightly differently than what was planned earlier this year. AMD is now developing a new "AMDGPU" kernel driver to power both the open- and closed-source graphics components. This new driver model will also only apply to future generations of AMD GPUs. Catalyst is not being open-sourced, but will become a self-contained user-space blob, while the DRM/libdrm/DDX components will be open-source and shared. This new model is more open-source friendly, places greater emphasis on AMD's mainline kernel driver, and should help Catalyst support Mir and Wayland.
An anonymous reader writes Counter-Strike: Global Offensive has finally been released for Linux, two years after its Windows debut. The game is reported to work even on the open-source Intel Linux graphics drivers, but your mileage may vary. As for the proprietary drivers, NVIDIA continues to dominate Linux gaming over AMD, whose Catalyst driver still suffers from performance deficits and other OpenGL issues.
MojoKid (1002251) writes NVIDIA has launched two new high-end graphics cards based on its latest Maxwell architecture. The GeForce GTX 980 and GTX 970 replace NVIDIA's current high-end offerings, the GeForce GTX 780 Ti, GTX 780, and GTX 770. The two cards are similar in that they share the same 4GB frame buffer and GM204 GPU, but the GTX 970's GPU is clocked a bit lower and features fewer active Streaming Multiprocessors and CUDA cores. The GeForce GTX 980's GM204 GPU has all of its functional blocks enabled, with a base clock of 1126MHz and a boost clock of 1216MHz. The GTX 970 clocks in with a base clock of 1050MHz and a boost clock of 1178MHz. The 4GB of video memory on both cards is clocked at a blisteringly fast 7GHz (effective GDDR5 data rate). NVIDIA also optimized GM204's power efficiency by tweaking virtually every part of the GPU, and claims that Maxwell SMs (Streaming Multiprocessors) offer double the performance of GK104's, and double the performance per watt as well. NVIDIA has also added support for new features, namely Dynamic Super Resolution (DSR), Multi-Frame Sampled Anti-Aliasing (MFAA), and Voxel Global Illumination (VXGI). Performance-wise, the GeForce GTX 980 is the fastest single-GPU graphics card ever tested. The GeForce GTX 970 isn't as dominant overall, but its performance was impressive nonetheless: it typically performed about on par with a GeForce GTX Titan and traded blows with the Radeon R9 290X.
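As a rough check on those memory specs, peak bandwidth follows from the effective data rate times the bus width; the 256-bit bus below is an assumption about GM204 that the summary doesn't state:

```python
# Peak memory bandwidth = effective data rate x bus width (in bytes).
effective_rate_hz = 7e9   # 7GHz effective GDDR5 data rate (from the summary)
bus_width_bits = 256      # assumed GM204 memory bus width

bandwidth_gb_s = effective_rate_hz * bus_width_bits / 8 / 1e9
print(bandwidth_gb_s)     # 224.0 GB/s peak
```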
Vigile (99919) writes AMD looks to continue addressing the mainstream PC enthusiast and gamer with releases in two different component categories. First, today marks the launch of the Radeon R9 285 graphics card, a $250 option based on a brand-new piece of silicon dubbed Tonga. This GPU has nearly identical performance to the R9 280 that came before it, but adds support for XDMA PCIe CrossFire and TrueAudio DSP technology, and is FreeSync capable (AMD's response to NVIDIA's G-Sync). On the CPU side, AMD has refreshed its FX product line with three new models (FX-8370, FX-8370e, and FX-8320e) with lower TDPs and supposedly better efficiency. The problem, of course, is that while Intel is already sampling 14nm parts, these Vishera-based CPUs continue to be manufactured on GlobalFoundries' 32nm process. The result is smaller-than-expected performance boosts and efficiency gains. For a similar review of the new card, see Hot Hardware's page-by-page unpacking.
New submitter nrjperera (2669521) submits news of a new laptop from HP that's in Chromebook (or, a few years ago, "netbook") territory price-wise, but loaded with Windows 8.1 instead. Microsoft has teamed up with HP to make an affordable Windows laptop to beat Google Chromebooks at their own game. German website Mobile Geeks has found some leaked information about this upcoming HP laptop, dubbed Stream 14, including its specifications. According to the leaked data sheet, the HP Stream 14 will have specs similar to HP's cheap Chromebook: an AMD A4 Micro processor, 2GB of RAM, 32GB of flash storage, and a display with 1,366 x 768 resolution. Microsoft will likely bundle 100GB of OneDrive cloud storage with the device to offset the limited local storage.
MojoKid (1002251) writes AMD is launching a new family of products today, but unless you follow the rumor mill closely, it's probably not something you'd expect. It's not a new CPU, APU, or GPU. Today, AMD is launching its first line of solid state drives (SSDs), targeted squarely at AMD enthusiasts. AMD is calling the new family of drives the Radeon R7 Series SSD, after its popular mid-range line of graphics cards. The new Radeon R7 Series SSDs feature OCZ and Toshiba technology with proprietary firmware geared towards write performance and high endurance. Open up one of AMD's new SSDs and you'll see OCZ's Indilinx Barefoot 3 M00 controller on board—the same controller used in the OCZ Vector 150, though it is clocked higher in these drives. That controller is paired with A19nm Toshiba MLC (Multi-Level Cell) NAND flash memory and a DDR3-1333MHz DRAM cache. The 120GB and 240GB drives sport 512MB of cache memory, while the 480GB model is outfitted with 1GB. Interestingly enough, the AMD Radeon R7 Series SSDs are some of the highest-performing SATA SSDs tested to date. IOPS performance is among the best seen in a consumer-class SSD, write throughput and access times are highly competitive across the board, and the drives offered consistent performance regardless of the data type being transferred. Read performance is also strong, though not quite as stand-out as write performance.
Lucas123 writes An AMD website in China has leaked information about the upcoming release of a line of SSDs aimed at gamers and professionals that will offer top sequential read/write speeds of 550MB/s and 530MB/s, respectively. AMD confirmed the news, but pricing isn't available yet. The SSDs will come in 120GB, 240GB, and 480GB capacities and will use Toshiba's 19-nanometer flash process. According to IHS, AMD is likely entering the gaming SSD market because desktop SSD shipments are expected to experience a 39% CAGR between now and 2018.
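Those sequential figures butt up against the SATA interface itself; here's a quick sanity check, using the standard SATA III link rate and 8b/10b encoding overhead (spec facts, not from the article):

```python
# SATA III carries 6 Gb/s on the wire; 8b/10b encoding leaves 8 payload
# bits per 10 transmitted, so usable throughput tops out around 600 MB/s.
link_rate_bps = 6e9
usable_mb_s = link_rate_bps * 8 / 10 / 8 / 1e6

print(usable_mb_s)                  # 600.0 MB/s ceiling
print(round(550 / usable_mb_s, 2))  # 0.92: quoted reads use ~92% of the link
```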
MojoKid (1002251) writes "AMD updated its family of Kaveri-based A-Series APUs for desktop systems recently, namely the A10-7800 and the A6-7400K. The A10-7800 has 12 total compute cores, 4 CPU and 8 GPU cores, with average and maximum turbo clock speeds of 3.5GHz and 3.9GHz, respectively. The A6-7400K arrives with 6 total cores (2 CPU, 4 GPU) and the same clock frequencies. ... The AMD A10-7800 APU's performance is somewhat mixed, though it is a decent performer overall. Its Steamroller-based CPU cores do not do much to make up ground versus Intel's processors, so in the more CPU-bound workloads, Intel's dual-core Core i3-4330 competes favorably with AMD's quad-cores. And in terms of IPC and single-thread performance, Intel maintains a big lead. Factor graphics into the equation, however, and the tide turns completely. The GCN-based graphics engine in Kaveri is a major step up over the previous gen, and much more powerful than Intel's mainstream offerings. The A10-7800's power consumption characteristics are also more desirable than those of the Richland-based A10-6800K."
Dputiger (561114) writes "It has been almost two years since AMD launched the FirePro W9000 and kicked off a heated battle in the workstation GPU wars with NVIDIA. AMD recently released the powerful FirePro W9100, a new card based on the same Hawaii-class GPU as the desktop R9 290X but aimed at the professional workstation market. The W9100's GPU features 2,816 stream processors, and the card boasts 320GB/s of memory bandwidth and six mini-DisplayPorts, all of which support DP1.2 and 4K output. The W9100 also carries more RAM than any other AMD GPU: a whopping 16GB of GDDR5 on a single card. Even NVIDIA's top-end Quadro K6000 tops out at 12GB, which means AMD sits in a class by itself in this area. In terms of performance, this review shows that the FirePro W9100 doesn't always outshine its competition, but its price/performance ratio keeps it firmly in the running. If AMD continues to improve its product mix and overall software support, it should close the gap even more in the pro GPU market in the next 18-24 months."
MojoKid writes: "When NSA whistleblower Edward Snowden came forth last year with U.S. government spying secrets, it didn't take long to realize that some of the information revealed could bring on serious repercussions — not just for the U.S. government, but also for U.S.-based companies. The latest to feel the hit? None other than Apple, and in a region where the company has been working hard to increase market share: China. China, via state media, has today declared that Apple's iPhone is a threat to national security — all because of its thorough tracking capabilities. It has the ability to keep track of user locations, which the country worries could potentially reveal 'state secrets.' It's being noted that the iPhone will continue to track the user to some extent even if the overall feature is disabled. China's iPhone ousting comes hot on the heels of Russia's Industry and Trade Ministry deeming AMD and Intel processors untrustworthy. The nation will instead be building its own ARM-based 'Baikal' processor."
redletterdave (2493036) notes that Taiwan Semiconductor Manufacturing Co. (TSMC) has shipped its first batch of microprocessors to Apple as the iPhone maker looks to diversify its overseas suppliers. Apple will continue to rely on Samsung for a portion of its microprocessors, but as the rivalry between Apple and Samsung heats up in the mobile and soon wearable arenas, the deal allows Apple to be less reliant on Samsung and therefore have more leverage in price negotiations for future chips; TSMC has supplanted Samsung Electronics as Apple's chief chipmaker for iPhones and iPads. Since 2011, Apple has been striking deals with other display and chip makers around Asia to reduce its dependence on Samsung. That lost business is starting to hurt: Samsung on Monday announced operating income for its fiscal second quarter had sunk to a two-year low, blaming 'weak' sales of low- and medium-end smartphones, strong competition, and subpar demand.
It may not be a household name like Intel or AMD, but TSMC is the world's biggest chip maker by revenue.
An anonymous reader writes with this news from Tass: Russia's Industry and Trade Ministry plans to replace U.S. microchips (Intel and AMD) used in government computers with domestically produced Baikal microprocessors, in a project worth tens of millions of dollars, business daily Kommersant reported Thursday. The article is fairly thin, but does add a bit more detail: "The Baikal microprocessor will be designed by a unit of T-Platforms, a producer of supercomputers, next year, with support from state defense conglomerate Rostec and co-financing by state-run technological giant Rosnano. The first products will be Baikal M and M/S chips, designed on the basis of the 64-bit Cortex-A57 core from UK company ARM, with a frequency of 2 gigahertz, for personal computers and micro servers."
An anonymous reader writes 4K monitor prices have fallen into the range where mainstream consumers are starting to consider them for work and for play. There are enough models that we can compare and contrast, and figure out which are the best of the ones available. But this report at The Wirecutter makes the case that absent a pressing need for 8.29 million pixels, you should just wait before buying one. They say, "The current version of the HDMI specification (1.4a) can only output a 4096×2160 resolution at a refresh rate of 24 Hz or 3840×2160 at 30 Hz—the latter, half that of what we're used to on TVs and monitors. Connect up a 4K monitor at 30 Hz via HDMI and you'll see choppier animations and transitions in your OS. You might also encounter some visible motion stuttering during normal use, and you'll be locked to a maximum of 30 frames per second for your games—it's playable, but not that smooth. ... Most people don't own a system that's good enough for gaming on a 4K display—at least, not at highest-quality settings. You'll be better off if you just plan to surf the Web in 4K: Nvidia cards starting in the 600 series and AMD Radeon HD 6000 and 7000-series GPUs can handle 4K, as can systems built with integrated Intel HD 4000 graphics or AMD Trinity APUs. ... There's a light on the horizon. OS support will strengthen, connection types will be able to handle 4K displays sans digital tricks, and prices will drop as more 4K displays hit the market. By then, there will even be more digital content to play on a 4K display (if gaming or multitasking isn't your thing), and 4K monitors will even start to pull in fancier display technology like Nvidia's G-Sync for even smoother digital shootouts."
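The 30 Hz cap the article describes follows from simple bandwidth arithmetic; the usable HDMI 1.4 video rate below comes from the spec's 340 MHz TMDS clock and 8b/10b encoding, figures not stated in the article:

```python
# HDMI 1.4: 340 MHz TMDS clock x 3 channels x 10 bits/clock, with 8b/10b
# encoding leaving 8 payload bits of every 10 -> ~8.16 Gb/s of video data.
hdmi14_bps = 340e6 * 3 * 10 * 8 / 10

def pixel_rate_bps(width, height, hz, bpp=24):
    """Raw pixel data rate, ignoring blanking intervals."""
    return width * height * bpp * hz

print(pixel_rate_bps(3840, 2160, 30) / 1e9)  # ~5.97 Gb/s: fits under 8.16
print(pixel_rate_bps(3840, 2160, 60) / 1e9)  # ~11.94 Gb/s: exceeds 8.16
```

Real video timings add blanking overhead on top of the raw pixel rate, which is why 4K at 60 Hz is far out of reach for HDMI 1.4 while 30 Hz fits.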
An anonymous reader writes "Phoronix last week tested 65 graphics cards on open-source drivers under Linux, and the best results generally came from the open-source AMD Radeon drivers. This week they put out a 35-graphics-card comparison using the proprietary AMD/NVIDIA drivers under Ubuntu 14.04 (the other 30 cards being too old for the latest mainline drivers). The winner for proprietary GPU driver support on Linux was NVIDIA, which shouldn't come as much of a surprise given that Valve and other Linux game developers frequently recommend NVIDIA graphics for their titles, while AMD Catalyst support usually doesn't come to games until later. The Radeon OpenGL performance with Catalyst had some problems, but at least its performance per watt was respectable. Open-source fans are encouraged to use AMD hardware on Linux, while those just wanting the best performance and overall experience should go with NVIDIA and its binary driver."
Vigile writes: NVIDIA announced its latest dual-GPU flagship card, the GeForce GTX Titan Z, at the GPU Technology Conference in late March with a staggering price point of $2999. Since that time, AMD announced and released the Radeon R9 295X2, its own dual-GPU card, with a price tag of $1499. PC Perspective finally put the GTX Titan Z to the test and found that from a PC gamer's view, the card is way overpriced for the performance it offers. At both 2560x1440 and 3840x2160 (4K), the R9 295X2 offered higher and more consistent frame rates, sometimes by as much as 30%. The AMD card also takes up only two slots (though it does have a water-cooling radiator to contend with), while the NVIDIA GTX Titan Z is a three-slot design. The Titan Z is quieter and uses much less power, but gamers considering a $1500 or $3000 graphics card are likely not overly concerned with power efficiency.
An anonymous reader writes "How good are open-source graphics drivers in 2014, given all the attention Linux gaming and desktops are receiving? Phoronix has tested 65 different GPUs using the latest open-source drivers, covering Intel HD Graphics, NVIDIA GeForce, AMD Radeon, and AMD FirePro hardware. Of the 65 GPUs tested, only 50 had good enough open-source driver support for running OpenGL games and benchmarks. Across the NVIDIA and AMD hardware were several pages of caveats, with different driver issues encountered on Linux 3.15 and Mesa 10.3 loaded on Ubuntu 14.04. Intel graphics on Linux were reliable but slow, while AMD's open-source support was recommended over NVIDIA's, since the open-source NVIDIA driver doesn't currently allow for suitable graphics card re-clocking. Similar tests are now being done with the proprietary Linux drivers."