
Graphics

AMD FirePro W9100 16GB Workstation GPU Put To the Test 42

Posted by Unknown Lamer
from the more-power dept.
Dputiger (561114) writes "It has been almost two years since AMD launched the FirePro W9000 and kicked off a heated battle with NVIDIA in the workstation GPU wars. AMD has now released the powerful FirePro W9100, a new card based on the same Hawaii-class GPU as the desktop R9 290X but aimed at the professional workstation market. The W9100's GPU features 2,816 stream processors, and the card boasts 320GB/s of memory bandwidth and six mini-DisplayPorts, all of which support DP1.2 and 4K output. The W9100 also carries more RAM than any other AMD GPU: a whopping 16GB of GDDR5 on a single card. Even NVIDIA's top-end Quadro K6000 tops out at 12GB, which puts AMD in a class by itself in this area. In terms of performance, this review shows that the FirePro W9100 doesn't always outshine its competition, but its price/performance ratio keeps it firmly in the running. If AMD continues to improve its product mix and overall software support, it should close the gap even more in the pro GPU market over the next 18-24 months."
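As a quick sanity check on the headline bandwidth number, here is a minimal Python sketch; the 512-bit bus and 5 Gbps GDDR5 figures are assumed from the desktop Hawaii cards, not stated in the story:

    # Assumed figures (from the desktop R9 290X, not the article): a 512-bit
    # memory bus and 5 Gbps effective GDDR5 work out to the quoted 320GB/s.
    bus_width_bits = 512           # assumed memory bus width
    effective_rate_gbps = 5.0      # assumed GDDR5 effective data rate per pin
    bandwidth_gbs = bus_width_bits / 8 * effective_rate_gbps
    print(f"{bandwidth_gbs:.0f} GB/s")  # -> 320 GB/s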
China

Chinese State Media Declares iPhone a Threat To National Security 143

Posted by Soulskill
from the fruit-ninja-must-have-caused-a-lot-of-traffic-deaths dept.
MojoKid writes: "When NSA whistleblower Edward Snowden came forth last year with U.S. government spying secrets, it didn't take long to realize that some of the information revealed could bring serious repercussions — not just for the U.S. government, but also for U.S.-based companies. The latest to feel the hit? None other than Apple, and in a region where the company has been working hard to increase its market share: China. Chinese state media has now declared that Apple's iPhone is a threat to national security — all because of its thorough tracking capabilities. The phone can keep track of user locations, which the state media argues could potentially reveal 'state secrets.' It's also noted that the iPhone continues to track the user to some extent even if the feature is disabled. China's iPhone ousting comes hot on the heels of Russia's Industry and Trade Ministry deeming AMD and Intel processors untrustworthy; that nation will instead be building its own ARM-based 'Baikal' processor."
Businesses

Apple Gets Its First Batch of iPhone Chips From TSMC 45

Posted by timothy
from the pronounced-just-like-it-looks dept.
redletterdave (2493036) notes that Taiwan Semiconductor Manufacturing Co. (TSMC) has shipped its first batch of microprocessors to Apple as the iPhone maker looks to diversify its overseas suppliers. Apple will continue to rely on Samsung for some of its microprocessors, but as the rivalry between the two companies heats up in the mobile (and soon wearable) arenas, the deal with TSMC makes Apple less reliant on Samsung and gives it more leverage in price negotiations for future chips; TSMC has now supplanted Samsung Electronics as Apple's chief chipmaker for iPhones and iPads. Since 2011, Apple has been striking deals with other display and chip makers around Asia to reduce its dependence on Samsung. Partly as a result of that lost business, Samsung on Monday announced operating income for its fiscal second quarter had sunk to a two-year low, blaming 'weak' sales of low- and medium-end smartphones, strong competition, and subpar demand.
It may not be a household name like Intel or AMD, but TSMC is the world's biggest contract chip maker by revenue.
AMD

Russia Wants To Replace US Computer Chips With Local Processors 340

Posted by timothy
from the domestic-production dept.
An anonymous reader writes with this news from Tass: Russia's Industry and Trade Ministry plans to replace U.S. microchips (Intel and AMD) used in government computers with domestically produced Baikal processors, in a project worth tens of millions of dollars, business daily Kommersant reported Thursday. The article is fairly thin, but does add a bit more detail: "The Baikal microprocessor will be designed by a unit of T-Platforms, a producer of supercomputers, next year, with support from state defense conglomerate Rostec and co-financing by state-run technological giant Rosnano. The first products will be the Baikal M and M/S chips, based on the 64-bit Cortex-A57 core from UK company ARM, running at 2 gigahertz, for personal computers and microservers."
Displays

4K Monitors: Not Now, But Soon 186

Posted by Soulskill
from the wait-for-16K dept.
An anonymous reader writes: 4K monitor prices have fallen into the range where mainstream consumers are starting to consider them for work and for play. There are enough models that we can compare and contrast, and figure out which are the best of the ones available. But this report at The Wirecutter makes the case that absent a pressing need for 8.29 million pixels, you should just wait before buying one. They say, "The current version of the HDMI specification (1.4a) can only output a 4096×2160 resolution at a refresh rate of 24 Hz or 3840×2160 at 30 Hz—the latter, half that of what we're used to on TVs and monitors. Connect up a 4K monitor at 30 Hz via HDMI and you'll see choppier animations and transitions in your OS. You might also encounter some visible motion stuttering during normal use, and you'll be locked to a maximum of 30 frames per second for your games—it's playable, but not that smooth. ... Most people don't own a system that's good enough for gaming on a 4K display—at least, not at highest-quality settings. You'll be better off if you just plan to surf the Web in 4K: Nvidia cards starting in the 600 series and AMD Radeon HD 6000 and 7000-series GPUs can handle 4K, as can systems built with integrated Intel HD 4000 graphics or AMD Trinity APUs. ... There's a light on the horizon. OS support will strengthen, connection types will be able to handle 4K displays sans digital tricks, and prices will drop as more 4K displays hit the market. By then, there will even be more digital content to play on a 4K display (if gaming or multitasking isn't your thing), and 4K monitors will even start to pull in fancier display technology like Nvidia's G-Sync for even smoother digital shootouts."
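A rough back-of-the-envelope check on that 30 Hz limit, as a minimal Python sketch; the ~340 MHz HDMI 1.4 pixel-clock ceiling and ~10% blanking overhead are approximations assumed here, not figures from the article:

    # Approximate, assumed numbers: HDMI 1.4's TMDS clock tops out around
    # 340 MHz, and reduced-blanking timings add roughly 10% overhead.
    HDMI14_MAX_PIXEL_CLOCK_MHZ = 340
    BLANKING_OVERHEAD = 1.10

    def required_pixel_clock_mhz(width: int, height: int, refresh_hz: int) -> float:
        """Pixel clock needed to drive width x height at refresh_hz."""
        return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

    for hz in (30, 60):
        clk = required_pixel_clock_mhz(3840, 2160, hz)
        verdict = "fits" if clk <= HDMI14_MAX_PIXEL_CLOCK_MHZ else "exceeds HDMI 1.4"
        print(f"3840x2160 @ {hz} Hz needs ~{clk:.0f} MHz ({verdict})")

Under these assumptions, 4K at 60 Hz needs roughly double the pixel clock that HDMI 1.4a can carry, which is why DisplayPort (or a newer HDMI revision) is needed for full-rate 4K.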
AMD

NVIDIA Is Better For Closed-Source Linux GPU Drivers, AMD Wins For Open-Source 185

Posted by Soulskill
from the best-of-different-worlds dept.
An anonymous reader writes "Phoronix last week tested 65 graphics cards on open source drivers under Linux and the best result was generally with the open source AMD Radeon drivers. This week they put out a 35-graphics-card comparison using the proprietary AMD/NVIDIA drivers (with the other 30 cards being too old for the latest main drivers) under Ubuntu 14.04. The winner for proprietary GPU driver support on Linux was NVIDIA, which shouldn't come as much of a surprise given that Valve and other Linux game developers are frequently recommending NVIDIA graphics for their game titles while AMD Catalyst support doesn't usually come to games until later. The Radeon OpenGL performance with Catalyst had some problems, but at least its performance per Watt was respectable. Open-source fans are encouraged to use AMD hardware on Linux while those just wanting the best performance and overall experience should see NVIDIA with their binary driver."
AMD

$3000 GeForce GTX TITAN Z Tested, Less Performance Than $1500 R9 295X2 151

Posted by Soulskill
from the spending-a-lot-of-green-for-team-green dept.
Vigile writes: NVIDIA announced its latest dual-GPU flagship card, the GeForce GTX Titan Z, at the GPU Technology Conference in late March with a staggering price point of $2,999. Since that time, AMD announced and released the Radeon R9 295X2, its own dual-GPU card, with a price tag of $1,499. PC Perspective finally put the GTX Titan Z to the test and found that, from a PC gamer's point of view, the card is way overpriced for the performance it offers. At both 2560x1440 and 3840x2160 (4K), the R9 295X2 offered higher and more consistent frame rates, sometimes by as much as 30%. The AMD card also takes up only two slots (though it does have a water cooling radiator to worry about), while the NVIDIA GTX Titan Z is a three-slot design. The Titan Z is quieter and uses much less power, but gamers considering a $1,500 or $3,000 graphics card are likely not overly concerned with power efficiency.
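To put the price/performance claim in perspective, a minimal Python sketch using only the figures quoted above (the 30% advantage is the review's best case, not a universal result):

    # Illustrative only: the R9 295X2 costs half as much and, in these tests,
    # delivers up to ~30% higher frame rates than the GTX Titan Z.
    titan_z_price, r9_price = 2999, 1499
    titan_z_fps, r9_fps = 1.00, 1.30   # relative frame rates, best case for AMD

    ratio = (r9_fps / r9_price) / (titan_z_fps / titan_z_price)
    print(f"R9 295X2 performance per dollar: ~{ratio:.1f}x the Titan Z")  # -> ~2.6x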
Graphics

Testing 65 Different GPUs On Linux With Open Source Drivers 134

Posted by Soulskill
from the line-'em-up-and-knock-'em-down dept.
An anonymous reader writes "How good are open source graphics drivers in 2014 given all the Linux gaming and desktop attention? Phoronix has tested 65 different GPUs using the latest open source drivers covering Intel HD Graphics, NVIDIA GeForce, AMD Radeon, and AMD FirePro hardware. Of the 65 GPUs tested, only 50 of them had good enough open source driver support for running OpenGL games and benchmarks. Across the NVIDIA and AMD hardware were several pages of caveats with different driver issues encountered on Linux 3.15 and Mesa 10.3 loaded on Ubuntu 14.04. Intel graphics on Linux were reliable but slow while AMD's open-source Linux support was recommended over the NVIDIA support that doesn't currently allow for suitable graphics card re-clocking. Similar tests are now being done with the proprietary Linux drivers."
AMD

AMD, NVIDIA, and Developers Weigh In On GameWorks Controversy 80

Posted by Soulskill
from the there-can-be-only-one-(or-more) dept.
Dputiger writes: "Since NVIDIA debuted its GameWorks libraries, there have been allegations that they unfairly disadvantage AMD users or prevent developers from optimizing code. We've taken these questions to developers themselves and asked them to weigh in on how games get optimized, why NVIDIA built this program, and whether it's an attempt to harm AMD customers. 'The first thing to understand about [developer/GPU manufacturer] relations is that the process of game optimization is nuanced and complex. The reason AMD and NVIDIA are taking different positions on this topic isn't because one of them is lying; it's because AMD genuinely tends to focus more on helping developers optimize their own engines, while NVIDIA puts more effort into performing tasks in-driver. This is a difference of degree — AMD absolutely can perform its own driver-side optimization, and NVIDIA's Tony Tamasi acknowledged on the phone that there are some bugs that can only be fixed by looking at the source. ... Some of this difference in approach is cultural, but much of it is driven by necessity. In 2012 (the last year before AMD's graphics revenue was rolled into the console business), AMD made about $1.4 billion off the Radeon division. For the same period, NVIDIA made more than $4.2 billion. Some of that was Tegra-related, and it's a testament to AMD's hardware engineering that it competes effectively with NVIDIA on a much smaller revenue base, but it also means that Team Green has far more money to spend on optimizing every aspect of the driver stack.'"
AMD

AMD and NVIDIA Trade Allegations, Denials Over Shady Tactics 69

Posted by Soulskill
from the can't-we-all-just-get-along dept.
crookedvulture writes "In an article published by Forbes earlier this week, AMD lashed out at NVIDIA's GameWorks program, which includes Watch Dogs and other popular titles such as Call of Duty: Ghosts, Assassin's Creed IV, and Batman: Arkham Origins. AMD's technical communications lead for PC graphics, Robert Hallock, alleged that GameWorks deliberately cripples performance on AMD hardware. He also claimed that developers are prevented from working with AMD on game optimizations. The Forbes piece was fairly incriminating, but it didn't include any commentary from the other side of the fence. NVIDIA has now responded to the allegations, and as one might expect, it denies them outright. Its director of engineering for developer technology, Cem Cebenoyan, says NVIDIA has never barred developers from working with AMD. In fact, he claims that AMD's own developer relations efforts have prevented NVIDIA from getting its hands on early builds of some games. AMD has said in the past that it makes no effort to prevent developers from working with NVIDIA. So we have another round of he-said, she-said, with gamers caught in the middle and performance in newer titles hanging in the balance."
Graphics

Mesa 10.2 Will Feature Better Adreno Driver, OpenMAX, Cherryview Support 21

Posted by Unknown Lamer
from the windows-now-10%-wobblier dept.
Via Phoronix comes news that Mesa 10.2 will be released in a few days with several interesting new features. Highlights include OpenGL 2.1 support for Freedreno (the driver for the Qualcomm graphics chips), video encoding and decoding on GCN Radeons using the new OpenMAX state tracker, and initial support for Intel's upcoming Cherryview Atom SoC. Progress is being made toward OpenGL 4 support, and the llvmpipe software rasterizer finally supports OpenGL 3.2. The release won't feature a few things: the Intel Sandybridge driver still does not support OpenGL 3.3, the R9 290 Radeons are still not working (despite claims by AMD a couple of years ago that cards starting with the Radeon 8000 series would be supported by the Free Software driver at hardware release time), and OpenCL support is still experimental.
Graphics

Haiku Gains Support For Current Radeon HD Cards 70

Posted by timothy
from the old-and-new-together dept.
As reported by Phoronix, the Haiku operating system "has added (untested) support for the newest AMD Radeon graphics cards to its open-source driver for the BeOS-compatible operating system." (Specifically, that support is for the "Mullins" and "Hawaii" graphics processors.) Impressive that this project keeps the BeOS flag raised and continues to modernize; Haiku has been around since 2001 — years longer than Be, Inc. itself lasted.
AMD

AMD Preparing To Give Intel a Run For Its Money 345

Posted by Soulskill
from the saddle-up dept.
jfruh writes: "AMD has never been able to match Intel for profits or scale, but a decade ago it was in front on innovation — the first to 1GHz, the first to 64-bit, the first to dual core. A lack of capital has kept the company barely holding on with cheap mid-range chips since then, but now AMD is flush with cash from its profitable gaming console business and is preparing an ambitious new architecture for 2016, one that's distinct from the x86/ARM hybrid already announced."
Displays

Standards Group Adds Adaptive-Sync To DisplayPort 82

Posted by Unknown Lamer
from the variable-framerate-considered-alright dept.
MojoKid (1002251) writes "Over the past nine months, we've seen the beginnings of a revolution in how video games are displayed. First, Nvidia demoed G-Sync, its proprietary technology for ensuring smooth frame delivery. Then AMD demoed its own free standard, dubbed FreeSync, that showed a similar technology. Now, VESA (Video Electronics Standard Association) has announced support for "Adaptive Sync," as an addition to DisplayPort. The new capability will debut with DisplayPort 1.2a. The goal of these technologies is to synchronize output from the GPU and the display to ensure smooth output. When this doesn't happen, the display will either stutter due to a mismatch of frames (if V-Sync is enabled) or may visibly tear if V-Sync is disabled. Adaptive Sync is the capability that will allow a DisplayPort 1.2a-compatible monitor and video card to perform FreeSync without needing the expensive ASIC that characterizes G-Sync. You'll still need a DP1.2a cable, monitor, and video card (DP1.2a monitors are expected to ship year end). Unlike G-Sync, a DP1.2a monitor shouldn't cost any additional money, however. The updated ASICs being developed by various vendors will bake the capability in by default."
Graphics

The Truth About OpenGL Driver Quality 158

Posted by Unknown Lamer
from the when-standards-aren't dept.
rcht148 (2872453) writes "Rich Geldreich (a game/graphics programmer) has written a blog post on the quality of different OpenGL drivers. Using anonymous titles (Vendor A: NVIDIA; Vendor B: AMD; Vendor C: Intel), he surveys the landscape of game development using OpenGL. Vendor A, jovially known as the 'Graphics Mafia,' concentrates heavily on performance but won't share its specifications, thus blocking open source driver implementations as much as possible. Vendor B has the flakiest drivers: it has good technical know-how on OpenGL, but due to an extremely small team (money woes), its drivers are shoddy. Vendor C is extremely rich but hadn't taken graphics seriously until a few years ago. It supports open source specifications and drivers wholeheartedly, but it will be a few years before its drivers are on par with the rest of the market. He concludes that using OpenGL is extremely difficult, and that without the blessings of these vendors it's nearly impossible to ship a major gaming title."
AMD

AMD Designing All-New CPU Cores For ARMv8, X86 181

Posted by samzenpus
from the brand-new dept.
crookedvulture (1866146) writes "AMD just revealed that it has two all-new CPU cores in the works. One will be compatible with the 64-bit ARMv8 instruction set, while the other is meant as an x86 replacement for the Bulldozer architecture and its descendants. Both cores have been designed from the ground up by a team led by Jim Keller, the lead architect behind AMD's K8 architecture. Keller worked at Apple on the A4 and A5 before returning to AMD in 2012. The first chips based on the new AMD cores are due in 2016."
AMD

Mini Gaming PCs — Promising, But Not Ready 83

Posted by Soulskill
from the call-me-when-it-fits-inside-a-chromecast dept.
An anonymous reader writes "Ars has reviewed an AMD-powered mini gaming rig made by Gigabyte. The box itself is small and solid, and it runs a pretty beefy video card for its size. The manufacturer even claims Linux support, though the device ships with Windows 8.1. Unfortunately, reality lags a bit behind their plans — Ubuntu boots OK, but driver support is a mess. SteamOS won't run at all. The box is also limited by a mediocre CPU, which is itself limited by heat and power constraints. The review says the machine was 'intriguing and frustrating in equal measure' because 'its ambition is rarely matched by its execution.' It concludes: 'With some time and some different components, a little desktop that can deliver a great gaming experience will surely follow.'"
AMD

AMD Beema and Mullins Low Power 2014 APUs Tested, Faster Than Bay Trail 66

Posted by timothy
from the make-'em-fight dept.
MojoKid (1002251) writes "AMD has just announced their upcoming mainstream, low-power APUs (Accelerated Processing Units), codenames Beema and Mullins. These APUs are the successors to last year's Temash and Kabini APUs, which powered an array of small form factor and mobile platforms. Beema and Mullins are based on the same piece of silicon, but will target different market segments. Beema is the mainstream part that will find its way into affordable notebook, small form factor systems, and mobile devices. Mullins, however, is a much lower-power derivative, designed for tablets and convertible systems. They are full SoCs with on-die memory controllers, PCI Express, SATA, and USB connectivity, and a host of other IO blocks. AMD is announcing four Beema-based mainstream APUs today, with TDPs ranging from 10W – 15W. There are three Mullins-based products being announced, two quad-cores and a dual-core. The top of the line-up is the A10 Micro-6700T. It's a quad-core chip, with a max clock speed of 2.2GHz, 2MB of L2, and a TDP of only 4.5W. In the benchmarks, the A10-6700T quad core is actually able to surpass Intel's Bay Trail Atom platform pretty easily across a number of tests, especially gaming and graphics."
AMD

AMD Not Trying To Get Its Chips Into Low-Cost Tablets 87

Posted by samzenpus
from the not-interested dept.
jfruh (300774) writes "While Intel is going after low-end Android tablets in a big way, chipmaking x86 rival AMD is taking a more judicious approach, looking to focus on the high end. 'This idea of contra revenue is foreign to us,' said AMD's CEO, referring to Intel's strategy of selling chips at a loss to boost market share. But will Intel's vast resources keep AMD confined to its niche?"
AMD

AMD Unveils the Liquid-Cooled, Dual-GPU Radeon R9 295X2 At $1,500 146

Posted by timothy
from the for-$1500-it-should-unveil-itself dept.
wesbascas (2475022) writes "This morning, AMD unveiled its latest flagship graphics board: the $1,500, liquid-cooled, dual-GPU Radeon R9 295X2. With a pair of the Hawaii GPUs that power the company's top-end single-GPU Radeon R9 290X, the new board is sure to make waves at price points that NVIDIA currently dominates. In gaming benchmarks, the R9 295X2 performs pretty much in line with a pair of R9 290X cards in CrossFire. However, the R9 295X2 uses specially-binned GPUs that enable the card to run with less power than a duo of the single-GPU cards. Plus, thanks to the closed-loop liquid cooler, the R9 295X2 doesn't succumb to the nasty throttling issues present on the R9 290X, nor to its noisy cooling solution."
