SweetFX Settings DB
Make SweetFX LumaSharpen run on desktop?
Posted 8 years, 1 month ago
Unlike my previous BenQ monitor, which had sharpness settings as an option, my new monitor lacks them. The only sharpness options it has are for when you're using an analog (VGA) cable. I bought a G2260VWQ6 monitor recently, which is made by AOC. It's great in many ways including cost, but I remember how much better having sharpness turned up made the BenQ look (which was also a lower resolution monitor, but appeared to have superior clarity due to sharpness). When, after a couple of months of owning this monitor, I tried out SweetFX and realized I could turn up sharpness with the LumaSharpen setting, I immediately wanted this effect to run at all times when using my PC. Is it at all possible to make this happen? Would it require a separately written program, if it is possible to do? I'm not sure if such a program exists. The only advice I can find out there is stuff about ClearType, and other things like Contrast & Brightness that I've long since configured optimally and that don't help solve this sharpness issue.

I tried out SMAA and Vibrance, and there might be times I'd want to play around with them along with other options (SMAA + Bioshock 1 & 2 is a situation where it's nice to have it as an option, since the old releases had no AA in the video options), but for now I simply install this for the LumaSharpen, and set #define sharp_strength 1.00 (still playing around to find what seems to be the optimal setting), with #define sharp_clamp set to default. I even got it to work on the emulator higan and found that SNES games looked a LOT clearer, less blurry, much more satisfying. What I found with the emulator, though, is that you can't turn it off and on; the emulator needs to be restarted, and you have to add or remove the files in the folder to change the setting. I proved it was working though with screenshots.

It might be that the desktop is already sharpened, but I'm not sure about that. Screenshots I take of games seem to come out looking about right, so I don't know. Games at least seem to run sharpened in windowed mode, but that might only mean that region of the screen is being sharpened. Essentially I find the screen to look blurry, especially in games. With LumaSharpen, and on my last monitor with sharpness turned up, many details in textures show up that are otherwise diffused behind a layer of goo across the entire screen. Colors even seem to be richer, the whole thing is a lot easier to look at for long periods, and simple objects pop out, making the image actually look somewhat 3D instead of blending things together in a hard-to-differentiate manner. Switching SweetFX off and on, you can see simple objects all over the place which suddenly look important rather than completely irrelevant. Any game world I've tried this program with so far seems much more compelling visually with sharpness set to a higher level than whatever this monitor defaults to.

I sent the manufacturer an email about my dissatisfaction with the fact that they didn't include sharpness as an option in their monitor (it's a gaming monitor with FreeSync) when other companies seem to be able to (I'm not actually sure whether their entire line of monitors lacks sharpness as an option for digital cables [DVI, HDMI & DisplayPort]). After a pretty long message to their tech support, they told me they would send my message along to a higher-up in the company and I might get a message back if they decide to do something.
My main points were that I wanted them to find a way to either update the firmware so that I could install that and have sharpness as an option, or at least release all new models and revisions with the option if they have to make a hardware change to resolve this issue. My last BenQ was actually an old monitor I bought for a similar price around 2008, and it had sharpness as an option. I stopped using it when I got a newer monitor of a higher resolution, but went back to it when that newer monitor died. I realized when I went back to the BenQ that it had a sharpness option, tried it, and realized it looked better than the monitor that died. When the BenQ died recently I tried another BenQ which also had sharpening; it was a VA panel and it had issues that I was not willing to put up with, but the sharpness was nice. The other issues it had resulted in me returning it within the first 14 days of purchase. Then I got the AOC and I was back in muddy image land, until I found LumaSharpen. Anyway, if I could somehow get LumaSharpen to run on the desktop that would be wonderful, but I'm not expecting miracles, just hopeful. If not I'll just have to live with it until I can afford another monitor.
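For reference, the LumaSharpen block I'm talking about lives in SweetFX_settings.txt and looks roughly like this (my values from above; the sharp_clamp default is going from memory of the 1.5-era file, so treat the exact numbers as approximate):

    #define USE_LUMASHARPEN 1     // [0 or 1] enables the LumaSharpen effect

    // LumaSharpen settings
    #define sharp_strength 1.00   // [0.10 to 3.00] strength of the sharpening
    #define sharp_clamp    0.035  // [0.000 to 1.000] limits how much any one pixel gets sharpened (left at default)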
Posted 8 years, 1 month ago
My HDMI monitor has a sharpness option, like every other decent monitor out there. Not sure why you'd want to make your whole PC sharper, though. Can't you just tweak your wallpaper or whatever in a photo editing program, change out fonts you don't like, etc.? Also, you could probably get a somewhat similar effect by messing around with the display settings in your GPU control panel. Use this to configure everything: http://www.lagom.nl/lcd-test/ If you're hellbent on using SweetFX/ReShade effects on your whole PC, you're out of luck. The injectors have to hook into a 3D rendering API, which currently means OpenGL and DirectX 8-11. ClearType is rubbish. It makes it look like a rainbow threw up on my text. Just use normal AA.
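To illustrate why (this is just a sketch of the general proxy-DLL idea, not SweetFX's actual source): the injector is built as a stand-in d3d9.dll that a game loads from its own folder instead of the system copy, and nothing on the desktop ever loads such a DLL.

    // Simplified sketch, assuming the classic proxy-DLL approach. Compiled as
    // d3d9.dll and dropped next to a game's .exe, it forwards Direct3DCreate9
    // to the genuine system DLL. A real injector would wrap the returned
    // interface so it can run its shaders inside Present(); the Windows
    // desktop never loads this DLL, which is why effects can't apply there.
    #include <windows.h>
    #include <d3d9.h>

    typedef IDirect3D9* (WINAPI *Create9Fn)(UINT);

    extern "C" IDirect3D9* WINAPI Direct3DCreate9(UINT sdkVersion) {
        char path[MAX_PATH];
        GetSystemDirectoryA(path, MAX_PATH);      // locate the real d3d9.dll
        strcat_s(path, "\\d3d9.dll");
        HMODULE real = LoadLibraryA(path);
        if (!real) return nullptr;
        auto create = (Create9Fn)GetProcAddress(real, "Direct3DCreate9");
        return create ? create(sdkVersion) : nullptr; // wrapping would go here
    }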
Posted 8 years, 1 month ago
My wallpaper is just a big black screen, with icons for things I use all over it. I want to be able to watch videos, look at images, read text and everything with optimal sharpness like I used to; I liked the look of sharpness being applied everywhere on my last monitor. Unless the desktop is already sharpened but games are not, I'm not sure how it works exactly. As far as video driver options, I'm on an AMD card and as far as I know their drivers don't have such an option. I'm not sure if NVIDIA does either. I did see something that seemed hopeful for NVIDIA cards when I was searching around for an answer to that very question, but I never got a solid answer as to whether it did what I wanted. I'm not likely to get an NVIDIA card any time soon, but it would be nice to know if they have the capability of turning up sharpness and AMD does not (I've never seen such an option for sharpness in the AMD settings, so I assume there isn't one). Like I said, I've already got options like ClearType all sorted out; what I really need is a way to apply sharpness globally. It's great that I got it running for most games though (an exception I've found is old games like Diablo II). If it can't run sharpening globally in its current state, it would be nice if someone found a way to get it working, even if it were a separate program; I'd be using it all the time. I'd also try comparing how it looks to hardware sharpening on a monitor that supports it; it may be that a mixture of the two, or one or the other, looks better. In case it helps, I'm running Windows 7 64-bit. Thank you to those who created this program btw, just having sharpening back in my games makes such a HUGE difference to my enjoyment of them.
Posted 8 years, 1 month ago
For what you're wanting, you'd pretty much have to find a program that re-renders your entire desktop in OpenGL or Direct3D. I've never heard of 'AOC' before, so I googled your monitor. It seems to be a somewhat budget monitor, with the only selling points being AMD's FreeSync and 75Hz over DisplayPort.
Posted 8 years, 1 month ago
Like I said, my similarly priced BenQ monitor from 2008 had sharpness as an option, as did an even cheaper VA BenQ monitor I tried out that was released in the last year, I think, but had to return due to issues with VA technology, at least at that price point. I'm not sure why something so simple is not included on every monitor released by any company, but from searching on Google it seems quite common. To be honest there's not a whole lot of information as to why it's not an option, other than people claiming the sharpness is already perfect at the one setting and that changing the sharpness would only make the image worse, which I know is wrong from personal experience, but that was their claim. I've found a few people mentioning specifically BenQ monitors and their sharpness option as something to consider raising if you want additional detail to be visible in game textures, but again the search results seemed few in number.

What I also found when I turned sharpness up is that objects go from looking like they blend in with everything else, to everything becoming separate and somewhat 3D looking even when displayed on a 2D plane. I enjoy that appearance far more, and at least I'm able to emulate that effect in games that SweetFX supports. I'm interested in comparing LumaSharpen and monitor hardware sharpening when I eventually get a new monitor; I wonder which of the two is better, or if they work better in combination. One other thing I've found with sharpness turned up is that distant objects and scenery like buildings or mountains go from being bland and unimportant looking to being quite clearly visible and standing out, overall making games far more immersive.

I assume most people aren't going to bother with this model because they've more money to spend, want larger screens, perhaps desire 144Hz, and because not a whole lot of official reviews have been done on it, so there's not much information other than 1 or 2 reviews done by tech guys, and a few reviews at Amazon and Newegg which all seem to give 4-5 stars from what I remember. I think this model was released late last year or something, so it's still fairly new. And yes, from what I understand AOC seems to be known for budget-oriented equipment, but I consider sharpness something that should come standard regardless of what cable type you choose to use, even if a variety of manufacturers don't see it that way. When everything else about this monitor is so good, it seems silly to leave out sharpness as an option.

To me the only explanation is that it's a way to get people to buy more expensive monitors (assuming AOC's more expensive ones actually would let me use sharpness on digital cables), because if they made the cheap models look comparable to more expensive ones, they'd lose out on potential profits. So they make the cheap model look blurry compared to how you can make more expensive models look, but once someone like me realizes it can look just as good when sharpened properly with the help of something like LumaSharpen, suddenly the illusion of the monitor being limited in a way that prevents having sharpness as an option is shattered. One review I looked at said that through their testing it has close to perfect 6500K color temperature reproduction. Overall, for a TN panel, they seemed very impressed by its color capabilities compared to other models they were looking at.
Part of why it's cheaper, I'm assuming, is that it's a 21.5" monitor, which I really wanted, since I enjoy the dense pixels per inch while I'm sitting so close on my small desk (which eventually I plan to replace with a better desk). Apparently it's the only 21.5" 1080p FreeSync monitor you can buy, IIRC. The only feature it lacks that I'm aware of is the ability to adjust sharpness with digital cables, which from what I understand is unfortunately quite common, depending on which brand you choose to purchase from. It's also got a 1ms GtG response time, but that's common for TN panels these days. I didn't want a low quality IPS for around the same price, which would probably mean putting up with much slower response times, no FreeSync, and probably not even that great color (I got this monitor on sale btw). I certainly would love to find a way to make this monitor produce sharpness, including your suggestion of re-rendering the desktop in OpenGL or Direct3D. It's clearly capable of displaying a sharper image; they just didn't include it as an option for whatever reason unless you're using a VGA cable, in which case you're able to use sharpening.
Posted 8 years, 1 month ago
Just use a font changer and download sharper fonts.
Posted 8 years, 1 month ago
That won't change the whole screen to make photos and videos sharper, unless of course it's already sharpened, which I've not seen any evidence of yet. My text is fine, but as I've seen with monitors that have hardware sharpening for the entire screen, everything looks better, including text (and I'm guessing it would make the text look better even if I installed the sharper fonts like you're suggesting). With all the mods people make, it seems this kind of thing would be easy. I'm sure there are a lot of monitors out there that would benefit from being able to sharpen on the software end, until all manufacturers make sure sharpness is possible with digital cables. To be honest I'm surprised the solution I'm looking for hasn't existed for decades already.
Posted 8 years, 1 month ago
Apparently that's not good enough, Quentin. Sharpness isn't a BenQ thing. My Samsung has it, too. Lots of monitors have it.
- 1ms response time is just a marketing number. A 60Hz monitor can only display 1 frame every 16ms, and almost all monitors have a response time of 5ms or less.
- Same with the 6500K temperature thing. Most monitors made by a reputable company have a "temperature" setting. Simply use it to adjust the whites to be white in accordance with your room lighting.
Note, I'm not bashing your monitor. Just observing. FreeSync/G-Sync is hard to come by, and it'll look great with a DisplayPort cable.
Posted 8 years, 1 month ago
I just don't consider changing the fonts to be a solution; it resolves something that's not an issue, as I already have the text looking as good as it needs to. I'm among those who complained about Chrome recently getting rid of the option to turn off DirectWrite in the flags settings of their browser, and I'm now using the Cent Browser (it's basically Chrome under another name; I was able to use all of the same browser extensions and everything) because it's a Chromium-based browser that has decided to keep that option to turn off DirectWrite. As a result I don't find it to be a headache to read even small amounts of text. I signed the petition asking Chrome developers to reinstate that option, and provided a couple of solutions for those who were suffering with blurry text, which got the highest number of +1s of any comment.

What I meant was that they analyzed the actual color accuracy of various monitors, and this one came very close to 6500K instead of being quite far off the mark like a few of the other monitors they were comparing it to; it was the best of the bunch in that regard and earned high praise from the reviewer for its color reproduction accuracy. From the reading I've done, it's quite common for monitors to not have sharpness as an option, and people seem to excuse it as if it's an intended feature that doesn't need improving, and that in fact raising the sharpness would degrade image quality (which I strongly disagree with). The only brands I've tried in the past are BenQ (which always had sharpness options) and another more off-brand model that lacked sharpness as an option, and as you can probably tell it's generally been years between monitors for me, waiting until they fail to work before being replaced. This monitor has a temperature setting, which I've set to "User" aka custom, and got it looking fantastic shortly after purchase. Also, the reason I said 1ms GtG response time is because I know that it's different from non-GtG response time.

I've never needed a prescription for my eyesight, and as far as I'm aware I've greater than 20/20 vision, so having to put up with blur is a big deal to me, since I feel that if you are allowing your eyes to adjust to seeing a blurry image, the rest of the time when you're not looking at the monitor your eyes have to readjust to the environment. If you're like me and spend many hours in front of the monitor daily, it can have an impact, though I've found a half hour to an hour outside during a decently lit day remedies most of the negative effects that the blurry monitor, or dark conditions due to a lack of proper lighting indoors, might have had. The key point is to make sure I'm looking at the environment at various distances throughout my time outside, to help my eyes adjust and return to their natural state, so that I achieve the clarity I know my eyes are capable of at all distances, either very close or far away. Anyway, while most of the features of the monitor are more than adequate (response time is great, ghosting/blur from motion is great, haven't noticed any input lag, color is great, but no hardware sharpening), it seems that this might be the first monitor I replace before its useful lifespan comes to an end, and I will probably hand it off to someone in need at that point.
I would be really choked if I bought a 144Hz monitor and it didn't have sharpness as an option, and since AOC sells FreeSync monitors for the best price, I wouldn't be surprised if a lot of people are putting up with perpetual blur without even realizing it. Unless, of course, they include sharpness as an option on their 144Hz models.... If not, at least it helps the eye care industry create more business, I guess. :-( I'm glad that if nothing else comes of this, I at least know what the limitations of my options are if nobody comes up with a way to make it work outside of a game, and like I've said, I'm glad to at least have sharpness as an option for most of the games I play. If someone did find a way to make sharpness work at all times on monitors that don't normally support the feature, I'm sure they could make a decent amount of money selling it to those such as myself who don't want to spend $100-200+ on a new monitor and are satisfied with what they have other than the lack of sharpness options. They could even sell it to a company that would distribute it for them if they can't be bothered to do anything beyond the programming themselves. Anyway, I realize I probably type up more than most people are interested in reading, so I guess I'll stop here, unless something else comes up. Thanks for trying. :-)
Posted 8 years, 1 month ago
Why not try Firefox Nightly? I know. I'm saying it's not as big of a deal as you make it out to be. The only monitors I've seen with color accuracy issues were 15 years old. I upgraded from a cheap 21.5" to a decent 27" and it made a night/day difference, even though they were both 1080p60 and whatnot. I've set up 2 new monitors in the last couple years, and they both had a sharpness option. Granted, they were both Samsung. Gray-to-gray is slower than black-to-white, if I'm not mistaken, so you actually have like 3-ish ms. Again: if it can change from white to black before it displays the next frame (generally 16ms), any other difference is pointless. Moot. You can have less than 20/20 and still see a monitor perfectly due to it being less than a meter away. If you actually have 20/15 or something, join the Air Force. If you really play a lot of games, I would agree, due to the uncommon 75Hz FreeSync. ...I doubt they are? If the blur is as bad as you make it sound, you probably should've actually gone to a store to look at monitors first-hand. I'm sure the guys there'd even let you faff around with the settings a bit. (Obviously you don't buy at the store if you want to save some money.) Check the monitor's sharpness here: http://www.lagom.nl/lcd-test/sharpness.php
Posted 8 years, 1 month ago
I've used that lagom.nl/lcd-test/ page to help configure certain settings of my monitors for the last 10 or so years. I checked it over a month ago and it seemed my sharpness is probably considered fine (at least on the desktop). I have to squint my eyes to the point where I'm barely able to see through the slit in my eyes to have my monitor pass the test for both sharpness and gamma, but I can still make out some of the detail of the boxes with the thickest vertical lines (unless I sit more than a meter away), which if anything would probably mean that my sharpness is considered TOO high. I consider being able to see a monitor perfectly the point at which you can still distinguish the individual pixels, then sitting back just enough to where you almost can't see them, or just beyond the point of being able to see them, if the intent is seeing all of the possible detail.

Part of why I wanted this smaller screen was that back when I had a 23-24" monitor I struggled to see the whole screen at once, having to turn my head very slightly to see the corners sitting at this distance. I was also able to easily distinguish pixels at my normal sitting distance on that larger screen, which didn't appeal to my eyes. The other reason I wanted a smaller screen was for the higher pixel density. Unfortunately it seems the general trend is to produce larger sizes for a given resolution, unless that's just for 1080p FreeSync monitors atm. I'm guessing most people prefer larger monitors, but they also likely have a desk that lets them sit further back.

I tried Nightly and still have it installed. I made sure to try to enable the new multi-process feature or whatever it was a month or two ago (had to change a number of settings under the hood to get the browser to say the multi-process support was enabled, as it was limited to invited beta testers or something along those lines IIRC), and it didn't seem to perform very well when watching Twitch streams at Source quality, even when I forced them to run in HTML5 using a certain URL to enable the beta function. Works very smooth in Chrome though. Cent Browser is just as good as Chrome from the month or so I've spent with it, and I'm not worried about RAM since I've 16GB. I can leave a bunch of stuff open and play Planetside 2 if I wanted, which is a RAM hog, but I've never gotten much above 10-12GB with everything left open. I did notice that Firefox seemed to have slightly better looking video rendering than Chrome when I compared screenshots, but the performance difference last time I tried it out is enough in Chrome's favor to stick with them. The only reason I tried out Nightly was because I'd not resolved the DirectWrite issue yet.

I feel like this monitor has the capability to look perfect for my needs (especially at such a low price); it's just that without a way to up the sharpness, a lot of details even for a game that's not in motion are lost behind a wall of blurriness, and only resolved with LumaSharpen. I do get the feeling games are more blurry than the desktop though (the desktop may even be fine, but I'm not sure), and as I've mentioned before I don't really have a way to compare the clarity of the desktop to games, but when I look at screenshots the clarity seems to be there (perhaps diminished to some extent that's hard to detect), so maybe there's no need to fix the sharpening of things other than games?
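(Side note: the "under the hood" Nightly settings I mentioned were most likely these about:config prefs; I'm going from memory, so the exact names and set may be off:

    browser.tabs.remote.autostart = true     (enables multi-process / e10s)
    browser.tabs.remote.force-enable = true  (overrides the compatibility check)
    extensions.e10sBlocksEnabling = false    (stops add-ons from blocking e10s)

After flipping those and restarting, about:support should report multi-process windows as enabled.)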
A game running in windowed mode without LumaSharpen will look blurry, but turned on it will look sharp, so I'm not certain what conclusion I should draw from that. I do recall the desktop looking significantly sharper on my BenQ, but maybe games are just a lot worse because of how they're displayed. I'm not sure if AOC monitors are actually at any local stores for me to try out; I didn't check. I was in too much of a rush to replace my recently deceased monitor to wait around for long at the time, unfortunately. Speaking of my old monitor... I'm pretty sure all it needs is a few capacitors changed out, based on what I saw when I opened it up; it may still have a lot of life in it if I can get it running again, but no FreeSync ofc. As I recall it would cost me about $20 with free shipping to get everything I need to fix it, the most expensive thing being the tool I need to discharge the capacitors safely, $10 Canadian or so for that. While I appreciate the intent many individual men and women go into the military with, I personally am against joining the military due to the corruption I've seen from the outside at the highest and lowest levels, and everything between, not only within the military but within the government that largely decides military policy & actions (as well as the CIA, NSA and other organizations that have a bad history of corruption and that often assist the military in various forms). At most (if it existed in a form I deemed reasonable) I might join a proper militia where I'd have the freedom to choose my battles (or refuse to participate without retribution by officials) based on having what I consider sufficient information, probably only fighting in defense in the case of an invasion, rather than engaging in undeclared preemptive war as has been common for quite some time. No idea if I'd make the cut even if I tried, but as a kid special forces and ranger positions did kind of appeal for a while, only because of the high level of skill involved.
Posted 8 years, 1 month ago
With the lagom test, everything should turn into a gray blob if you stand at the back of the room and look at the monitor normally. If it does, that means the monitor's fine and you're just somehow used to having the sharpness turned up to 11. I use peripheral vision. The hell is wrong with your Nightly? As of right now, I have multi-process enabled and can open 100 unique tabs at once without breaking a sweat. Even under that load, it doesn't use more than 3-5 gigs of RAM. I've never had to modify anything 'under the hood'. To compare, can't you just take a screenshot of each and then flip through them in an image viewer? That was a joke. Calm down, Edward Snowden.
Posted 8 years, 1 month ago
Yeah, with this smaller monitor I can use peripheral vision at this range, but not with the larger one; I had to turn my head a bit for that one. This room isn't terribly large, but I can still distinguish some details of the image at that website from over 10 feet away.

From my testing, Twitch.tv on Source mode didn't run smoothly compared to a Chrome-based browser. As far as I know, not long ago the multi-process option was only enabled for a few beta testers as it was a new feature, unless that was Firefox itself; I may be forgetting. If it was on by default in Nightly, then I didn't get enough of a performance increase to justify staying with it. As far as RAM goes, I was just pointing out that when I leave a bunch of programs open, including a bunch of tabs in my browser (I believe Chrome is known to be a RAM hog if you leave a lot of tabs open, but I've not had any issues due to having 16GB; I've never gone much above 12GB even when, like you said, I opened up 100 active tabs, though I didn't have much else open at the time of testing that out as I recall), as well as running Planetside 2, which I believe can easily get up to around 6GB of RAM usage alone (+5GB of RAM for other things I've left open and the OS) when there's a huge number of players in a small location.

I'm still running an i5-750 OC'd to 3.7GHz with an R9 270x, which I hope to upgrade to an RX 480 so I can actually make use of the FreeSync feature and get slightly more than twice the GPU performance. Perhaps my CPU is too old to run Source mode at Twitch.tv without some slowdown in Nightly (but it's fine in Chrome)? As I mentioned, I have to enable the beta HTML5 mode at Twitch.tv to get it to run really smoothly even on Chrome; Nightly was just really slow by comparison from what I recall, introducing stuttering into the video. I've had to replace the mobo once already (due to a damaged component on the top-right corner of the board), but the CPU seems to be going strong as long as I overclock it for certain games. :) According to some things I've seen, my CPU is basically perfect once overclocked if you want to not bottleneck the RX 480 too much (it does run better with a more powerful CPU though); it's still apparently a worthwhile upgrade for me, and is probably about as powerful a GPU as you'd ever want to try matching with an i5-750, as long as you OC the i5.

To enable HTML5 at Twitch.tv you take this link and enter the name of the stream where it says STREAMNAME: http://player.twitch.tv/?channel=STREAMNAME&html5&quality=chunked I'd recommend trying it on one of the higher quality streams that push the computer harder; there are a lot of streams that don't take much even at Source mode. I guess I could try running a game in borderless fullscreen and compare it to screenshots taken in normal fullscreen to see if there's any difference. There's no way to compare the general desktop's appearance though, as far as I'm aware, only games in windowed mode. Edward Snowden? You caught me! :O
Posted 8 years, 1 month ago
My 27-inch sits 2 feet away from my face and it's the perfect size for fast games like Battlefield, at least in my opinion. Yours may be either too sharp or not sharp enough depending on what it looks like; you'll have to check that yourself. Multiprocessing has been in Firefox Nightly for ages, and has become really stable over the last year. It's now on by default in Nightly as far as I'm aware. Beats the tits off of Chrome. We're talking about the same thing, right?: https://nightly.mozilla.org/ That's pretty old. I've used a class computer that had an i7 860 and it was unfathomably slow, even compared to my i5 2400. It even crashed GIMP a few times (though that may have been the RAM; I'm not sure). I've watched an HD stream or two on YouTube before. No lag, unless the connection's having issues. I've watched hundreds of (not live) HTML5 videos across various sites. No recurring issues. I don't watch any Twitch streamers. Too boring. Out of my 35 YouTube subscriptions, there's only a single gaming group, and they do board games, skits, everything. What exactly are you trying to compare? You're making less sense the more you talk.
Posted 8 years, 1 month ago
Less sense about sharpness in particular? Like I've mentioned, LumaSharpen seems to apply in a window, and I seem to be able to take screenshots of games in fullscreen, and the results seem similar on the desktop when looking at them in a photo viewer, but I'm not sure if the image I'm seeing in the photo viewer is exactly the same sharpness as what I see in fullscreen. Borderless fullscreen would probably look the same as the photo viewer, assuming there is any added blurriness from not running in regular fullscreen. I'm about two feet away from my screen and I find it almost a little too close at times if I want to be able to see almost the whole screen at once without having to glance to a far corner, but the desk isn't long enough to sit back further while using the mouse & keyboard.

Yes, I'm using the same browser as at that link; after posting my last comment I tried a Twitch.tv stream again to compare and noticed the stutter. I believe YouTube videos have always been easy to run on either Chrome or Firefox; it seems to be a better optimized player there. I will certainly agree that it can be hard to find someone I want to watch at Twitch.tv a lot of the time, but sometimes with certain streamers there's good stuff, typically not the most popular ones but a few that have been around for years and have had some success because of being an entertaining and interesting person. There are a few relatively unknown gems lurking around at that site, some who don't stream often, but it can be hard to find some of them. Then there are a few pretty popular guys that are just that damn good, and have made a decent living at what they're doing, but haven't had absurd levels of success financially. Admittedly I watch a lot less than I used to, but will sometimes just listen to certain interesting folks like a podcast while doing other things. I also spend a lot of time selectively watching YouTube videos and such, a lot more time than on Twitch, but I enjoy the variety. Been watching YouTube since 2005/2006 I think. I've been subscribed and unsubbed to hundreds of channels at this point as my interests have evolved.

Stock clocks are definitely noticeably slower on this CPU. I don't know what your class computer's specs were, but it probably had onboard graphics, a slow hard drive and not a lot of RAM, on top of not being overclocked and using the stock cooler. Could the hard drive have been full and possibly heavily fragmented? I know that at my old high school, with even older technology, all computers shared the same hard drive (something like that anyway) on a network, so it would run decently fast when not a lot of computers were being used, but INCREDIBLY slow when a lot of computers were busy. I mean, I guess it could have just been the internet slowing down when a lot of people were using it, in many cases where I noticed the computer being slow, but I seem to recall the school computers struggling to open and close programs quickly, and things like editing programs taking a very long time just to open. I've made a lot of small tweaks to help improve performance over time on my computer. I wouldn't be surprised if even a much more powerful CPU than mine would perform somewhat poorly without such tweaks, although it can probably brute force its way past some limitations just based on running at a higher clock speed, and perhaps because of a more efficient architecture.
During the overclock of my CPU there were optional settings that, when changed, apparently lift some of the default limitations placed on the computer's performance for the sake of lowering power consumption, in most cases. There was also an option that regulated the voltage in a certain way, I think, that when disabled allowed your computer a bit more freedom, which might reduce any spikes that cause your CPU to not operate properly for a split second; that was specifically important if you wanted optimal performance when overclocking. Something along those lines anyway; it's been a few years since I was reading about BIOS settings and their effects.
Posted 8 years, 1 month ago
If you take an in-game screenshot, it'll look exactly the same as real-time in-game unless you have bad capture settings. I went onto Twitch and played a few different active streams on "Source" quality with the HTML5 setting enabled. Some even looked better than YouTube 1080p. No lag.

I ran GIMP off of a 32-gig USB drive, which is plenty fast for what I need. The PC had a Radeon HD 7570 GPU (I think), which isn't good, but still somewhat overkill considering none of the programs it ran could use GPU acceleration. RAM was probably around 4 gigs on a 64-bit OS, which is okay-ish since the PCs were pretty barebones in terms of software. Therefore, I would wager that the bottleneck causing GIMP to run multiple times slower than on my home PC was the old processor. Hertz can only really be used to compare 2 processors of the same family and generation. That's why an old PC my father had with a Pentium 4 @ 3.1 GHz was knocked out of the park, over the city, through the woods, and into the next country by my Sandy Bridge @ 3.1 GHz. You've got to consider architecture, IPC, etc.

I've found that 99% of those 'tweaks' are placebo, and the other 1% are extremely situational. Besides overclocking, there's no magic method to boost your CPU/GPU's performance. No modern OS 'restricts' your CPU in any way. One program can max out core 1 and the next can max out core 2, yet everything's still good because you probably still have 2 left. And for multi-process programs like 7z that can max out all cores at once, the only way to get more performance is to literally terminate active processes (or set the hungry application to a higher priority, in which case the OS will suspend others), but that has almost no effect if the rest of your system is idle. Even with my browser open on this page, music playing, and Afterburner logging my hardware statistics, I'm using <1% collectively across all 4 of my cores. No registry edits, Windows settings, etc. will help a single bit. They can only give the appearance of increased speed (like how I disable animations on my phone and PC). The only thing that might help is a tweak for a specific application, in which case it's the software developers' fault for being lame.
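The back-of-the-envelope math behind that comparison (the IPC numbers here are made up purely for illustration):

    performance ≈ IPC × clock speed
    old CPU:       ~1 instruction/cycle × 3.1 GHz ≈ 3.1 (relative units)
    Sandy Bridge:  ~3 instructions/cycle × 3.1 GHz ≈ 9.3 (relative units)

Same clock, roughly triple the throughput, which is why GHz alone tells you almost nothing across generations.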
Posted 8 years, 1 month ago
What I'm saying is that if the desktop itself is blurry because it's not having LumaSharpen applied to it, and I'm using an image viewer that is working on the desktop and does not have LumaSharpen applied to it, then that screenshot will have the blur of the desktop applied to it. Therefore the image, while it does look almost the same as the game in fullscreen, may in fact be slightly out of focus. I'd have to try comparing with a program that I'm able to switch into fullscreen and back out of almost instantly, to compare it with the image of the screenshot also set to fullscreen, by pressing alt+tab. If there is a noticeable blur applied to the screenshot due to it displaying on the desktop, then I guess I'd know whether the desktop needs to be sharpened. That's of course unless, for whatever reason, the image viewer applies sharpening, making the image appear the same as the game did in fullscreen, while the rest of the desktop outside of the image viewer does not have that sharpening. It really depends on how the image viewer program actually functions, and I don't have the answer to that.

Yeah, you have to have both browsers to compare, but for me Chrome-based browsers definitely don't have the stutter present in Nightly/Firefox. It can be kind of subtle, but it's definitely there, and it becomes far more obvious to me when I'm not using the HTML5 player. Overall, Chrome has been known to perform better with video processing for many years, from what I understand; it's why many of those similarly complaining about Chrome's removal of the option to disable DirectWrite followed up with comments of reluctance about switching back over to Firefox, which used to be the best browser for performance, but that ended years ago. I hoped the multi-process support being introduced in Firefox browsers would help, but it doesn't seem like it was enough in my computer's case. All I know is a lot of people, including streamers, seemed to be very happy that they were adding HTML5 support, because it was so much less of a strain on hardware than the Flash player. When I compared Firefox vs Chrome there's a notable difference, though it may not be that noticeable depending on the type of action going on and what game is being played. Some streams can go from almost choppy for split seconds at times to constantly smooth when I switch browsers. In general, whereas Firefox/Nightly felt like it wasn't totally smooth, Chrome does feel that way.

I think I heard recently that the 6th generation i5 (or maybe it was i7) CPUs were something like 50% faster compared to my 1st generation i5 CPU, Hz for Hz. So a 1GHz CPU today would be equivalent to a 1.5GHz CPU back then, something along those lines. To be honest I don't have a new CPU to compare with, so I've no idea what the difference would be. The most I've been able to gather is that on new CPUs, if you were already getting adequate FPS at the high end, the more notable improvement might be that your minimum FPS would be considerably higher, and that you are less likely to get a sudden drop into the single digits at resolutions greater than 1080p. And yeah, I know a lot of the tips out there on the various guides, or individual suggestions I've checked out in the last decade, probably don't do much if anything, but I wasn't referring to those generally lame and ineffective suggestions that are the first go-to ideas for many people.
I don't keep a long list of everything I've done to this computer; it's been more of a natural development of discovering and rediscovering what I consider valid and useful tweaks, but my experience is that it's helped considerably, even if others might claim placebo. I've seen plenty of cases where the "experts" and the majority of people holding an opinion have been wrong on all sides of a variety of issues during my life, looking at both modern and historical instances, and I myself have been wrong many times and done what I can to correct myself and improve upon my execution for the next time, when I've been aware enough to catch myself in the act. As far as running my computer at basically idle like you described, yeah, it's no problem for this CPU either. According to the people helping me on my long topic over at Tom's Hardware, my CPU should do fine as long as it's overclocked if I choose to match it with the RX 480 (and this has been confirmed by a video showing benchmark comparisons on the same CPU), and my HX650 PSU is more than able to cover the power demands required to run it at full throttle. Until I have the money for a new CPU + motherboard and possibly a new set of RAM, and whatever else might be needed, I'll be sticking with this one. I was hoping this monitor would be exactly what I was looking for, and it almost is; LumaSharpen definitely helps things. I'd have been perfectly happy for at least a few more years without upgrading anything besides maybe the graphics card, but I'm not sure now when I might change the monitor out.
Posted 8 years, 1 month ago
Like 2 or 3 years ago I switched over to Nightly precisely because Chrome was slow. I had a set of like 150 tabs I'd regularly open simultaneously, and Chrome would damn near crash whereas Nightly loaded them all in seconds. You sound the opposite of confident. Depends on what you're doing. I had similar tweaks applied from 3 years of use and later formatted my hard drive to find the performance was literally the same. It very well might do, as long as the game in question doesn't choke out a single core when the RX 480 runs ahead.
Posted 8 years, 1 month ago
"You sound the opposite of confident." Okay? Please give some context. I don't really care if you think I sound confident, does this matter at all? A person being honest enough, and showing enough confidence in their belief in moral integrity to be able to admit that they've been wrong many times, and attempt to correct that behavior when they become aware of it, is far more praiseworthy than someone with confidence proclaiming that false information is the only truth. I'm willing to question my own feelings and conclusions, I think a person of wisdom is always willing to. Well from my experience Nightly is not very good. I have switched browsers from Firefox to Chrome a few times over the years due to changes that were made, atm Nightly sucks for me, and seems not much better than Firefox in performance. I've no idea what some of your answers (both now and before now) are even in reference to. Probably different tweaks, or your CPU is powerful enough to not make a difference because you already had all the performance you needed. Chances are though the subtle improvements in overall stability I've noticed would help you just as much as they help me. Sorry if I don't have your faith in the idea that everything is perfect right out of the box. I've seen many times how accounts of things that happened 1 minute ago from family members were completely wrong, hard to tell if they're lying or just have a bad memory sometimes. If my faith in your anecdotal account is lacking, please have some forgiveness, we all have our own opinions and reasons for not accepting what others might have us believe. Good, and yes from what actual testing I've seen the i5-750 is quite capable still with the RX 480 (and they didn't even bother to OC the CPU, which I'm still hoping they'll try so I can see the fps comparisons), backed by the input of a number of members on a tech support forum who agreed that the CPU still has a lot of life in it because ever since the i5/i7 lines started, CPUs haven't really made the same leaps they used to back in the Pentium 4 days and similar, only slowly gaining in compute power, but more significantly advancing towards power savings. Some believe they're intentionally milking CPU sales by only incrementally improving the GHz slowly, and not taking huge leaps in the technology, instead trying to give us the impression that we might be somewhere close to the limit of what CPUs are capable of, and will ever be. Meanwhile we got technologies like Vulkan coming out that threaten to make it so that old CPUs like mine will be viable for many years to come. It's a good feeling.
Posted 8 years, 1 month ago
Also, FYI, I've been updating a lot of my comments over time, and you may have in many cases started reading before I posted the updated versions; thought I should mention it for your convenience. This forum does not indicate when a post was last edited, so I've no way to know what you might have missed.
Posted 8 years, 1 month ago
Also, one thing I just checked to confirm: when you first start up Nightly it only loads the tabs that you actively bring to the forefront by clicking on them, and Chrome is just as capable of running with many tabs open as far as I'm aware. The Cent Browser I'm using has the exact same behavior of waiting to load tabs when you open the browser after having closed it; they don't actually start displaying the page until you bring the tab into focus by clicking on it. At most it's sent to RAM or the pagefile or something before that. Also, what I find is that in the Cent Browser, tabs in the background that haven't been used for a while seem to sometimes get swapped out (probably more frequently when the computer is being stressed to some degree), in that they load again for less than a second when I go back to a tab I haven't viewed for a while. Cent Browser differs in some ways from Chrome in that they seem to attempt what they consider more elegant and useful ways of doing things, probably adopting certain features like what you are pointing to, with the tabs not eating up all the resources by not keeping them all in focus at the same time. Chrome has probably also fixed this issue since you last used it, but I'm not sure. It may be that in this case my computer performs better with Chrome than your computer does, for whatever anomalous reason. Just made my last edit to the last 3 posts; any further updates will be in a new comment.
Posted 8 years, 1 month ago
"I think I heard recently that the 6th generation i5 or maybe it was i7 CPUs were something like 50% faster compared to my 1st generation i5 CPU Hz for Hz. So a 1GHz CPU today would be equivalent to a 1.5GHz CPU back then, something along those lines." Makes it sound like you have no idea what you're talking about, and thus have no confidence in what you're saying. ---- The Intel Core series only really started hammering the 'power savings' since the the last generation. The reason being the manufacturing process limitations. They have to make the manufacturing process smaller in order to fit more transistors on the board (improving computing power). The reason Moore's law is ending is because the transistors are getting too small. A transistor on a CPU works by sending an signal across a field that has a semiconductor in the center. Whether or not the semiconductor allows the signal through gives you an "on" or "off" on the other side, which is the 1's and 0's that are then put through various logic gates to perform calculations. The problem with making the manufacturing process smaller is that the semiconductor block becomes so small that the electrical signal can pretty much phase through it by means of quantum mechanics. Once that limit's hit, you can't put more transistors on the same board. -About the new API thing... DirectX 12 requires a 4th gen (Haswell) Intel Core processor or newer (I'm not sure about vulkan, but lots] more things used DirectX over OpenGL in the past...). - How I configured Nightly: First load is a blank page, all tabs are loaded upon creation whether or not you're viewing them. --
Posted 8 years, 1 month ago
They plan to release Vulkan with support for a lot of old hardware from what I've heard, and many people hope it overtakes the market in place of DX12, because they don't want to be forced to use Windows 10 and encourage Microsoft in a lot of the horrible decisions they've been making for many years. Who the hell wants a continuation of a Microsoft monopoly on the mainly used API? As far as I'm aware Vulkan just requires a supported GPU, but I could be wrong; feel free to check.

Yes, the tabs appear to load, in that the loading bar goes away, but when you actually click them they properly load. Goes for both browsers. It may not be so obvious if you're using an SSD; I'm still using an HDD, which I'm perfectly satisfied with, as it only really affects load times and games with streaming textures. Not much else that I use is affected, that I'm aware of, to any degree that I care about at least.

Yes, I'm aware of the mainstream explanation for why they're milking us with inferior iterations of what they could be creating for each generation of new hardware; this extends to most technologies. Going by memory of something I read in passing, it seemed like someone who knew what they were talking about added up the performance gains for each generation, and that was around the gain per Hz that I recall them indicating when comparing 1st generation i5s to 6th generation i5s or i7s. I don't like to speak in absolutes if I'm not willing to look a small point up to confirm it, because that would be wrong. Your interpretation seems to be the only issue here, a skewed view you're using to try to discredit someone who never claimed anything on that particular point, just wanted to share something they'd gleaned from somewhere a while ago. If anything, by showing that I wasn't 100% sure of what I was saying, it makes your job easier, because you don't have to go through the extra work of coming up with the actual answer to refute anything I might claim that turned out to be wrong; or at least, if you do decide to attempt to correct me, you can do so with the confidence that I'm willing to give what you say a fair chance, rather than doubting you to the extent that I refuse to consider what you might say, if you've given me reason not to doubt you. Twisting it into a way of gauging someone's character seems childish and arrogant. To be honest this conversation isn't going anywhere, and it seems all you want to do is satisfy your ego by trying to make false claims. Sorry, you can go satiate yourself elsewhere. Again, thanks for trying to help before; unfortunate that there's no proper solution.
Posted 8 years, 1 month ago
I'm not too sure what monitor you have? My monitor has sharpen, though I have it set real low, because sharpening is not the greatest thing in the world and in many ways can make things look worse, like adding more jaggies and halo effects that make gaming and video look horrible!
Posted 8 years, 1 month ago
I agree you shouldn't raise the sharpness too much if your monitor is capable of going to extremes. My last BenQ monitor seemed best at either the highest or second-highest setting when I experimented; I don't think it produced halo effects most of the time, even at the highest setting. I personally loved the added clarity of it. I believe that BenQ had options from 1-5 for sharpness, but I've heard of other models from BenQ that range from 1-10 or something. It was a 2008 model BenQ though. My current monitor is the AOC G2260VWQ6. When I turn on LumaSharpen I get a very similar effect to what I remember the BenQ's sharpening having. Details in the textures that are comparatively faint become clearly visible, and objects don't look completely flat and blended together but instead almost appear 3D; it's great. It makes distinguishing things from one another so much easier.
Posted 8 years, 1 month ago
I'm using an HDD that's worn down to hell. Still loads them all fine. Grand Theft Auto V's loading time is like 5+ minutes just to spawn in my room. Well, obviously you seem to know everything better than the engineers at Intel, so why don't you help them with their homework? Heck, they could give you a simple contract: $5,000 USD for every problem you solve, but you lose $50,000 USD for every problem you get wrong. Come back in a year when you're Bill Gates.
Posted 8 years, 1 month ago
Did I claim the engineers are the decision makers? They know what they're taught and paid to know. If I ever bother to finally play GTA5, I guess I can let you know how long mine takes to load up. Most games don't take long. I suppose the main point we need to settle is whose computer better runs the two browsers, but based on our conflicting reports there's not likely to be a conclusion we both agree on.
Posted 8 years, 1 month ago
It's a mysteryyyyyyy.
Posted 8 years, 1 month ago
Yuhhuh. Miss treeeeeeeeeeeeeeee....
Posted 8 years, 1 month ago
I'm sorry, I guess I'm just not very punny.
Posted 8 years, 1 month ago
;)
Posted 8 years, 1 month ago
So everyone's just going to ignore the fact that Sann-san has an extra half body?
Posted 8 years, 1 month ago
I was going to fail to mention it, yes.
Posted 8 years, 1 month ago
Update: Vulkan doesn't even have drivers for the Fermi GPUs, so I doubt it'll work on anything older than Haswell. Although it "supposedly" is intended to work with anything that can run OpenGL 4 or later, so maybe more support will come in the future.
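If you want to check a given machine, you can just ask the Vulkan loader directly. A minimal sketch (standard Vulkan API calls, nothing vendor-specific):

    // Minimal Vulkan capability check: if instance creation succeeds and at
    // least one physical device enumerates, the installed driver exposes Vulkan.
    #include <vulkan/vulkan.h>
    #include <cstdio>

    int main() {
        VkApplicationInfo app = {};
        app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
        app.apiVersion = VK_API_VERSION_1_0;

        VkInstanceCreateInfo info = {};
        info.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
        info.pApplicationInfo = &app;

        VkInstance instance;
        if (vkCreateInstance(&info, NULL, &instance) != VK_SUCCESS) {
            printf("No working Vulkan driver found.\n");
            return 1;
        }
        uint32_t count = 0;
        vkEnumeratePhysicalDevices(instance, &count, NULL);
        printf("Vulkan-capable GPUs: %u\n", count);
        vkDestroyInstance(instance, NULL);
        return 0;
    }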
Posted 8 years, 1 month ago
This guy seems to be running both DX12 & Vulkan on an i5-750; also, he didn't OC (overclock) the CPU, just compared the base clocks of the CPU with two different GPUs: https://www.youtube.com/watch?v=pjZ3MmxP0Fg I think CPU DX12 support only matters if you don't have a dedicated GPU. "Here's a list of all the graphics cards and INTEGRATED graphics processors that support DirectX 12" Link: http://www.pcworld.com/article/2954260/windows/these-graphics-cards-and-processors-support-windows-10s-directx-12-graphics-tech.html So if you don't have a dedicated GPU with DX12 support, you can use integrated graphics that supports DX12.
Posted 8 years, 1 month ago
I ran Battlefield 1 and didn't have a "DX12" video option...
Posted 8 years, 1 month ago
Did you look in the advanced menu? Check this video, 6 seconds in: https://www.youtube.com/watch?v=Tw-WKlHNZwI He's running the same CPU as me. The only way I'll be running DX12 is if I get a separate HDD to boot Windows 10 on; I refused to give up the license for my copy of Windows 7 for it (even if it was "free"). At some point, if I really want W10, I'll pay for it, but I'll probably only use it to run specific games and head back to my Windows 7 installation for general use. Valve and others seem to be pushing for Vulkan adoption across the board, since it will have OS support on a much larger scale. I personally hope they succeed.
Posted 8 years, 1 month ago
I remember, before I bought my R9 270x, them advertising it as supporting Mantle/Vulkan. Not sure about DX12 support, but I plan to get an RX 480 at some point anyway, and chances are that, like with many games I've not played until years after release (if ever), I won't care if I can't play DX12-only games like Quantum Break. If I do find myself wanting to see the game's story, I'll probably just find the best playthrough I can on YouTube and watch that like a movie, like I've done with console-exclusive games in the past such as Metal Gear Solid 4.
Posted 8 years, 1 month ago
Yeah. I had to mess with literally every setting to get it playable. People in the chat were complaining about DX12 bugs, so I went into my video settings to double-check I didn't miss the option. Obviously it'd be nice if everything was all great and universal, but that's not the world we live in. You'll never get games that use Windows-exclusive features on Linux, you'll never be able to play PS3 exclusives like Little Big Planet on PC, etc. etc. I know there are things like Wine and emulators, but they aren't nearly as functional. It's pretty bloody hard to emulate anything newer than a GameCube/Wii game at a decent speed; Wine's imperfect and also suffers from a moderate performance loss, etc.
Posted 8 years, 1 month ago
I just finished posting the final edit for my last post. Vulkan promises to support both Linux and Windows 7/8, I believe, while having performance comparable to DX12.
Posted 8 years, 1 month ago
And yes, I'm quite aware of the limitations of emulators. It's why I use bsnes/higan (it changed names when the author branched out to support consoles other than just the SNES, though I'm not sure how good the emulation for other consoles is) for the SNES, because it has the most accurate emulation of any emulator for that system. It has 3 variations of the emulator you can run: Accuracy, Balanced and Performance. Performance compromises sound and graphics accuracy; Balanced has the same sound quality, from what I remember reading, but compromises on some of the visual accuracy; and Accuracy is supposedly as close to perfectly emulating the actual hardware chips as you can currently get, but comes at a high performance cost even for today's high end hardware (you'd probably still notice slowdown at times with the Accuracy .exe on new high end CPUs & GPUs, depending on the game). From what I recall, the Balanced .exe is far more accurate than any other emulator that has been released. The goal of this emulator was to create an accurate reproduction of the original hardware of the SNES, emulated on a PC; other emulators do not even attempt to do that, at least for the SNES.
Posted 8 years, 1 month ago
Just 'cause it's fast doesn't mean companies will use it. Most of them use DirectX 9 or 10/11, with what I assume to be HLSL, for every one of their PC games. If, say, Bethesda wanted to switch over to Vulkan, they'd have to retrain their current employees or hire new ones who know how to work with Vulkan and whatever shading language it uses. That'd be much harder than just learning the new hooks 'n whatnot in a new DirectX version. And companies being corporate, they won't plan far enough ahead to warrant switching to Vulkan in the hope of selling better due to its larger addressable market (Windows and Linux).
Posted 8 years, 1 month ago
Yeah, but if you do a bit of searching, it seems a lot of people are eager for Vulkan and reluctant to accept DX12 because of its limited OS support. Vulkan has at least one industry giant behind it (probably more, but I don't feel like doing further searching atm), and by all rights it should be what's adopted. Microsoft can get bent.
Posted 8 years, 1 month ago
That SNES thing doesn't sound right. I can run Twilight Princess, a large open-world GameCube game, with perfect accuracy and some visual enhancements like per-pixel lighting, MSAA, AF, etc. at a decent res with no issues, on a system that has to run the Battlefield 1 beta at 1/4 res scale and minimum settings to get 70 fps. Again, obviously people are going to be eager for it, because Windows 10 costs $120 and isn't well-received for some reason. But as with DLC (not like Skyrim's Dragonborn DLC, but the overpriced 'you must have this' kind), microtransactions, etc., the developers will likely turn a blind eye for whatever bollocks reason.
Posted 8 years, 1 month ago
I guess all I can say is I hope Vulkan becomes what everyone adopts in the coming years (or that it at least takes a large market share); we shall see what actually happens.

The goal of that SNES emulator is accuracy (being exactly like the original SNES hardware, down to the individual chips and how they operated), not speed. No other SNES emulator has attempted to emulate the actual hardware like that (nor has an emulator for any other console, from what I remember reading; they take a lot of shortcuts for the sake of performance, often at the cost of accuracy, instead). On all other emulators there are times when certain games' sounds are off by a large margin, and I know this because I grew up playing those games on the original SNES hardware. Even SNES9x, which many claim has the most accurate sound and is probably the most (or 2nd most) widely adopted emulator for that reason, fails at accuracy when certain kinds of sounds play (though compared to the horrendous sound of ZSNES, SNES9x is a good leap forward). Particularly, screeching sounds in games like Chrono Trigger are nowhere near what they originally sound like; those kinds of sounds seem to be the hardest to emulate accurately for some reason. The sequence that stood out the most, I believe, is when you head to Lavos in the Epoch: compare that on ZSNES to a more sound-accurate emulator and it sounds horrible, bad enough to ruin the scene.

I tried the newest version of whatever the best PS2 emulator is called a year or two ago, and noticed many inaccuracies compared to when I owned and played the original PS2 hardware. Chances are the same is true for the GameCube emulator, but I never played on the original hardware, so it's impossible for me to know for sure. Generally it doesn't seem terribly hard for emulators to get the visuals looking good enough, but the sound emulation often seems to be a struggle: it will produce sound that may seem good enough, but compare it to the original and there will likely be instances of glaring inaccuracy.

I just have this tendency to want to play an emulated game as close to the original experience as possible, especially when it comes to the sound. I might choose to turn on anti-aliasing and such, which the original game may not have supported, but I like to ensure beforehand that I can get the game looking the same as it did on the original hardware (including the original internal resolution and all of its pixelated glory) before making adjustments that improve/change the visuals.
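One concrete place where sound emulation can go subtly wrong, as a sketch (my own illustration, not code from any of these emulators): the SNES DSP outputs at a fixed 32000 Hz, which has to be resampled to the host's rate. Naive linear interpolation like this tends to sound fine on most material but smears and aliases exactly the kind of harsh, screechy sounds described above, which is why accuracy-focused emulators use band-limited (e.g. sinc-based) resamplers instead.

#include <cstdio>
#include <vector>

// Resample from src_rate to dst_rate by linearly interpolating between
// neighboring input samples. Cheap, but it attenuates and aliases highs.
std::vector<float> resample_linear(const std::vector<float>& in,
                                   double src_rate, double dst_rate) {
    std::vector<float> out;
    const double step = src_rate / dst_rate;  // input samples per output sample
    for (double pos = 0.0; pos + 1.0 < in.size(); pos += step) {
        const size_t i = static_cast<size_t>(pos);
        const float frac = static_cast<float>(pos - i);
        out.push_back(in[i] * (1.0f - frac) + in[i + 1] * frac);
    }
    return out;
}

int main() {
    std::vector<float> dsp_out(32000, 0.0f);  // one second of 32 kHz DSP output
    auto host = resample_linear(dsp_out, 32000.0, 48000.0);
    std::printf("%zu DSP samples -> %zu host samples\n",
                dsp_out.size(), host.size());
    return 0;
}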
Posted 8 years, 1 month ago
Oh, I was looking through some reddit posts talking about Vulkan & DX12 and noticed them mention Bethesda, who apparently published DOOM. DOOM has Vulkan support. So apparently the employees at id Software (the developer, with Bethesda Softworks as the publisher) have already been trained to program for Vulkan, probably because they have prior experience with OpenGL; I'd have to look that up though. I believe I read somewhere that programming for Vulkan is actually a bit easier than DX12 (they also apparently share a lot of similarities, so if you can program for one you can learn the other quickly), that it wouldn't take much in time or resources to switch over to it as the main API, and that the cost would certainly be worth it, but I forget the reasons given. That's on top of having support for OSX, Linux and Windows 7/8/10. As far as I'm aware, the only advantage DX12 has is that more games support it at the moment, because it was released first.
Posted 8 years, 1 month ago
This video is a pretty good summary of Vulkan: https://www.youtube.com/watch?v=dvioALNs_Bc As he says, I'm pretty much about making things better for everyone rather than better for a specific company's profits and its attempts at a market monopoly. It's the same reason I support FreeSync over G-Sync: I don't want to spend $200 extra on a monitor that's basically the exact same model except for a proprietary chip G-Sync forces on the monitor manufacturers. While I don't favor owning an Nvidia card over an AMD one or vice versa, I feel like AMD did two very good things for gamers and for developer freedom: developing Mantle, which became Vulkan, and making FreeSync a thing to compete with G-Sync. Instead of being locked into supporting a particular corporation that charges them for the right to use its proprietary solution, various companies are free to benefit from these technologies AMD helped develop, at no additional cost.

This is also why I don't support Windows 10, as the whole thing was designed around DX12 to force everyone to support their planned monopoly. I feel like they struck gold with Windows 7, and I have no concerns about continuing to use this OS likely long (many years) after they discontinue security updates for it, unless a superior alternative comes out (I don't feel that way about Windows 10 yet; nothing has convinced me, and if anything there are features that sound like a turnoff. As I said before, if I really feel the need for it one day, I'll install it on a separate HDD).

One thing they didn't mention in the first video is that Vulkan and DX12 are supposed to breathe new life into CPUs new and old by making it possible for the game engine to intelligently spread work across CPU cores, as I recall it. They talk about that a bit in this video: https://www.youtube.com/watch?v=tyJY5oUBbt4 I've seen further explanation of the ways this makes older, less powerful CPUs viable for gaming for years to come, in videos I've watched as well as in articles and forum comments I've found throughout my searching.
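As a plain C++ analogy of that CPU-side win (no real Vulkan or DX12 calls; the struct and names are made up for illustration): older APIs effectively funnel all draw submission through one thread, while Vulkan/DX12 let every core record its own command list in parallel and submit once, roughly like this.

#include <algorithm>
#include <cstdio>
#include <thread>
#include <vector>

struct DrawCommand { int mesh_id; };

int main() {
    const unsigned threads = std::max(1u, std::thread::hardware_concurrency());
    const int total_draws = 100000;
    std::vector<std::vector<DrawCommand>> buffers(threads);  // one per thread

    std::vector<std::thread> workers;
    for (unsigned t = 0; t < threads; ++t) {
        workers.emplace_back([&, t] {
            // Each thread records its slice of draws with no shared lock,
            // which is what per-thread command buffers/pools make possible.
            for (int d = static_cast<int>(t); d < total_draws;
                 d += static_cast<int>(threads))
                buffers[t].push_back({d});
        });
    }
    for (auto& w : workers) w.join();

    size_t recorded = 0;
    for (auto& b : buffers) recorded += b.size();  // "submit" everything at once
    std::printf("recorded %zu draws on %u threads\n", recorded, threads);
    return 0;
}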
Posted 8 years, 1 month ago
Firstly, I know what Vulkan and DX12 do; I don't need videos. The Bethesda thing was just for argument's sake, as they're the first 'AAA' company that popped into my mind. The only inaccuracy I noticed in Twilight Princess is that the music's equalization is a bit messed up with the "Fast" sound option. Turning the audio to the non-fast option fixes that, making the game sound and look completely identical to actual GameCube footage. So I then add some emulator graphics enhancements, turn off the fake bloom via Action Replay, and use ReShade to give it a facelift and... voila! Twilight Princess HD @ 100% emulation speed.
Posted 8 years, 1 month ago
Side-note: Quantum Break is DirectX 11.
Posted 8 years, 1 month ago
I personally felt the first video, at the very least, is worth a watch because of the very sound and important perspective it gives on an almost moral level, but it's your choice. I already knew what the guy was talking about, but I still enjoyed the video and felt it was a good summary of points that are likely going to be among the reasons Vulkan actually does end up defeating DX12, as it should.

Did you actually get DX12 running in Battlefield 1 before? You didn't really make that clear; it seemed like you might have. It should run.

I don't really trust most people when they tell me something like an emulator is accurate compared to the actual hardware, because many times it's incorrect and not at all up to my personal standard. They either don't notice the issues present or are willing to let a lot of things slide. Watching a little GameCube footage for 5 minutes, or even 30 minutes or more depending on the game and what went on in it during that time, often won't be enough to reach a proper conclusion on whether the emulator is accurate. You'd have to try several games of various sorts and play them all for a few hours, games that you previously played or watched being played on the original hardware for the same length of time at the same points in the games, to come to a reasonable conclusion. I guess it's possible that the GameCube emulator is really well put together compared to emulators for a lot of other systems, but I have heavy doubts. They haven't managed it with either the PS2 or N64 emulators in my experience, but maybe they did with GameCube; that's awesome if they did.
Posted 8 years, 1 month ago
Oh, apparently Quantum Break was made to work with DX11 and Windows 7. Neat. :) All I knew for a long time was that it was going to be DX12-exclusive. Looking into it a bit more: "To clarify, the Steam and retail version of @QuantumBreak run on Windows 7 and up (64-bit). #quantumbreak" That tweet was from 10th Aug, 2016, so I guess it was a very recent development. 29th Sep, 2016 is the release date on Steam, so literally yesterday. But the initial release was April 5th, 2016.
Posted 8 years, 1 month ago
Moral level? Why not just an informative, non-biased video? https://youtu.be/jsxn93Wb7vk No, I never saw the option, even after the update. I have a lot of playtime in Twilight Princess on my GameCube and it felt no different on Dolphin; I mentioned the video because I used it as a memory jog. The only game I felt was worth emulating was Twilight Princess, so trying multiple games is out of the question.
Posted 8 years, 1 month ago
It was unbiased and informative, but it still made a moral argument; yes, it's possible to do both. Also, I'm pretty sure I've watched the Linus video recently, or a few of the newer ones related to the new APIs. That's a shame. At least it will work for me, since it worked for the other guy on the same CPU. Okay, as long as it satisfies your needs, I guess. The only way to be sure your opinion is correct is to do such testing, but I don't expect you to, unless you wanted to.
Posted 8 years, 1 month ago
Last thing about the emulators: "You'd have to try several games of various sorts and play them all for a few hours, games that you previously played or watched being played on the original hardware for the same length of time at the same points in the games, to come to a reasonable conclusion." If it takes that much work to find inconsistencies in the emulator, isn't it arguably perfect? If you simply boot up the emulator, load the ROM, and play like a reasonable person, you'd never know you encountered any minor inconsistencies. Therefore, it's accurate enough to fool a person into believing it's perfect, and it need not do any more.
Posted 8 years, 1 month ago
No, it's not arguably perfect, unless your definition of perfection is notable imperfection; some games work well most of the time, and others do not. For me, the differences I've noticed were enough to give the wrong impression of what the game was like, especially when it came to certain sounds, and you do have to be paying attention and have a sufficient recollection of what it's supposed to be like. I'm a reasonable person; sorry if reasonable people can't fit one definition you might have in mind. I have higher standards. But like I said, if the GameCube emulator is as good as you say, that's great, I guess.