mattlawn
Veteran Member
back
Posts: 1,488
| Likes: 490
|
Post by mattlawn on Mar 29, 2020 0:09:44 GMT
I just got that same keyboard love it (click click)
|
|
|
Post by Polaris Seltzeris on Mar 29, 2020 1:24:32 GMT
AMD has never been at competing level with NVIDIA, top of the line NVIDIA cards have always blown AMD/ATI out of the water, it's just that NVIDIA is more expensive except it's always been that way. That was true until RDNA enabled AMD to be competitive against NVIDIA, much like how Ryzen enabled AMD to be competitive against Intel. You're right in that RDNA doesn't have top end products, but AMD's main goal this generation was to compete in the mid to high end market, which RDNA has done well. If RDNA was a failure in that regard, NVIDIA's Super cards would not exist. RDNA 2, however, is supposed to compete at the top end, and the performance is looking very promising with the shown performance of the Xbox Series X and PS5, which uses a modified version of that architecture. NVIDIA is also planning a big leap in performance next generation, and if RDNA 2 isn't supposed to be any good then NVIDIA's next generation would just be another ripoff. AMD has been using their cards on consoles for the past decade, that's nothing new, but nobody considers that to be high end at all. The RTX 2080 already has the technology to support more than what it can already support, by future proofing, it's literally ahead of the technology in software making it already extremely high end, but NVIDIA has engineers that constantly optimize their cards and innovate, so the next generation will be even more of that future proofing. AMD cannot compete with that at all and really RDNA is not impressive when you look at what NVIDIA's cards are actually built for. AMD is just not a company of technological innovation, they do the bare minimum of what they have to do in order to exist and people attribute that as "extreme competitiveness" and "super cool" when they're so far behind.
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Mar 29, 2020 16:01:05 GMT
I don't know if I'm late to the party, but the IBM model M keyboard is too damn loud, Wilee. I had a model M about 5 years back, and got rid of it because it kept both of my roommates up at odd hours in the night. The buckling spring mechanism, and the vintage style, is good for old-timer PC geeks who live alone, are hyper-obsessed with quality, and couldn't care less about style. I've had a logitech G5 pro keyboard ever since, with as many lights as damn times square... And it hasn't failed me once. Resume CPU discussion :pensive:.
|
|
|
Post by Polaris Seltzeris on Mar 29, 2020 18:58:43 GMT
I don't know if I'm late to the party, but the IBM model M keyboard is too damn loud, Wilee. I had a model M about 5 years back, and got rid of it because it kept both of my roommates up at odd hours in the night. The buckling spring mechanism, and the vintage style, is good for old-timer PC geeks who live alone, are hyper-obsessed with quality, and couldn't care less about style. I've had a logitech G5 pro keyboard ever since, with as many lights as damn times square... And it hasn't failed me once. Resume CPU discussion :pensive:. Solution there is to not have roommates. Also IBM's Windows keyboards aren't clicky and still retain the durable quality without the amazing mechanics. Also not sure what you're talking about "couldn't care less about style" when 80s keyboards have every modern keyboard beat when it comes to style, the beige color is a lot more creative than just black with obnoxious lights (because those definitely don't keep anybody up "at odd hours in the night"), the "stylish" modern RGB keyboards which become riddled with hair and dust after just a few days which is the exact reason you want beige peripherals. Weird that you bring up Logitech which is actually the worst company of any peripheral manufacturer, they certainly don't offshore all of their manufacturing to countries like China where they too profit off of very cheap wage slavery (although unlike some other companies they have a company statement against actual slavery, how bold), and their peripherals are very cheap plastic trash which CAN last you years but also can break after a week, kind of a gambling game when you buy a Logitech peripheral. Very low-quality regardless in about every conceivable way.
|
|
square
Veteran Member
Asst. Creative Designer
Posts: 1,294
| Likes: 1,291
|
Post by square on Mar 29, 2020 22:07:26 GMT
I don't know if I'm late to the party, but the IBM model M keyboard is too damn loud, Wilee. I had a model M about 5 years back, and got rid of it because it kept both of my roommates up at odd hours in the night. The buckling spring mechanism, and the vintage style, is good for old-timer PC geeks who live alone, are hyper-obsessed with quality, and couldn't care less about style. I've had a logitech G5 pro keyboard ever since, with as many lights as damn times square... And it hasn't failed me once. Resume CPU discussion :pensive:. Solution there is to not have roommates. Also IBM's Windows keyboards aren't clicky and still retain the durable quality without the amazing mechanics. Also not sure what you're talking about "couldn't care less about style" when 80s keyboards have every modern keyboard beat when it comes to style, the beige color is a lot more creative than just black with obnoxious lights (because those definitely don't keep anybody up "at odd hours in the night"), the "stylish" modern RGB keyboards which become riddled with hair and dust after just a few days which is the exact reason you want beige peripherals. Weird that you bring up Logitech which is actually the worst company of any peripheral manufacturer, they certainly don't offshore all of their manufacturing to countries like China where they too profit off of very cheap wage slavery (although unlike some other companies they have a company statement against actual slavery, how bold), and their peripherals are very cheap plastic trash which CAN last you years but also can break after a week, kind of a gambling game when you buy a Logitech peripheral. Very low-quality regardless in about every conceivable way. ah yes - get rid of your roomates because you want to have a keyboard: at this point you're unreasonable. does the model m not collect dust and hair? and you still realise you can turn off the lights on every rgb keyboard? whatever, while i agree the model m's superb - in both bucking spring, keycaps and durability however in my opinion it doesn't match to actual good companies or seperate part companies (such as pbt keycaps and backplates) since of the customisability, usage and most of all nkro (yes the model m is actually 2kro, it uses a membrane - ps/2 supported nkro but the actual keyboard doesn't and using a usb adapter doesn't work anyways.) and pressure force. you would not believe the difference between a typing key and a ""gaming"" (short actuation, no click/bump) key on either usage. so yes, the model m dominates the typing area however it doesn't in others, which premintex is in. you say every company uses china sourced materials and labour which is presumptuous of all keyboard manufactorers (you never had any lmao) and blaming people who like personality in their computer builds to make it their own which i find steps from shilling to being rude. yeah, you can like model ms but premintex is not you and you are not him. also onto intel & amd: - amd is superior to intel at current day in peformance (almost all modern applications make use of multi-threaded workloads especially) and dominates over intel in high end (both in price and peformance) where it does more peformance in both single core and multi core and while being MUCH much cheaper. - while yes, nvidia is superior to amd in top end peformance, it is much more reasonable than nvidia in the middle-high end market which is where most people will be buying from. i don't understand your point of "future proofing" nvidias cards like amd doesn't do that when they do? and yeah, amd has a much smaller software engineer team but they're managing to put out more updates than nvidia and in my opinion (i own a gtx card now, getting amd soon) adrenalin offers much more features than nvidia and it's shitty control panel and i'd take it over that. more peformance while being cheaper than gtx 1060-rtx 2070 super is a no brainer and you're trying to defend a company with "high end stuff is better" to a person on a budget. if nvidia lowered prices drastically in the soon times, sure, nvidia would be better but it's not. it's not just peformance my man, price and other things come into view. but you're probably going to respond with "chinese plastic bla bla bla. intel better bla bla bla. nvidia better bla bla bla." and that's fine but to insult people based off of their preferences is a whole new territory dude.
|
|
|
Post by Polaris Seltzeris on Mar 30, 2020 0:23:46 GMT
Solution there is to not have roommates. Also IBM's Windows keyboards aren't clicky and still retain the durable quality without the amazing mechanics. Also not sure what you're talking about "couldn't care less about style" when 80s keyboards have every modern keyboard beat when it comes to style, the beige color is a lot more creative than just black with obnoxious lights (because those definitely don't keep anybody up "at odd hours in the night"), the "stylish" modern RGB keyboards which become riddled with hair and dust after just a few days which is the exact reason you want beige peripherals. Weird that you bring up Logitech which is actually the worst company of any peripheral manufacturer, they certainly don't offshore all of their manufacturing to countries like China where they too profit off of very cheap wage slavery (although unlike some other companies they have a company statement against actual slavery, how bold), and their peripherals are very cheap plastic trash which CAN last you years but also can break after a week, kind of a gambling game when you buy a Logitech peripheral. Very low-quality regardless in about every conceivable way. ah yes - get rid of your roomates because you want to have a keyboard: at this point you're unreasonable. does the model m not collect dust and hair? and you still realise you can turn off the lights on every rgb keyboard? whatever, while i agree the model m's superb - in both bucking spring, keycaps and durability however in my opinion it doesn't match to actual good companies or seperate part companies (such as pbt keycaps and backplates) since of the customisability, usage and most of all nkro (yes the model m is actually 2kro, it uses a membrane - ps/2 supported nkro but the actual keyboard doesn't and using a usb adapter doesn't work anyways.) and pressure force. you would not believe the difference between a typing key and a ""gaming"" (short actuation, no click/bump) key on either usage. so yes, the model m dominates the typing area however it doesn't in others, which premintex is in. you say every company uses china sourced materials and labour which is presumptuous of all keyboard manufactorers (you never had any lmao) and blaming people who like personality in their computer builds to make it their own which i find steps from shilling to being rude. yeah, you can like model ms but premintex is not you and you are not him. also onto intel & amd: - amd is superior to intel at current day in peformance (almost all modern applications make use of multi-threaded workloads especially) and dominates over intel in high end (both in price and peformance) where it does more peformance in both single core and multi core and while being MUCH much cheaper. - while yes, nvidia is superior to amd in top end peformance, it is much more reasonable than nvidia in the middle-high end market which is where most people will be buying from. i don't understand your point of "future proofing" nvidias cards like amd doesn't do that when they do? and yeah, amd has a much smaller software engineer team but they're managing to put out more updates than nvidia and in my opinion (i own a gtx card now, getting amd soon) adrenalin offers much more features than nvidia and it's shitty control panel and i'd take it over that. more peformance while being cheaper than gtx 1060-rtx 2070 super is a no brainer and you're trying to defend a company with "high end stuff is better" to a person on a budget. if nvidia lowered prices drastically in the soon times, sure, nvidia would be better but it's not. it's not just peformance my man, price and other things come into view. but you're probably going to respond with "chinese plastic bla bla bla. intel better bla bla bla. nvidia better bla bla bla." and that's fine but to insult people based off of their preferences is a whole new territory dude. Nobody said "get rid of your roommates", and I was not making an actual argument there, however, I would much rather take a keyboard over having roommates, I'd prefer not having roommates in general because nobody wants that. Beige peripherals do a good job of not highlighting dust and crap which makes it look dirty, which black peripherals do. I didn't say dust magically stopped existing on beige peripherals, but the peripherals look nicer as a result. Not sure what you mean by "actual good companies", there's a reason why even bigger companies that you were praising earlier like Corsair and Razer receive heavy scrutiny which IBM's old peripherals do not get. You mention separate part companies, but for the amount of effort somebody would have to go to to basically assemble their own keyboard they might as well just throw in a PS/2 port and call it a day with a nice IBM keyboard, PS/2 works very well for mice and keyboards, arguably better than USB. Or use an adapter which isn't even that bad. Not sure what "you never had any" means, do you think I'm typing this using the Windows on-screen keyboard and using a touchscreen? It is rationally presumptuous of all manufacturers, but not necessarily only China, that's just one key manufacturing country. It could be Vietnam, Taiwan, I don't care, it's all the same wage slavery and outsourcing. IBM simultaneously manufactured in plants in the working-class heartland of America and in closer trading partners that don't employ wage slavery, like Mexico. Corsair outsources a lot of their sales, marketing, assembly, test, packaging, and distribution to Asia, and that's just to name one, I'm more than happy to look up several of the biggest peripheral manufacturers. This isn't necessarily just a trend in peripherals, this is something that affects every product (I'm more likely to appreciate an American car from the 70s built by a guy with a pension and living wage and a union in a Michigan factory than suicide shops that modern carmakers employ in Asia), and in this case the IBM peripherals win. I don't honestly blame people for buying and using the peripherals, the real blame goes to corporate America for ending craftsmanship and pushing crap products. Some of your points about Intel and AMD are dead wrong, Intel is leading in single-threaded performance and your point about "almost all modern applications" is 100% baseless and has no home in reality. Multi-threaded performance is not utilized by operating systems and software as much as you think it is, and if it was then people would find that the Intel CPUs are perfectly satisfactory in that case because they are being underused in the vast majority of user cases. You want price-to-performance, buy a GTX 1050 Ti. About as cheap as older GTX and still has powerful performance without being as expensive as even a 1060. That's what I use and it works for literally everything I can possibly do perfectly, and I'm certainly not your average end user. Nobody is insulting anybody, but ok.
|
|
tozzit
Veteran Member
Posts: 2,329
| Likes: 1,709
|
Post by tozzit on Mar 30, 2020 3:10:07 GMT
new bad old good
|
|
square
Veteran Member
Asst. Creative Designer
Posts: 1,294
| Likes: 1,291
|
Post by square on Mar 30, 2020 12:17:19 GMT
ah yes - get rid of your roomates because you want to have a keyboard: at this point you're unreasonable. does the model m not collect dust and hair? and you still realise you can turn off the lights on every rgb keyboard? whatever, while i agree the model m's superb - in both bucking spring, keycaps and durability however in my opinion it doesn't match to actual good companies or seperate part companies (such as pbt keycaps and backplates) since of the customisability, usage and most of all nkro (yes the model m is actually 2kro, it uses a membrane - ps/2 supported nkro but the actual keyboard doesn't and using a usb adapter doesn't work anyways.) and pressure force. you would not believe the difference between a typing key and a ""gaming"" (short actuation, no click/bump) key on either usage. so yes, the model m dominates the typing area however it doesn't in others, which premintex is in. you say every company uses china sourced materials and labour which is presumptuous of all keyboard manufactorers (you never had any lmao) and blaming people who like personality in their computer builds to make it their own which i find steps from shilling to being rude. yeah, you can like model ms but premintex is not you and you are not him. also onto intel & amd: - amd is superior to intel at current day in peformance (almost all modern applications make use of multi-threaded workloads especially) and dominates over intel in high end (both in price and peformance) where it does more peformance in both single core and multi core and while being MUCH much cheaper. - while yes, nvidia is superior to amd in top end peformance, it is much more reasonable than nvidia in the middle-high end market which is where most people will be buying from. i don't understand your point of "future proofing" nvidias cards like amd doesn't do that when they do? and yeah, amd has a much smaller software engineer team but they're managing to put out more updates than nvidia and in my opinion (i own a gtx card now, getting amd soon) adrenalin offers much more features than nvidia and it's shitty control panel and i'd take it over that. more peformance while being cheaper than gtx 1060-rtx 2070 super is a no brainer and you're trying to defend a company with "high end stuff is better" to a person on a budget. if nvidia lowered prices drastically in the soon times, sure, nvidia would be better but it's not. it's not just peformance my man, price and other things come into view. but you're probably going to respond with "chinese plastic bla bla bla. intel better bla bla bla. nvidia better bla bla bla." and that's fine but to insult people based off of their preferences is a whole new territory dude. Nobody said "get rid of your roommates", and I was not making an actual argument there, however, I would much rather take a keyboard over having roommates, I'd prefer not having roommates in general because nobody wants that. Beige peripherals do a good job of not highlighting dust and crap which makes it look dirty, which black peripherals do. I didn't say dust magically stopped existing on beige peripherals, but the peripherals look nicer as a result. Not sure what you mean by "actual good companies", there's a reason why even bigger companies that you were praising earlier like Corsair and Razer receive heavy scrutiny which IBM's old peripherals do not get. You mention separate part companies, but for the amount of effort somebody would have to go to to basically assemble their own keyboard they might as well just throw in a PS/2 port and call it a day with a nice IBM keyboard, PS/2 works very well for mice and keyboards, arguably better than USB. Or use an adapter which isn't even that bad. Not sure what "you never had any" means, do you think I'm typing this using the Windows on-screen keyboard and using a touchscreen? It is rationally presumptuous of all manufacturers, but not necessarily only China, that's just one key manufacturing country. It could be Vietnam, Taiwan, I don't care, it's all the same wage slavery and outsourcing. IBM simultaneously manufactured in plants in the working-class heartland of America and in closer trading partners that don't employ wage slavery, like Mexico. Corsair outsources a lot of their sales, marketing, assembly, test, packaging, and distribution to Asia, and that's just to name one, I'm more than happy to look up several of the biggest peripheral manufacturers. This isn't necessarily just a trend in peripherals, this is something that affects every product (I'm more likely to appreciate an American car from the 70s built by a guy with a pension and living wage and a union in a Michigan factory than suicide shops that modern carmakers employ in Asia), and in this case the IBM peripherals win. I don't honestly blame people for buying and using the peripherals, the real blame goes to corporate America for ending craftsmanship and pushing crap products. Some of your points about Intel and AMD are dead wrong, Intel is leading in single-threaded performance and your point about "almost all modern applications" is 100% baseless and has no home in reality. Multi-threaded performance is not utilized by operating systems and software as much as you think it is, and if it was then people would find that the Intel CPUs are perfectly satisfactory in that case because they are being underused in the vast majority of user cases. You want price-to-performance, buy a GTX 1050 Ti. About as cheap as older GTX and still has powerful performance without being as expensive as even a 1060. That's what I use and it works for literally everything I can possibly do perfectly, and I'm certainly not your average end user. Nobody is insulting anybody, but ok. "solution there is not to have roommates" in places here in the uk, spending much more to get individual housing is a lot expensive than just getting another keyboard and i'd go insane with how loud the model m is. not all keyboards are black, either. i didn't praise either and didn't even support razer in the first place (not sure where you got that) and corsair does not get much scrutiny in the first place and the only real drama is between corsair and razer, which is to be expected. it takes literally like 1-3 hours to make a keyboard and to do so allows you to have the best quality parts and make it your own, i'd much cherish something i'd made rather than something someone else has made. ps/2 ports are good and all, but most modern motherboards don't support it anyways (as i said previously) and adapters are just not the best and - whoops - unplugged my ps/2 port and now my computer is frozen and my ps/2 pins are bent. the fact is, ps2 has the same amount of detriments as benefits - especially so when the "better than usb" because of the protocol is incorrect as the delay difference is not noticeable in day to day use. i meant to say you've probably never had any modern keyboard that you're arguing against. i do agree that IBM wins in this factor however we can't do much as the consumer against this, especially with how large corsair is, and so i'd rather have the higher quality keyboard first of all. and yet for high end cpu usage, the difference in single core speeds between intel and amd is none - if not amd wins? the extra cores and threads are much more useful in futureproofing for new games, programs, etc. and if we're talking about high end cpu usage, we're talking about rendering and film editing. what i meant to say was that most applications use multiple threads which goes hand in hand with parallelism, where they can use multiple cores which means with the more threads and cores it would me much more useful to those who need it. and yet a 1050 ti isn't the gpu people want? it can't run most games on high-ultra and for those wanting a larger budget a 1050 ti isn't going to cut it. it works for you, won't work for all. apart from your previous remarks but alright.
|
|
|
Post by Polaris Seltzeris on Mar 30, 2020 17:42:05 GMT
Nobody said "get rid of your roommates", and I was not making an actual argument there, however, I would much rather take a keyboard over having roommates, I'd prefer not having roommates in general because nobody wants that. Beige peripherals do a good job of not highlighting dust and crap which makes it look dirty, which black peripherals do. I didn't say dust magically stopped existing on beige peripherals, but the peripherals look nicer as a result. Not sure what you mean by "actual good companies", there's a reason why even bigger companies that you were praising earlier like Corsair and Razer receive heavy scrutiny which IBM's old peripherals do not get. You mention separate part companies, but for the amount of effort somebody would have to go to to basically assemble their own keyboard they might as well just throw in a PS/2 port and call it a day with a nice IBM keyboard, PS/2 works very well for mice and keyboards, arguably better than USB. Or use an adapter which isn't even that bad. Not sure what "you never had any" means, do you think I'm typing this using the Windows on-screen keyboard and using a touchscreen? It is rationally presumptuous of all manufacturers, but not necessarily only China, that's just one key manufacturing country. It could be Vietnam, Taiwan, I don't care, it's all the same wage slavery and outsourcing. IBM simultaneously manufactured in plants in the working-class heartland of America and in closer trading partners that don't employ wage slavery, like Mexico. Corsair outsources a lot of their sales, marketing, assembly, test, packaging, and distribution to Asia, and that's just to name one, I'm more than happy to look up several of the biggest peripheral manufacturers. This isn't necessarily just a trend in peripherals, this is something that affects every product (I'm more likely to appreciate an American car from the 70s built by a guy with a pension and living wage and a union in a Michigan factory than suicide shops that modern carmakers employ in Asia), and in this case the IBM peripherals win. I don't honestly blame people for buying and using the peripherals, the real blame goes to corporate America for ending craftsmanship and pushing crap products. Some of your points about Intel and AMD are dead wrong, Intel is leading in single-threaded performance and your point about "almost all modern applications" is 100% baseless and has no home in reality. Multi-threaded performance is not utilized by operating systems and software as much as you think it is, and if it was then people would find that the Intel CPUs are perfectly satisfactory in that case because they are being underused in the vast majority of user cases. You want price-to-performance, buy a GTX 1050 Ti. About as cheap as older GTX and still has powerful performance without being as expensive as even a 1060. That's what I use and it works for literally everything I can possibly do perfectly, and I'm certainly not your average end user. Nobody is insulting anybody, but ok. "solution there is not to have roommates" in places here in the uk, spending much more to get individual housing is a lot expensive than just getting another keyboard and i'd go insane with how loud the model m is. not all keyboards are black, either. i didn't praise either and didn't even support razer in the first place (not sure where you got that) and corsair does not get much scrutiny in the first place and the only real drama is between corsair and razer, which is to be expected. it takes literally like 1-3 hours to make a keyboard and to do so allows you to have the best quality parts and make it your own, i'd much cherish something i'd made rather than something someone else has made. ps/2 ports are good and all, but most modern motherboards don't support it anyways (as i said previously) and adapters are just not the best and - whoops - unplugged my ps/2 port and now my computer is frozen and my ps/2 pins are bent. the fact is, ps2 has the same amount of detriments as benefits - especially so when the "better than usb" because of the protocol is incorrect as the delay difference is not noticeable in day to day use. i meant to say you've probably never had any modern keyboard that you're arguing against. i do agree that IBM wins in this factor however we can't do much as the consumer against this, especially with how large corsair is, and so i'd rather have the higher quality keyboard first of all. and yet for high end cpu usage, the difference in single core speeds between intel and amd is none - if not amd wins? the extra cores and threads are much more useful in futureproofing for new games, programs, etc. and if we're talking about high end cpu usage, we're talking about rendering and film editing. what i meant to say was that most applications use multiple threads which goes hand in hand with parallelism, where they can use multiple cores which means with the more threads and cores it would me much more useful to those who need it. and yet a 1050 ti isn't the gpu people want? it can't run most games on high-ultra and for those wanting a larger budget a 1050 ti isn't going to cut it. it works for you, won't work for all. apart from your previous remarks but alright. 99% of computer users aren't going to make their own peripherals regardless of how much time it takes, just being realistic here. Literally no clue where you got your facts about PS/2 because that is not how it works at all, you don't just unplug it and the computer freezes with bent pins. The chance of a PS/2 pin breaking is just as likely as the chance of a USB pin breaking, it's not like USB pins are any more durable, they can and will get used up like anything else. The fact is that for keyboards and mice, PS/2 is what it was made for and the advantages outweigh any of the disadvantages. PS/2 does have the N-key rollover which you were claiming it didn't, it does have low latency, a lot of motherboards actually do still have it, GOOD adapters do work very well but people often don't know which ones to buy, and it takes away the general disadvantages of having to use USB such as not being able to use it instantly. The low-latency of PS/2 actually is noticeable for things such as gaming which you are always bringing up. I actually have, not sure how there's "not much you can do" when you have the ability to select different manufacturers, and suggesting that the IBM Model M is lower quality than a Corsair keyboard is fairly ridiculous by the objective standards that we've gone over. This futureproofing is very specific towards certain things and I'd say just Intel's investments in quantum computing are a lot more "futureproofing" than just throwing in more cores. Plenty enough games simply don't use parallelism enough for just adding a crap ton of cores to have an effect other than making the CPU be able to fry eggs, that goes for actual operating systems (Windows 10 in itself is a joke when it comes to this) and majority of applications. Like I've said the "add as many cores as possible" AMD strategy works until it doesn't become profitable and people fail to notice the real performance difference which the benchmarks cannot account for. What's really happening is Intel has already realized that Moore's law is no longer realistic and hinders innovation, which is why they are doing the research along with the other tech giants into quantum computing. There will be a time when quantum computing is the new demand, and that's where AMD will have zero infrastructure and will fall because they will have been just exploiting the remains of Moore's law, acting like die size is relevant in 2020 and that systems and applications utilize multithreading properly to take advantage of the performance difference, if that performance difference was as impactful as is suggested by AMD then it either would've already been in use by Intel or they would actually use things other than benchmarks to prove the performance difference, which they absolutely fucking would in all of their sales and advertising efforts if it was true. Yes, people with a larger budget will want more than a 1050 Ti, I'm just saying that even if you're wanting to throw money away you don't necessarily need more power if you don't use it. I'm not going to embrace retarded conspicuous consumption of hardware that would be more useful to the people that need it. Also not sure what "people" you are referring to, I don't think the entire GPU market is just gamers and suggesting this dumbs down the conversation a lot. Saying "most games" won't work with a 1050 Ti on high-ultra is baseless, but also those settings are there for a reason and if used properly people can play a game perfectly satisfactory without needing to spend more unnecessary money. Regardless of that, there are better GTX and RTX GPUs than the 1050 Ti if you for whatever reason want more than the kind of power that a 1050 Ti has.
|
|
square
Veteran Member
Asst. Creative Designer
Posts: 1,294
| Likes: 1,291
|
Post by square on Mar 30, 2020 18:39:00 GMT
"solution there is not to have roommates" in places here in the uk, spending much more to get individual housing is a lot expensive than just getting another keyboard and i'd go insane with how loud the model m is. not all keyboards are black, either. i didn't praise either and didn't even support razer in the first place (not sure where you got that) and corsair does not get much scrutiny in the first place and the only real drama is between corsair and razer, which is to be expected. it takes literally like 1-3 hours to make a keyboard and to do so allows you to have the best quality parts and make it your own, i'd much cherish something i'd made rather than something someone else has made. ps/2 ports are good and all, but most modern motherboards don't support it anyways (as i said previously) and adapters are just not the best and - whoops - unplugged my ps/2 port and now my computer is frozen and my ps/2 pins are bent. the fact is, ps2 has the same amount of detriments as benefits - especially so when the "better than usb" because of the protocol is incorrect as the delay difference is not noticeable in day to day use. i meant to say you've probably never had any modern keyboard that you're arguing against. i do agree that IBM wins in this factor however we can't do much as the consumer against this, especially with how large corsair is, and so i'd rather have the higher quality keyboard first of all. and yet for high end cpu usage, the difference in single core speeds between intel and amd is none - if not amd wins? the extra cores and threads are much more useful in futureproofing for new games, programs, etc. and if we're talking about high end cpu usage, we're talking about rendering and film editing. what i meant to say was that most applications use multiple threads which goes hand in hand with parallelism, where they can use multiple cores which means with the more threads and cores it would me much more useful to those who need it. and yet a 1050 ti isn't the gpu people want? it can't run most games on high-ultra and for those wanting a larger budget a 1050 ti isn't going to cut it. it works for you, won't work for all. apart from your previous remarks but alright. 99% of computer users aren't going to make their own peripherals regardless of how much time it takes, just being realistic here. Literally no clue where you got your facts about PS/2 because that is not how it works at all, you don't just unplug it and the computer freezes with bent pins. The chance of a PS/2 pin breaking is just as likely as the chance of a USB pin breaking, it's not like USB pins are any more durable, they can and will get used up like anything else. The fact is that for keyboards and mice, PS/2 is what it was made for and the advantages outweigh any of the disadvantages. PS/2 does have the N-key rollover which you were claiming it didn't, it does have low latency, a lot of motherboards actually do still have it, GOOD adapters do work very well but people often don't know which ones to buy, and it takes away the general disadvantages of having to use USB such as not being able to use it instantly. The low-latency of PS/2 actually is noticeable for things such as gaming which you are always bringing up. I actually have, not sure how there's "not much you can do" when you have the ability to select different manufacturers, and suggesting that the IBM Model M is lower quality than a Corsair keyboard is fairly ridiculous by the objective standards that we've gone over. This futureproofing is very specific towards certain things and I'd say just Intel's investments in quantum computing are a lot more "futureproofing" than just throwing in more cores. Plenty enough games simply don't use parallelism enough for just adding a crap ton of cores to have an effect other than making the CPU be able to fry eggs, that goes for actual operating systems (Windows 10 in itself is a joke when it comes to this) and majority of applications. Like I've said the "add as many cores as possible" AMD strategy works until it doesn't become profitable and people fail to notice the real performance difference which the benchmarks cannot account for. What's really happening is Intel has already realized that Moore's law is no longer realistic and hinders innovation, which is why they are doing the research along with the other tech giants into quantum computing. There will be a time when quantum computing is the new demand, and that's where AMD will have zero infrastructure and will fall because they will have been just exploiting the remains of Moore's law, acting like die size is relevant in 2020 and that systems and applications utilize multithreading properly to take advantage of the performance difference, if that performance difference was as impactful as is suggested by AMD then it either would've already been in use by Intel or they would actually use things other than benchmarks to prove the performance difference, which they absolutely fucking would in all of their sales and advertising efforts if it was true. Yes, people with a larger budget will want more than a 1050 Ti, I'm just saying that even if you're wanting to throw money away you don't necessarily need more power if you don't use it. I'm not going to embrace retarded conspicuous consumption of hardware that would be more useful to the people that need it. Also not sure what "people" you are referring to, I don't think the entire GPU market is just gamers and suggesting this dumbs down the conversation a lot. Saying "most games" won't work with a 1050 Ti on high-ultra is baseless, but also those settings are there for a reason and if used properly people can play a game perfectly satisfactory without needing to spend more unnecessary money. Regardless of that, there are better GTX and RTX GPUs than the 1050 Ti if you for whatever reason want more than the kind of power that a 1050 Ti has. oh lord you're still going making your own peripherals was an example that i made and the fact is, with older hardware, yes it does freeze computers. ps/2 was not meant to be hotswappable in any means so unlike usb, its much less convenient in travel. me saying that it will bend pins easily is an overexaggeration however it does happen, much more than usb pins, and out of my experience i've had many ps2 bent pins while trying to transport my keyboard - may not happen to you but i'd thought i'd bring it up. while i do see some motherboard with ps/2 pins, usb is become much more common and i wouldn't rely on it for futureproofing (since you're talking about that so much) and to use a dongle may seem alright but i'm not the best with dongles. you completely disregarded what i said about there is no noticeable difference in latency. yes, ps/2 interrupts the cpu in it's protocol which is faster than usb but in day to day use - it is not noticeable (as i said) and i don't know how you're saying it's noticeable unless you game 24/7. i was talking about gaming because of the actual keys, where buckling springs are more heavier than cherry quick keys or red keys which can be typed on much easier. never once did i say ibm is lower quality but alright who the fuck cares about quantum processing lmao? you're bringing this up on a discussion about regular cpus and you do know that amd doesn't just throw in cores? and yes, cpu intensive programs do try and optimise cores and threads in order to run better and if you're not using those cores, it won't increase in temperature? intel also has really high temperatures and tdps at large cpus so i don't see your point here. ah yes, explain how 3990>2990 and so on and how it dominates intels just because "it has more cores" (https://www.youtube.com/watch?v=6YbcgiYja3w and more) and dude, it's fucking 2020 explain to me how your cpu will last until quantum processing is mainstream if you're getting good gpus you're obviously using it for a reason - whether it'd be gaming or rendering. " I'm just saying that even if you're wanting to throw money away you don't necessarily need more power if you don't use it" what? this whole time i've been talking about how if you want more power to do more intensive shit, you don't have to spend as much getting nvidia for the mean time. if you don't need power, just don't get a new gpu lmao? "embrace retarded conspicuous consumption of hardware that would be more useful to the people that need it" nice alliteration, see above on how people who need power should get a better gpu and apparently playing games is a "retarded consumption of hardware" - nice one. "Saying "most games" won't work with a 1050 Ti on high-ultra is baseless, but also those settings are there for a reason" it's not baseless, i have a 1050 ti and it can't do shit on high-ultra modern games and people who tell you that you can run ultra on modern games and get over 30-40 fps are just delusional. " the entire GPU market is just gamers" never said it was, i was directing this towards premintex, who is. "people can play a game perfectly satisfactory without needing to spend more unnecessary money." yes, i never said you had to upgrade. however, for people who want to for the upcoming years of game development or software development - go ahead! "there are better GTX and RTX GPUs than the 1050 Ti if you for whatever reason want more than the kind of power that a 1050 Ti has" if you want a gtx, get a gtx. if you want an rtx, get an rtx. i'm just saying that amd have great options available for the price which can probably beat nvidia unless you are streaming and want nvidia's codec. again dude, you're not everybody and just because you don't play high end games who want it to look nice at a reasonable framerate doesn't mean you have to push your opinion on how it's a waste of money
|
|
|
Post by Polaris Seltzeris on Mar 30, 2020 20:28:44 GMT
99% of computer users aren't going to make their own peripherals regardless of how much time it takes, just being realistic here. Literally no clue where you got your facts about PS/2 because that is not how it works at all, you don't just unplug it and the computer freezes with bent pins. The chance of a PS/2 pin breaking is just as likely as the chance of a USB pin breaking, it's not like USB pins are any more durable, they can and will get used up like anything else. The fact is that for keyboards and mice, PS/2 is what it was made for and the advantages outweigh any of the disadvantages. PS/2 does have the N-key rollover which you were claiming it didn't, it does have low latency, a lot of motherboards actually do still have it, GOOD adapters do work very well but people often don't know which ones to buy, and it takes away the general disadvantages of having to use USB such as not being able to use it instantly. The low-latency of PS/2 actually is noticeable for things such as gaming which you are always bringing up. I actually have, not sure how there's "not much you can do" when you have the ability to select different manufacturers, and suggesting that the IBM Model M is lower quality than a Corsair keyboard is fairly ridiculous by the objective standards that we've gone over. This futureproofing is very specific towards certain things and I'd say just Intel's investments in quantum computing are a lot more "futureproofing" than just throwing in more cores. Plenty enough games simply don't use parallelism enough for just adding a crap ton of cores to have an effect other than making the CPU be able to fry eggs, that goes for actual operating systems (Windows 10 in itself is a joke when it comes to this) and majority of applications. Like I've said the "add as many cores as possible" AMD strategy works until it doesn't become profitable and people fail to notice the real performance difference which the benchmarks cannot account for. What's really happening is Intel has already realized that Moore's law is no longer realistic and hinders innovation, which is why they are doing the research along with the other tech giants into quantum computing. There will be a time when quantum computing is the new demand, and that's where AMD will have zero infrastructure and will fall because they will have been just exploiting the remains of Moore's law, acting like die size is relevant in 2020 and that systems and applications utilize multithreading properly to take advantage of the performance difference, if that performance difference was as impactful as is suggested by AMD then it either would've already been in use by Intel or they would actually use things other than benchmarks to prove the performance difference, which they absolutely fucking would in all of their sales and advertising efforts if it was true. Yes, people with a larger budget will want more than a 1050 Ti, I'm just saying that even if you're wanting to throw money away you don't necessarily need more power if you don't use it. I'm not going to embrace retarded conspicuous consumption of hardware that would be more useful to the people that need it. Also not sure what "people" you are referring to, I don't think the entire GPU market is just gamers and suggesting this dumbs down the conversation a lot. Saying "most games" won't work with a 1050 Ti on high-ultra is baseless, but also those settings are there for a reason and if used properly people can play a game perfectly satisfactory without needing to spend more unnecessary money. Regardless of that, there are better GTX and RTX GPUs than the 1050 Ti if you for whatever reason want more than the kind of power that a 1050 Ti has. oh lord you're still going making your own peripherals was an example that i made and the fact is, with older hardware, yes it does freeze computers. ps/2 was not meant to be hotswappable in any means so unlike usb, its much less convenient in travel. me saying that it will bend pins easily is an overexaggeration however it does happen, much more than usb pins, and out of my experience i've had many ps2 bent pins while trying to transport my keyboard - may not happen to you but i'd thought i'd bring it up. while i do see some motherboard with ps/2 pins, usb is become much more common and i wouldn't rely on it for futureproofing (since you're talking about that so much) and to use a dongle may seem alright but i'm not the best with dongles. you completely disregarded what i said about there is no noticeable difference in latency. yes, ps/2 interrupts the cpu in it's protocol which is faster than usb but in day to day use - it is not noticeable (as i said) and i don't know how you're saying it's noticeable unless you game 24/7. i was talking about gaming because of the actual keys, where buckling springs are more heavier than cherry quick keys or red keys which can be typed on much easier. never once did i say ibm is lower quality but alright who the fuck cares about quantum processing lmao? you're bringing this up on a discussion about regular cpus and you do know that amd doesn't just throw in cores? and yes, cpu intensive programs do try and optimise cores and threads in order to run better and if you're not using those cores, it won't increase in temperature? intel also has really high temperatures and tdps at large cpus so i don't see your point here. ah yes, explain how 3990>2990 and so on and how it dominates intels just because "it has more cores" (https://www.youtube.com/watch?v=6YbcgiYja3w and more) and dude, it's fucking 2020 explain to me how your cpu will last until quantum processing is mainstream if you're getting good gpus you're obviously using it for a reason - whether it'd be gaming or rendering. " I'm just saying that even if you're wanting to throw money away you don't necessarily need more power if you don't use it" what? this whole time i've been talking about how if you want more power to do more intensive shit, you don't have to spend as much getting nvidia for the mean time. if you don't need power, just don't get a new gpu lmao? "embrace retarded conspicuous consumption of hardware that would be more useful to the people that need it" nice alliteration, see above on how people who need power should get a better gpu and apparently playing games is a "retarded consumption of hardware" - nice one. "Saying "most games" won't work with a 1050 Ti on high-ultra is baseless, but also those settings are there for a reason" it's not baseless, i have a 1050 ti and it can't do shit on high-ultra modern games and people who tell you that you can run ultra on modern games and get over 30-40 fps are just delusional. " the entire GPU market is just gamers" never said it was, i was directing this towards premintex, who is. "people can play a game perfectly satisfactory without needing to spend more unnecessary money." yes, i never said you had to upgrade. however, for people who want to for the upcoming years of game development or software development - go ahead! "there are better GTX and RTX GPUs than the 1050 Ti if you for whatever reason want more than the kind of power that a 1050 Ti has" if you want a gtx, get a gtx. if you want an rtx, get an rtx. i'm just saying that amd have great options available for the price which can probably beat nvidia unless you are streaming and want nvidia's codec. again dude, you're not everybody and just because you don't play high end games who want it to look nice at a reasonable framerate doesn't mean you have to push your opinion on how it's a waste of money Since when are we talking about older hardware? With newer hardware, either there will be a PS/2 port which works for a keyboard and mouse better than a USB port will or you can at least use a correct PS/2 to USB adapter which works just fine. If your argument is that your fingers can't press buckling-spring keys then at least PS/2 will make up for it. It's just a matter of getting used to an actually good mechanism over the usual boring cherry mechanism which breaks easily and has none of that fine quality and style. You stated that you'd rather have the "higher quality keyboard", implying that IBM is lesser quality which is frankly ridiculous. Yes, who cares about actual technological innovation. For somebody trying to make the point that Intel isn't doing enough, you're simultaneously saying "who the fuck cares" to one of the biggest technological and scientific innovations in modern human history. AMD is just incrementally going down the path of Moore's law, that's zero innovation but allows them to boast before they get destroyed later on when that inevitably stops working. Quantum processing isn't a matter of if it's a matter of when, and AMD has no infrastructure there, the future is going to be all about Intel and the other companies actually developing the technology. "Explain to me how your cpu will last" is kind of ridiculous when computer companies are still selling laptops with Core i5s and i7s made several years ago, and yet are entirely fast enough for what anybody needs along with whatever NVIDIA GPU they toss in there. Right now Intel is still profiting off of the CPUs that they made *years ago*, aside from idiots buying macbooks with Core i9s that thermal throttle and people building their own computers with Intel, we haven't even gotten to the point where new-generation Intel is in action. Meanwhile AMD is only capable of boasting about their new generations because their old generations suck even more ass. I'll admit that it's a bad thing that Intel CPUs from like 6 years ago are still perfectly viable, but that's only because of the anti-innovation that AMD is contributing to right now with Moore's law which Intel is trying to *avoid*. There is no "futureproofing", there is no future for the CPUs that we're using, we're using the same technology from 20 years ago, just with the incremental improvements. Intel is working on the innovation, AMD is not. You completely misunderstood the point what I was making. My point is that people with huge budgets that ARE NOT USING THE HIGHER POWER can still not waste that money on a powerful GPU that they won't even utilize when a 1050 Ti is perfectly viable for whatever the hell they are doing, and yes gaming and rendering actually does fall into the scope of what the 1050 Ti can handle, it's not an iGPU and for some reason people like to diss any CPU or GPU that isn't the brand new high-end thing as if it's already 15 years old and can only work Microsoft Office when it can realistically still run higher-end things. Conspicuous consumption is the term to use when people waste money on things they don't need just to be able to flaunt it. Not only do I not know what you're referring to when you say "high-ultra modern games", but there are endless ways to configure GPU performance even just putting aside the game's own graphics configurations. I'm not going to troubleshoot your computer but if other people are able to do what you can't do with the same GPU then they probably aren't just lying. If people are actually using software which demands more than whatever GPU they already have, then I'm not saying it's a waste of money to upgrade, you're misunderstanding my point again. I'm saying that people should not waste money on hardware that they WON'T use, people don't need an RTX 2080 to run a game believe it or not, there are people out there that actually have a realistic demand for an RTX 2080.
|
|
square
Veteran Member
Asst. Creative Designer
Posts: 1,294
| Likes: 1,291
|
Post by square on Mar 30, 2020 21:10:58 GMT
oh lord you're still going making your own peripherals was an example that i made and the fact is, with older hardware, yes it does freeze computers. ps/2 was not meant to be hotswappable in any means so unlike usb, its much less convenient in travel. me saying that it will bend pins easily is an overexaggeration however it does happen, much more than usb pins, and out of my experience i've had many ps2 bent pins while trying to transport my keyboard - may not happen to you but i'd thought i'd bring it up. while i do see some motherboard with ps/2 pins, usb is become much more common and i wouldn't rely on it for futureproofing (since you're talking about that so much) and to use a dongle may seem alright but i'm not the best with dongles. you completely disregarded what i said about there is no noticeable difference in latency. yes, ps/2 interrupts the cpu in it's protocol which is faster than usb but in day to day use - it is not noticeable (as i said) and i don't know how you're saying it's noticeable unless you game 24/7. i was talking about gaming because of the actual keys, where buckling springs are more heavier than cherry quick keys or red keys which can be typed on much easier. never once did i say ibm is lower quality but alright who the fuck cares about quantum processing lmao? you're bringing this up on a discussion about regular cpus and you do know that amd doesn't just throw in cores? and yes, cpu intensive programs do try and optimise cores and threads in order to run better and if you're not using those cores, it won't increase in temperature? intel also has really high temperatures and tdps at large cpus so i don't see your point here. ah yes, explain how 3990>2990 and so on and how it dominates intels just because "it has more cores" (https://www.youtube.com/watch?v=6YbcgiYja3w and more) and dude, it's fucking 2020 explain to me how your cpu will last until quantum processing is mainstream if you're getting good gpus you're obviously using it for a reason - whether it'd be gaming or rendering. " I'm just saying that even if you're wanting to throw money away you don't necessarily need more power if you don't use it" what? this whole time i've been talking about how if you want more power to do more intensive shit, you don't have to spend as much getting nvidia for the mean time. if you don't need power, just don't get a new gpu lmao? "embrace retarded conspicuous consumption of hardware that would be more useful to the people that need it" nice alliteration, see above on how people who need power should get a better gpu and apparently playing games is a "retarded consumption of hardware" - nice one. "Saying "most games" won't work with a 1050 Ti on high-ultra is baseless, but also those settings are there for a reason" it's not baseless, i have a 1050 ti and it can't do shit on high-ultra modern games and people who tell you that you can run ultra on modern games and get over 30-40 fps are just delusional. " the entire GPU market is just gamers" never said it was, i was directing this towards premintex, who is. "people can play a game perfectly satisfactory without needing to spend more unnecessary money." yes, i never said you had to upgrade. however, for people who want to for the upcoming years of game development or software development - go ahead! "there are better GTX and RTX GPUs than the 1050 Ti if you for whatever reason want more than the kind of power that a 1050 Ti has" if you want a gtx, get a gtx. if you want an rtx, get an rtx. i'm just saying that amd have great options available for the price which can probably beat nvidia unless you are streaming and want nvidia's codec. again dude, you're not everybody and just because you don't play high end games who want it to look nice at a reasonable framerate doesn't mean you have to push your opinion on how it's a waste of money Since when are we talking about older hardware? With newer hardware, either there will be a PS/2 port which works for a keyboard and mouse better than a USB port will or you can at least use a correct PS/2 to USB adapter which works just fine. If your argument is that your fingers can't press buckling-spring keys then at least PS/2 will make up for it. It's just a matter of getting used to an actually good mechanism over the usual boring cherry mechanism which breaks easily and has none of that fine quality and style. You stated that you'd rather have the "higher quality keyboard", implying that IBM is lesser quality which is frankly ridiculous. Yes, who cares about actual technological innovation. For somebody trying to make the point that Intel isn't doing enough, you're simultaneously saying "who the fuck cares" to one of the biggest technological and scientific innovations in modern human history. AMD is just incrementally going down the path of Moore's law, that's zero innovation but allows them to boast before they get destroyed later on when that inevitably stops working. Quantum processing isn't a matter of if it's a matter of when, and AMD has no infrastructure there, the future is going to be all about Intel and the other companies actually developing the technology. "Explain to me how your cpu will last" is kind of ridiculous when computer companies are still selling laptops with Core i5s and i7s made several years ago, and yet are entirely fast enough for what anybody needs along with whatever NVIDIA GPU they toss in there. Right now Intel is still profiting off of the CPUs that they made *years ago*, aside from idiots buying macbooks with Core i9s that thermal throttle and people building their own computers with Intel, we haven't even gotten to the point where new-generation Intel is in action. Meanwhile AMD is only capable of boasting about their new generations because their old generations suck even more ass. I'll admit that it's a bad thing that Intel CPUs from like 6 years ago are still perfectly viable, but that's only because of the anti-innovation that AMD is contributing to right now with Moore's law which Intel is trying to *avoid*. There is no "futureproofing", there is no future for the CPUs that we're using, we're using the same technology from 20 years ago, just with the incremental improvements. Intel is working on the innovation, AMD is not. You completely misunderstood the point what I was making. My point is that people with huge budgets that ARE NOT USING THE HIGHER POWER can still not waste that money on a powerful GPU that they won't even utilize when a 1050 Ti is perfectly viable for whatever the hell they are doing, and yes gaming and rendering actually does fall into the scope of what the 1050 Ti can handle, it's not an iGPU and for some reason people like to diss any CPU or GPU that isn't the brand new high-end thing as if it's already 15 years old and can only work Microsoft Office when it can realistically still run higher-end things. Conspicuous consumption is the term to use when people waste money on things they don't need just to be able to flaunt it. Not only do I not know what you're referring to when you say "high-ultra modern games", but there are endless ways to configure GPU performance even just putting aside the game's own graphics configurations. I'm not going to troubleshoot your computer but if other people are able to do what you can't do with the same GPU then they probably aren't just lying. If people are actually using software which demands more than whatever GPU they already have, then I'm not saying it's a waste of money to upgrade, you're misunderstanding my point again. I'm saying that people should not waste money on hardware that they WON'T use, people don't need an RTX 2080 to run a game believe it or not, there are people out there that actually have a realistic demand for an RTX 2080. it's not that you can't press buckling spring keys, it's that the pressure force required is not the best for actual competitive or casual but often fps gaming and that the clicky feel (trust me, i know the difference) is quite difficult to convert over to proper gaming while they were designed for typing. sure, you can get along fine with it but for people who don't want these types of keyboards just because they're "more durable" to have it because they want to dedicate themselves to something else is not the best. again, that's my point: it's not for everyone. also yes, compared to some keyboards, ibm models can have lesser quality and it's not ridiculous because it's true. the reason why i'm saying "who the fuck cares" is because we're not talking about funding: while, yes, intel is funding for future research - it doesn't make amd any less better than intel in the cpu industry. i don't understand why you keep bringing this up when we're not even talking about this (because i said future proofing? when i say future proofing i mean for the individual) hell, amd is doing lots of innovation when it comes to graphic cards and cpus - they've made 7mm architecture for graphics cards and so on. i admit, amd was ass 6 years ago, but it's now we're talking about and they're pumping out great cpus for the price and peformance and to downsize it because intel is making quantum technological innovation is stupid. we're talking about computer builds here - not quantum processing. i mean, i agree with you that people with huge budgets are not using the power they are paying for however we were talking about premintex's build, were we not? and so, i was not talking about huge budgets in the first place since that's why i recommend amd - if you have a high budget (2080-2080 ti) go and get nvidia, they're great. and same with getting a 1050 ti, if you have a low budget or do not need the power - get a 1050 ti. playing high-ultra on modern games is impossible with older hardware such as the 1050 ti and i can back that up from experience. if i have money and i spend most of my hobby time playing games, i would like to play games comfortably and with satisfaction - not lag and etc. turning down the graphics or even configuring the graphic configurations (these barely give 3-6 extra fps by the way) just doesn't feel right either. overclocking is one bit too far and i doubt regular people would do it (i overclock personally, but i have a well ventilated room). we're living in a time where people are buying ~$5000 builds to play fortnite or minecraft and that's it however, when people are buying budget builds they are trying to get the peformance they need for a low price. especially premintex and myself, i need peformance to get good fps on higher end games and so on.
|
|
|
Post by Polaris Seltzeris on Mar 30, 2020 22:01:18 GMT
Since when are we talking about older hardware? With newer hardware, either there will be a PS/2 port which works for a keyboard and mouse better than a USB port will or you can at least use a correct PS/2 to USB adapter which works just fine. If your argument is that your fingers can't press buckling-spring keys then at least PS/2 will make up for it. It's just a matter of getting used to an actually good mechanism over the usual boring cherry mechanism which breaks easily and has none of that fine quality and style. You stated that you'd rather have the "higher quality keyboard", implying that IBM is lesser quality which is frankly ridiculous. Yes, who cares about actual technological innovation. For somebody trying to make the point that Intel isn't doing enough, you're simultaneously saying "who the fuck cares" to one of the biggest technological and scientific innovations in modern human history. AMD is just incrementally going down the path of Moore's law, that's zero innovation but allows them to boast before they get destroyed later on when that inevitably stops working. Quantum processing isn't a matter of if it's a matter of when, and AMD has no infrastructure there, the future is going to be all about Intel and the other companies actually developing the technology. "Explain to me how your cpu will last" is kind of ridiculous when computer companies are still selling laptops with Core i5s and i7s made several years ago, and yet are entirely fast enough for what anybody needs along with whatever NVIDIA GPU they toss in there. Right now Intel is still profiting off of the CPUs that they made *years ago*, aside from idiots buying macbooks with Core i9s that thermal throttle and people building their own computers with Intel, we haven't even gotten to the point where new-generation Intel is in action. Meanwhile AMD is only capable of boasting about their new generations because their old generations suck even more ass. I'll admit that it's a bad thing that Intel CPUs from like 6 years ago are still perfectly viable, but that's only because of the anti-innovation that AMD is contributing to right now with Moore's law which Intel is trying to *avoid*. There is no "futureproofing", there is no future for the CPUs that we're using, we're using the same technology from 20 years ago, just with the incremental improvements. Intel is working on the innovation, AMD is not. You completely misunderstood the point what I was making. My point is that people with huge budgets that ARE NOT USING THE HIGHER POWER can still not waste that money on a powerful GPU that they won't even utilize when a 1050 Ti is perfectly viable for whatever the hell they are doing, and yes gaming and rendering actually does fall into the scope of what the 1050 Ti can handle, it's not an iGPU and for some reason people like to diss any CPU or GPU that isn't the brand new high-end thing as if it's already 15 years old and can only work Microsoft Office when it can realistically still run higher-end things. Conspicuous consumption is the term to use when people waste money on things they don't need just to be able to flaunt it. Not only do I not know what you're referring to when you say "high-ultra modern games", but there are endless ways to configure GPU performance even just putting aside the game's own graphics configurations. I'm not going to troubleshoot your computer but if other people are able to do what you can't do with the same GPU then they probably aren't just lying. If people are actually using software which demands more than whatever GPU they already have, then I'm not saying it's a waste of money to upgrade, you're misunderstanding my point again. I'm saying that people should not waste money on hardware that they WON'T use, people don't need an RTX 2080 to run a game believe it or not, there are people out there that actually have a realistic demand for an RTX 2080. it's not that you can't press buckling spring keys, it's that the pressure force required is not the best for actual competitive or casual but often fps gaming and that the clicky feel (trust me, i know the difference) is quite difficult to convert over to proper gaming while they were designed for typing. sure, you can get along fine with it but for people who don't want these types of keyboards just because they're "more durable" to have it because they want to dedicate themselves to something else is not the best. again, that's my point: it's not for everyone. also yes, compared to some keyboards, ibm models can have lesser quality and it's not ridiculous because it's true. the reason why i'm saying "who the fuck cares" is because we're not talking about funding: while, yes, intel is funding for future research - it doesn't make amd any less better than intel in the cpu industry. i don't understand why you keep bringing this up when we're not even talking about this (because i said future proofing? when i say future proofing i mean for the individual) hell, amd is doing lots of innovation when it comes to graphic cards and cpus - they've made 7mm architecture for graphics cards and so on. i admit, amd was ass 6 years ago, but it's now we're talking about and they're pumping out great cpus for the price and peformance and to downsize it because intel is making quantum technological innovation is stupid. we're talking about computer builds here - not quantum processing. i mean, i agree with you that people with huge budgets are not using the power they are paying for however we were talking about premintex's build, were we not? and so, i was not talking about huge budgets in the first place since that's why i recommend amd - if you have a high budget (2080-2080 ti) go and get nvidia, they're great. and same with getting a 1050 ti, if you have a low budget or do not need the power - get a 1050 ti. playing high-ultra on modern games is impossible with older hardware such as the 1050 ti and i can back that up from experience. if i have money and i spend most of my hobby time playing games, i would like to play games comfortably and with satisfaction - not lag and etc. turning down the graphics or even configuring the graphic configurations (these barely give 3-6 extra fps by the way) just doesn't feel right either. overclocking is one bit too far and i doubt regular people would do it (i overclock personally, but i have a well ventilated room). we're living in a time where people are buying ~$5000 builds to play fortnite or minecraft and that's it however, when people are buying budget builds they are trying to get the peformance they need for a low price. especially premintex and myself, i need peformance to get good fps on higher end games and so on. Why not use an IBM Windows keyboard in that case? It doesn't use buckling-spring but still preserves the same type of quality that a Model F or a Model M has, and there are other keyboards from that era not manufactured by IBM that also retain that quality. "Some keyboards" means nothing to me. I don't know what keyboards you would even be referring to in order to compare them. There is no such thing as "future proofing for the individual" when it comes to CPUs. I can use a Core i5 from 6 years ago or a brand new AMD Ryzen, the difference is that in 2 years the Core i5 will still hold up but the AMD Ryzen will already be completely phased out. There's a reason why seventh generation Intel from years ago is still something actively sold in computers when they could just be selling whatever the latest generation is, but nobody would buy an AMD CPU from a few years ago. Realistically it's all Moore's law trash, there's no future proofing. GPUs have future proofing which NVIDIA is leading with because there's still ongoing innovation and optimization, there is no future for Moore's law. It's either transition to quantum computing or ride out the remains of Moore's law, Intel is attempting to do the former, AMD is doing the latter. The issue is that you're confusing incrementalism for innovation, AMD is not being innovative by following Moore's law, they are just incrementing more than Intel is right now and that is a failure on AMD's part in the *long run*. Realistically, in the future, AMD will run out of infrastructure to work with because Moore's law itself is meeting its demise, and then they will be in the same state as Intel is in right now, except Intel has infrastructure for quantum computing and AMD won't. Therefore, AMD will have nothing to fallback onto, NVIDIA will have them beat on GPUs and Intel will have them beat on quantum computing infrastructure, that's where Intel is victorious. The point is that Intel is in the right place right now because they are the ones that are going to come out with infrastructure, innovation, and a real market. Then there's no disagreement. A GTX 1050 Ti is satisfactory for most purposes, and their higher end GPUs are better for people with bigger budgets that want to use more than a 1050 Ti can handle. The configurations certainly do work, just turn off ultra and use the regular setting and it will work perfectly. Not sure exactly what you expected, ultra settings are unnecessary for gaming when the normal settings give you more performance, the 1060 Ti can be found as cheap as a 1050 Ti and would work better if high settings were absolutely needed for some strange reason.
|
|
fionn
Club 4000 Member
Admin Officer
elmon sucks
Posts: 6,157
| Likes: 4,775
|
Post by fionn on Mar 30, 2020 22:07:31 GMT
it's not that you can't press buckling spring keys, it's that the pressure force required is not the best for actual competitive or casual but often fps gaming and that the clicky feel (trust me, i know the difference) is quite difficult to convert over to proper gaming while they were designed for typing. sure, you can get along fine with it but for people who don't want these types of keyboards just because they're "more durable" to have it because they want to dedicate themselves to something else is not the best. again, that's my point: it's not for everyone. also yes, compared to some keyboards, ibm models can have lesser quality and it's not ridiculous because it's true. the reason why i'm saying "who the fuck cares" is because we're not talking about funding: while, yes, intel is funding for future research - it doesn't make amd any less better than intel in the cpu industry. i don't understand why you keep bringing this up when we're not even talking about this (because i said future proofing? when i say future proofing i mean for the individual) hell, amd is doing lots of innovation when it comes to graphic cards and cpus - they've made 7mm architecture for graphics cards and so on. i admit, amd was ass 6 years ago, but it's now we're talking about and they're pumping out great cpus for the price and peformance and to downsize it because intel is making quantum technological innovation is stupid. we're talking about computer builds here - not quantum processing. i mean, i agree with you that people with huge budgets are not using the power they are paying for however we were talking about premintex's build, were we not? and so, i was not talking about huge budgets in the first place since that's why i recommend amd - if you have a high budget (2080-2080 ti) go and get nvidia, they're great. and same with getting a 1050 ti, if you have a low budget or do not need the power - get a 1050 ti. playing high-ultra on modern games is impossible with older hardware such as the 1050 ti and i can back that up from experience. if i have money and i spend most of my hobby time playing games, i would like to play games comfortably and with satisfaction - not lag and etc. turning down the graphics or even configuring the graphic configurations (these barely give 3-6 extra fps by the way) just doesn't feel right either. overclocking is one bit too far and i doubt regular people would do it (i overclock personally, but i have a well ventilated room). we're living in a time where people are buying ~$5000 builds to play fortnite or minecraft and that's it however, when people are buying budget builds they are trying to get the peformance they need for a low price. especially premintex and myself, i need peformance to get good fps on higher end games and so on. Why not use an IBM Windows keyboard in that case? It doesn't use buckling-spring but still preserves the same type of quality that a Model F or a Model M has, and there are other keyboards from that era not manufactured by IBM that also retain that quality. "Some keyboards" means nothing to me. I don't know what keyboards you would even be referring to in order to compare them. There is no such thing as "future proofing for the individual" when it comes to CPUs. I can use a Core i5 from 6 years ago or a brand new AMD Ryzen, the difference is that in 2 years the Core i5 will still hold up but the AMD Ryzen will already be completely phased out. There's a reason why seventh generation Intel from years ago is still something actively sold in computers when they could just be selling whatever the latest generation is, but nobody would buy an AMD CPU from a few years ago. Realistically it's all Moore's law trash, there's no future proofing. GPUs have future proofing which NVIDIA is leading with because there's still ongoing innovation and optimization, there is no future for Moore's law. It's either transition to quantum computing or ride out the remains of Moore's law, Intel is attempting to do the former, AMD is doing the latter. The issue is that you're confusing incrementalism for innovation, AMD is not being innovative by following Moore's law, they are just incrementing more than Intel is right now and that is a failure on AMD's part in the *long run*. Realistically, in the future, AMD will run out of infrastructure to work with because Moore's law itself is meeting its demise, and then they will be in the same state as Intel is in right now, except Intel has infrastructure for quantum computing and AMD won't. Therefore, AMD will have nothing to fallback onto, NVIDIA will have them beat on GPUs and Intel will have them beat on quantum computing infrastructure, that's where Intel is victorious. The point is that Intel is in the right place right now because they are the ones that are going to come out with infrastructure, innovation, and a real market. Then there's no disagreement. A GTX 1050 Ti is satisfactory for most purposes, and their higher end GPUs are better for people with bigger budgets that want to use more than a 1050 Ti can handle. The configurations certainly do work, just turn off ultra and use the regular setting and it will work perfectly. Not sure exactly what you expected, ultra settings are unnecessary for gaming when the normal settings give you more performance, the 1060 Ti can be found as cheap as a 1050 Ti and would work better if high settings were absolutely needed for some strange reason. Wilee it's down to personal preference. You might prefer a model m ibm keyboard, but we don't. Gaming Keyboards better suit what we mainly use computers for (gaming). We do not know what you get up to, but by your choice of keyboard, it's safe to say it's not gaming. We use RGB because it looks nicer, wilee. If you were picking wallpaper, one was more price-heavy but it looked nice, and one was a budget wallpaper but it was ugly, you'd go for the nicer one, it's natural. You might like the look and feel of the model m, but when you're gaming against people with these high-end keyboards and you are using a model f or a model m, chances are you're not going to win.
|
|
square
Veteran Member
Asst. Creative Designer
Posts: 1,294
| Likes: 1,291
|
Post by square on Mar 30, 2020 22:14:11 GMT
it's not that you can't press buckling spring keys, it's that the pressure force required is not the best for actual competitive or casual but often fps gaming and that the clicky feel (trust me, i know the difference) is quite difficult to convert over to proper gaming while they were designed for typing. sure, you can get along fine with it but for people who don't want these types of keyboards just because they're "more durable" to have it because they want to dedicate themselves to something else is not the best. again, that's my point: it's not for everyone. also yes, compared to some keyboards, ibm models can have lesser quality and it's not ridiculous because it's true. the reason why i'm saying "who the fuck cares" is because we're not talking about funding: while, yes, intel is funding for future research - it doesn't make amd any less better than intel in the cpu industry. i don't understand why you keep bringing this up when we're not even talking about this (because i said future proofing? when i say future proofing i mean for the individual) hell, amd is doing lots of innovation when it comes to graphic cards and cpus - they've made 7mm architecture for graphics cards and so on. i admit, amd was ass 6 years ago, but it's now we're talking about and they're pumping out great cpus for the price and peformance and to downsize it because intel is making quantum technological innovation is stupid. we're talking about computer builds here - not quantum processing. i mean, i agree with you that people with huge budgets are not using the power they are paying for however we were talking about premintex's build, were we not? and so, i was not talking about huge budgets in the first place since that's why i recommend amd - if you have a high budget (2080-2080 ti) go and get nvidia, they're great. and same with getting a 1050 ti, if you have a low budget or do not need the power - get a 1050 ti. playing high-ultra on modern games is impossible with older hardware such as the 1050 ti and i can back that up from experience. if i have money and i spend most of my hobby time playing games, i would like to play games comfortably and with satisfaction - not lag and etc. turning down the graphics or even configuring the graphic configurations (these barely give 3-6 extra fps by the way) just doesn't feel right either. overclocking is one bit too far and i doubt regular people would do it (i overclock personally, but i have a well ventilated room). we're living in a time where people are buying ~$5000 builds to play fortnite or minecraft and that's it however, when people are buying budget builds they are trying to get the peformance they need for a low price. especially premintex and myself, i need peformance to get good fps on higher end games and so on. Why not use an IBM Windows keyboard in that case? It doesn't use buckling-spring but still preserves the same type of quality that a Model F or a Model M has, and there are other keyboards from that era not manufactured by IBM that also retain that quality. "Some keyboards" means nothing to me. I don't know what keyboards you would even be referring to in order to compare them. There is no such thing as "future proofing for the individual" when it comes to CPUs. I can use a Core i5 from 6 years ago or a brand new AMD Ryzen, the difference is that in 2 years the Core i5 will still hold up but the AMD Ryzen will already be completely phased out. There's a reason why seventh generation Intel from years ago is still something actively sold in computers when they could just be selling whatever the latest generation is, but nobody would buy an AMD CPU from a few years ago. Realistically it's all Moore's law trash, there's no future proofing. GPUs have future proofing which NVIDIA is leading with because there's still ongoing innovation and optimization, there is no future for Moore's law. It's either transition to quantum computing or ride out the remains of Moore's law, Intel is attempting to do the former, AMD is doing the latter. The issue is that you're confusing incrementalism for innovation, AMD is not being innovative by following Moore's law, they are just incrementing more than Intel is right now and that is a failure on AMD's part in the *long run*. Realistically, in the future, AMD will run out of infrastructure to work with because Moore's law itself is meeting its demise, and then they will be in the same state as Intel is in right now, except Intel has infrastructure for quantum computing and AMD won't. Therefore, AMD will have nothing to fallback onto, NVIDIA will have them beat on GPUs and Intel will have them beat on quantum computing infrastructure, that's where Intel is victorious. The point is that Intel is in the right place right now because they are the ones that are going to come out with infrastructure, innovation, and a real market. Then there's no disagreement. A GTX 1050 Ti is satisfactory for most purposes, and their higher end GPUs are better for people with bigger budgets that want to use more than a 1050 Ti can handle. The configurations certainly do work, just turn off ultra and use the regular setting and it will work perfectly. Not sure exactly what you expected, ultra settings are unnecessary for gaming when the normal settings give you more performance, the 1060 Ti can be found as cheap as a 1050 Ti and would work better if high settings were absolutely needed for some strange reason. "Why not use an IBM Windows keyboard in that case? It doesn't use buckling-spring but still preserves the same type of quality that a Model F or a Model M has, and there are other keyboards from that era not manufactured by IBM that also retain that quality." list one ibm windows keyboard that is so high quality it competes with every single keyboard out there? ""Some keyboards" means nothing to me. I don't know what keyboards you would even be referring to in order to compare them." let's see: - ducky - filco - some corsair products (actually) like the k95 and literally any keyboard that doesn't take up the entire desk. people have preference and can't take old keyboards which are large due to space or sound reasons. "There is no such thing as "future proofing for the individual" when it comes to CPUs." wrong, people who do modern things such as render in 8k or play heavy modern titles need to upgrade or save space for the future. having a core i5 will not let most people who want the power have power, it works for you because you don't do what they do. "but nobody would buy an AMD CPU from a few years ago" Market Share i5 9600K Ryzen 5 3600 Ryzen 5 2600Market Share (trailing 30 days) 0.62% 3.47% 1.28% (https://cpu.userbenchmark.com)"ongoing innovation and optimization" and amd doesn't optimize their gpus for games? "The configurations certainly do work, just turn off ultra and use the regular setting and it will work perfectly" just because you don't want ultra doesn't mean others don't. i'd like some nice details in my game - same with quality of keyboards. if you don't want to spend more for a model m/f just get a membrane keyboard. simple as. you fail to understand that it's down to preference and needs.
|
|