
Wikipedia:Reference desk/Computing

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by 93.136.155.134 (talk) at 00:49, 29 October 2019 (Windows 7 problems.). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.
Welcome to the computing section
of the Wikipedia reference desk.
Select a section:
Want a faster answer?

Main page: Help searching Wikipedia

   

How can I get my question answered?

  • Select the section of the desk that best fits the general topic of your question (see the navigation column to the right).
  • Post your question to only one section, providing a short header that gives the topic of your question.
  • Type '~~~~' (that is, four tilde characters) at the end – this signs and dates your contribution so we know who wrote what and when.
  • Don't post personal contact information – it will be removed. Any answers will be provided here.
  • Please be as specific as possible, and include all relevant context – the usefulness of answers may depend on the context.
  • Note:
    • We don't answer (and may remove) questions that require medical diagnosis or legal advice.
    • We don't answer requests for opinions, predictions or debate.
    • We don't do your homework for you, though we'll help you past the stuck point.
    • We don't conduct original research or provide a free source of ideas, but we'll help you find information you need.



How do I answer a question?

Main page: Wikipedia:Reference desk/Guidelines

  • The best answers address the question directly, and back up facts with wikilinks and links to sources. Do not edit others' comments and do not give any medical or legal advice.


October 20

Mysterious code

Why does this long line of code:

blanked potentially dangerous code

just print "Hello, world" to the console? What is this code actually doing behind the scenes? 69.5.123.5 (talk) 05:25, 20 October 2019 (UTC)[reply]

To start with, it's all in ASCII code. Start by decoding that, using our chart (the cyan column), here: ASCII#Printable_characters. Then maybe we can figure out what that's doing. SinisterLefty (talk) 05:32, 20 October 2019 (UTC)[reply]
It decodes to
Still confused. 69.5.123.5 (talk) 05:38, 20 October 2019 (UTC)[reply]
From what I decoded, it looks like more instructions, like "concat" and "eval", so apparently we have code within code within code. They must have been trying to obfuscate the meaning. Keep decoding all the levels. SinisterLefty (talk) 05:49, 20 October 2019 (UTC)[reply]
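The decoding described above can be sketched in Python. The original code was blanked, so the character codes below are made-up stand-ins, but they show how one layer of ASCII-code obfuscation peels away:

```python
# Hypothetical example: peeling one layer of ASCII-code obfuscation.
# These byte values are illustrative; the post's real code was blanked.
codes = [99, 111, 110, 115, 111, 108, 101, 46, 108, 111, 103]

# Each number is a character code; joining them recovers the hidden text.
decoded = "".join(chr(c) for c in codes)
print(decoded)  # -> console.log
```

An obfuscator nests several such layers, each decoding and eval-ing the next, which is why the thread had to repeat this step.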
Code within code within code?
Apparently goes down 1 more level. 69.5.123.5 (talk) 05:53, 20 October 2019 (UTC)[reply]
And then decodes to console.log("Hello, world!") Wow! 69.5.123.5 (talk) 05:55, 20 October 2019 (UTC)[reply]
At least they didn't do this

That is 7 layers. 69.5.123.5 (talk) 05:57, 20 October 2019 (UTC)[reply]
Somebody had a lot of time to kill making that. SinisterLefty (talk) 05:59, 20 October 2019 (UTC)[reply]
Or a script to do the job for them. 69.5.123.5 (talk) 06:00, 20 October 2019 (UTC)[reply]
I once wrote a noughts and crosses game entirely in AmigaGuide. Since AmigaGuide is a static mark-up language like HTML or LaTeX and does not support state manipulation, the only way I could do it was to write out every single possible state in the same document. It was too much of an effort to do by hand, so I wrote a program to do it for me. JIP | Talk 21:21, 22 October 2019 (UTC)[reply]
@69.5.123.5, JIP, and SinisterLefty: this is a common set of techniques to obfuscate the code of JS based attacks. Where is this code coming from ? Where is it running ? Please NEVER run anything like this that you do not understand in your personal browser and report any existence of code like this to the owner of the website where you encounter it. —TheDJ (talkcontribs) 12:17, 28 October 2019 (UTC)[reply]
Even if it is just console.log Usually that means it is in preparation of a later attack. —TheDJ (talkcontribs) 12:18, 28 October 2019 (UTC)[reply]
I didn't run it. I manually translated some of the ASCII code. SinisterLefty (talk) 15:41, 28 October 2019 (UTC)[reply]

October 22

Sharing JavaScript / CSS files between projects in Visual Studio?

I am using Microsoft Visual Studio 2015 at work to develop an ASP.NET Core web application in C#, using JavaScript and CSS on the browser side.

Here's the thing. The system is actually a solution consisting of two separate ASP.NET Core web application projects, both in C#, using JavaScript and CSS. One of the projects has a fully working set of JavaScript and CSS files in the wwwroot folder. I'd like to use the same files in another project, but haven't found a way to refer to the files directly across projects so I could reuse them. In the end I resorted to copy-pasting the files as separate copies, but it would be easier to maintain if there were direct file references. Is this possible? JIP | Talk 20:01, 22 October 2019 (UTC)[reply]

The usual term for this is a library. Try searching for that term in their help files or online. SinisterLefty (talk) 20:09, 22 October 2019 (UTC)[reply]
Does this mean I can make my own library? Note that the JavaScript and CSS files in question are the company's own proprietary code, which I can use in source code form, but can't be published publicly online. JIP | Talk 21:19, 22 October 2019 (UTC)[reply]
Well, unless you actually intend to change them, the copy-and-paste method should be fine. If you do intend to change them, that's when you need to find a way to set up a library (you should also change the name to avoid confusion with their version). I don't know what native capabilities they have to set up a library, but if they don't support that, you could also do so yourself, by maintaining an outside folder with those files, making any changes there, then copy-and-pasting to the project(s) when any change is made. (You'd need to keep a list of which files are used in which projects in that folder, too.) SinisterLefty (talk) 21:49, 22 October 2019 (UTC)[reply]
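One built-in alternative to copy-and-paste is a linked item in the consuming project's project file: the file stays in the owning project but is included (and published) by the second one. This is a minimal sketch for MSBuild-based .csproj projects (which may require newer tooling than VS2015's project.json format), with made-up paths standing in for the real ones:

```xml
<ItemGroup>
  <!-- Hypothetical paths: Include points at the owning project's copy -->
  <Content Include="..\FirstProject\wwwroot\js\site.js"
           Link="wwwroot\js\site.js"
           CopyToPublishDirectory="PreserveNewest" />
</ItemGroup>
```

With this, edits made in the first project are picked up by the second automatically, since there is only one physical file.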

October 23

Windows 7 problems.

Have 2 laptops that are Windows 7, and so many things about them are not the same. The 1st is a slow Windows 7 laptop with 2 GB RAM; almost every time I turn it on, it loads Update.exe, which is unclosable and slows everything down, and I have to wait for it to go away. It mostly sits at 100% usage in the task manager. Now my faster Windows 7 laptop has 8 GB RAM and doesn't have that problem at all. But its problem is that it turns on on its own in the middle of the night all the time. I have to raise the lid, then shut it, so it goes back on standby. So pretty much every morning when I wake up, I hear its engine running because it turned on on its own from standby. I'll post a 3rd problem later. 67.175.224.138 (talk) 03:45, 23 October 2019 (UTC).[reply]

  • 1ST LAPTOP: Try turning off automatic updates. Get more RAM.
  • 2ND LAPTOP: Try turning off automatic updates and backups. You could unplug it and pull the battery out if that doesn't work.
Of course, you will want to do updates and backups manually, if not automatic. SinisterLefty (talk) 03:50, 23 October 2019 (UTC)[reply]
Updates aren't going to be much help as Windows 7 has been EOLed for all but "extended support" enterprise customers. I would say that no professional would recommend that you continue running Windows 7, particularly if you intend to use or connect to the public Internet. Elizium23 (talk) 04:33, 23 October 2019 (UTC)[reply]
What would be considered a "professional" by you? I connect to the public Internet to ask questions on Wikipedia desks, for example. 67.175.224.138 (talk) 05:03, 23 October 2019 (UTC).[reply]
A professional who makes a living providing support of the nature you're requesting. For example, I currently hold a CompTIA A+. If I were responsible for your IT support I'd have you unplugging your Ethernet cable -- or at least firewalling and isolating your network -- and locating a supportable OS as priority one. On a related note, have you installed a virus scanner, and is it up-to-date with its definitions database? Elizium23 (talk) 05:38, 23 October 2019 (UTC)[reply]
Yes, Avira for a virus scanner. However, my old laptop often runs an avscan.exe which starts automatically and cannot be closed manually. I think the new one has it too, but it's not noticeable. 67.175.224.138 (talk) 05:48, 23 October 2019 (UTC).[reply]
Windows 7 is still fine for the casual home user. When extended support runs out in January, that might be the time to consider an upgrade. I went through this with Windows XP, and web sites stopped working a bit at a time after the extended support ended. But it was years before it became unusable (and you could still probably use it for Wikipedia). SinisterLefty (talk) 10:30, 23 October 2019 (UTC)[reply]
I can't stand the way Task Manager looks in Windows 10. So ugly and different. 67.175.224.138 (talk) 13:25, 23 October 2019 (UTC).[reply]
Yes, I object to "change for change's sake". And beyond that, any change will require a learning curve, so it can't just be a tiny bit better, it should be much better or else they should just leave it alone. When I switched to Windows 7 from Windows XP, I objected to the changes they made to MS Paint, so I just copied the executable over, and now I still have my "Invert colors" menu pull-down, and don't have to use a special keystroke (which also changed between versions). Perhaps you could do the same with the Task Manager. Not sure if that would work, though. SinisterLefty (talk) 19:23, 23 October 2019 (UTC)[reply]
I'd say it's worst for a "casual" home user. Sticking with such an old version is going to forestall your ability to learn the newer versions, and it's going to make things difficult when you cast around for support. Unless you're running a voting machine in the USA, or you're an experienced administrator who can field support issues and manage to manually patch and mitigate security vulnerabilities, I'd say get back on the upgrade train. It's probably not necessary to throw away the laptops; they may be rescued by an appropriate Linux install, FreeBSD, or possibly even a newer Windows, depending on their GPU in particular. Elizium23 (talk) 17:55, 23 October 2019 (UTC)[reply]
I don't think you're doing a proper cost/benefit analysis. What would it cost in time and money to upgrade to Windows 10, versus just waiting for Windows 11 or whatever comes next ? Now how will it cost them more to wait ? In a business environment, it's easy to see how that cost could be recouped, but as a casual home user, how exactly ? SinisterLefty (talk) 19:03, 23 October 2019 (UTC)[reply]
The cost/benefit is easy. There's always going to be a new version, so you can always pretend to justify waiting just until the latest and greatest comes out. But that's a fool's errand. The TCO (total cost of ownership) of an operating system (and hardware etc.) begins to skyrocket near its EOL point. Old protocols, old features, old bugs no longer supported nor fixed. New ones not coming in. Users abandoning the platform as it is obsolete. Mysterious crap begins to happen such as our interlocutor's support issues. The opportunity cost of not learning a new version is also great. My father knows this well; he's used some PD software to slap an XP interface onto his Windows 7 laptop and he grouses about newfangled stuff. He threw in the towel when their Vista machine crashed and commissioned me to install a new Windows 10 system, and educate them just enough to get email, cat videos, and a few memes going. The longer you cling to obsolete platforms, the harder it is to pry them from your fingers when the time comes. This is unfortunate, yes, I love vintage computers, give me a C-64 any day or an Atari 2600 to play games! But this is the cost of consumerism and the capitalist milieu we live in - there is a constant pressure from shareholders to increase their value by selling more and more stuff, and you don't sell more stuff without forcing obsolescence and constantly introducing new features. I consider it an honor to install a new Windows and learn the new stuff. It's another bullet point on my résumé when I can draw a swath across "Windows 3.1" all the way to "Windows 10" and "Server 201x" and say I am comfortable with them all. (If users can't afford to purchase an upgraded laptop, then the users should be asking themselves whether their budget allocation for IT was appropriate - they probably realize the need for new phones every 2 years or so already.) Elizium23 (talk) 21:07, 23 October 2019 (UTC)[reply]
You're still looking at it from the POV of a business. Those costs just don't apply to a home user. For example, if some game they play doesn't work anymore, they don't commission somebody to write a fix for it, they just stop playing that game. If a news or weather site stops working, they find another or get their news and weather from the TV, etc. And even the professionals have largely skipped some pathetic Windows versions, like Vista.
A casual home computer user could be compared with someone who rides a bicycle for fun. It doesn't make economic sense for them to replace their bike every time a new model comes out, but rather to wait until it fails to work. And even then, some minor things can go wrong, like the horn not working or some gear they don't use not working, and they can still get more usage out of it. You are looking at it from the POV of a competitive bike racer, who does indeed need all the newest technology, if he intends to win races. SinisterLefty (talk) 21:13, 23 October 2019 (UTC)[reply]
Someone who rides a bicycle for fun may well depend on that bicycle -- for entertainment. If the bicycle breaks then they may find their "fun" quotient unmet and it could lead to marital stress, weight gain, mood disorders, etc. Our questioner here today has some bugs that are preventing him from achieving computing goals. My concern is not what those goals are, but how they can best be achieved. He may be a gamer or an emailer or a desktop publisher (the latter two are very real productivity goals and more common than "someone who uses a computer/bicycle for fun"). Bicycles are not an accurate hardware analogy, because there are classes of bicycle which are immune to obsolescence, sturdy and last for decades. Now you mentioned a game that is beloved by the casual user. My mother had such a game. It lasted from oh, about Win3.1 until it finally died with Vista and I could not resuscitate it on Win10. Yes, she was very attached to that game. But she also pragmatically understood that it had been on borrowed time for years! Any "casual computer user" who becomes unduly attached to some feature, software, or hardware has transformed into a "vintage computer aficionado" and such usage ceases to be "casual" as soon as the raison-d'etre is playing that one game over and over. I can point you at some YouTube channels where aficionados are very devoted to restoring, upgrading, and augmenting vintage computer gear; they are anything but casual. A casual computer user is a consumer, and a consumer needs to have current, supported equipment in order to carry out their consumerism and stay relevant in the marketplace. In the 1980s, I listened to a Walkman clone. I have been a casual consumer of music since then. If I insisted on listening to tapes on that Walkman clone in 2019, I would be derided, and rightly so. Casual automobile drivers got their wakeup call from Barack Obama when Cash for Clunkers was instituted. 
We can't be attached to old crap that just hasn't broken enough yet. Consumer is as consumer does, and Best Buy/Newegg is thataway. Elizium23 (talk) 00:17, 24 October 2019 (UTC)[reply]
Even using your car example, it still doesn't make sense, either for the individual or the environment, to junk all current models whenever a new model comes out. So maybe the old car doesn't have an MP3 player and GPS, you can use portable devices for that. And if the use of the car was also casual, by somebody who could just as well use public transportation/Uber/Lyft to get anywhere, if they lacked a car, then it makes sense to keep it until it dies, and maybe not even replace or repair it then. Also, similar to PCs, old cars can actually have features you would miss on the new ones. "What, no CD player ? So now I have to convert all my CDs to MP3 ? That sucks !" SinisterLefty (talk) 02:11, 24 October 2019 (UTC)[reply]
SinisterLefty, regarding MSPaint, all you had to do was copy/paste MSPaint.exe from Windows XP to 7? Or the entire folder with the .exe in it? And did you have to delete the Windows 7 version first? Regarding Task Manager from Windows 7 to 8 or 10, does anyone know if it has fewer features? 67.175.224.138 (talk) 22:28, 23 October 2019 (UTC).[reply]
I just copied the XP version of "MSPaint.exe" file to my Windows 7 Desktop. I can start it by clicking there. Or, if I want the Windows 7 version, it's still available through the menu system.
One thing the Task Manager sorely needs as of Windows 7 is the ability to determine which application is the problem one I need to kill. For example, if my Google Chrome session hangs, I'd like to be able to kill the offending tab. I can't close it in Chrome because Chrome is hung up. I open the Task Manager and it shows maybe 20 Google Chrome processes, and I have no clue as to which process relates to which tab, so the best I can do is kill them at random, and hope the offending tab is closed before those I don't want to lose. SinisterLefty (talk) 22:39, 23 October 2019 (UTC)[reply]
Was that a feature added in Windows 10 TaskMan? 67.175.224.138 (talk) 23:14, 23 October 2019 (UTC).[reply]
Don't know. That was my example of an improvement useful enough to justify a change. SinisterLefty (talk) 01:50, 24 October 2019 (UTC)[reply]
Of course it wasn't. That problem however is easily fixed by installing Firefox or Pale Moon. 93.136.155.134 (talk) 00:49, 29 October 2019 (UTC)[reply]

3rd question: when I have documents like Microsoft Word files or images, not all of them show up in the folder. Under Tools > Folder Options > View (tab), I have "show hidden files and folders" selected. So how do I find these files? Apparently only through the program itself: in Word, if I press Ctrl+O and type "A", then all documents starting with A show up, even ones that can't be seen in the C:\ folder. Weird. 67.175.224.138 (talk) 05:07, 23 October 2019 (UTC).[reply]

This doesn't apply to MS Word files or most image editors, but there are some applications where a "file" in a folder doesn't necessarily correspond with a "file" or "object" within the program. For example, one video format had each movie consist of multiple VOB files and an associated supporting index file. The application would read the index, and display the movie name(s) listed there, not the names of the VOB files. Once you selected a movie, it would then play the VOB files in the required order. (Think of them like reels of a film.) So, just looking at the contents of the folder in the File Manager (renamed "Windows Explorer" as an example of change for change's sake) might not tell you much. SinisterLefty (talk) 22:45, 23 October 2019 (UTC)[reply]
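The index-file pattern described above can be sketched in a few lines of Python; the titles and file names here are hypothetical stand-ins for a real format:

```python
# Hypothetical sketch of the index-file pattern: the file manager shows
# raw segment files, but the names users see come from an index that
# maps each title to its segments in playback order.
index = {"My Movie": ["VTS_01_1.VOB", "VTS_01_2.VOB"]}

def playback_order(title):
    # A player reads the index, not the folder listing.
    return index[title]

print(playback_order("My Movie"))  # ['VTS_01_1.VOB', 'VTS_01_2.VOB']
```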

Casual computer users

Perhaps the age of casual computer users is actually dawning in a new way. What I mean is that with the smartphone being so ubiquitous, average people depend way less on a desktop/laptop computer for productivity, checking email, communication and document viewing in general. I heard numbers that indicated laptop and desktop markets are stagnant or even declining, because sometimes people just don't replace that desktop that broke or the old and busted laptop. For most years of the Wintel PC and Macintosh, even gamers were known to use their computer for bursts of productivity or communication or desktop publishing. The PC has been such a Swiss Army Knife that it was difficult to supplant, but the torch has finally been passed to the Smartphone. But I would be careful about applying the label "casual computer user" because it sounds dismissive and unsympathetic to users like 67.175.224.138 who is encountering some real issues getting in his/her way of being productive. A truly casual computer user is one whose lifestyle may be enhanced by computer usage, but whose day-to-day activities, finances, and communications would not be at all impeded by the loss of "the" home computer. And at least until the smartphone became king, that truly casual user was a rarity indeed. Elizium23 (talk) 00:30, 24 October 2019 (UTC)[reply]

Might I add I use a flip phone for a cell phone, so I really just use a computer to go to websites like Facebook, and to make websites. I make websites by editing raw code like HTML, which I can do in Notepad, then copy/paste to the web server. I generally need the OS with the best Task Manager. 67.175.224.138 (talk) 01:37, 24 October 2019 (UTC).[reply]
I think you're finally starting to get it. But it's hardly a new thing. My Mom never used her Win 2000 PC for anything but playing solitaire. It certainly would not have been worth her upgrading the hardware and OS so she could play solitaire on Windows 10. When she could no longer play solitaire on it, due to a hard drive failure, that was fine, she had more time to do other things. This is a "casual user". SinisterLefty (talk) 01:59, 24 October 2019 (UTC)[reply]

MS Word dimming file

Okay, this is probably a separate question, going back to Microsoft Word. Once I have a file open, in the My Computer folder "Filename.docx" becomes ~$Filenam.docx and its icon is lightened to a dimmer color. That itself is not the file, and sometimes the original file no longer shows up at all. It can then only be opened by opening Word itself, or from the "recently opened files" list that it auto-saves. Sigh. 67.175.224.138 (talk) 01:44, 24 October 2019 (UTC).[reply]

I broke it off as a separate Q. It sounds like it may be locking the file so that nobody else can read it while you are making changes. SinisterLefty (talk) 01:54, 24 October 2019 (UTC)[reply]
Yes, I think it's a file-locking mechanism to prevent Word from opening two instances of the same document for editing. A bit unsophisticated, but I remember these pseudo-files have been around since the early days of MS Office. It may also be there to let Word know which documents were open and need recovery after Word crashes. The icon is lighter because it's a hidden file. Losing the original file should never happen, though; I haven't had that nasty bug since Word 97 at least. The file is not meant to be opened because it doesn't contain anything useful; I don't think it has anything in it that could help reconstruct a lost file. It may be that when you double-click it Word opens the corresponding real file; I think the ~$ file contains the real file's name and some other metadata. 93.136.155.134 (talk) 00:45, 29 October 2019 (UTC)[reply]
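As an illustration of the mechanism described (not of how Word works internally), the hidden "~$" owner files can be spotted from a short script; the file names below are made up:

```python
import os
import tempfile

# Word's owner/lock files start with "~$" and sit beside the document.
# Listing them shows which documents are (or were left) open.
def lock_files(folder):
    return sorted(f for f in os.listdir(folder) if f.startswith("~$"))

# Demo with throwaway files standing in for a real Documents folder.
d = tempfile.mkdtemp()
for name in ("Report.docx", "~$Report.docx"):
    open(os.path.join(d, name), "w").close()

print(lock_files(d))  # ['~$Report.docx']
```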

2+2=4, is too much to ask?

Is there any programming language where a+a = 2a, and not some error message. C est moi anton (talk) 17:24, 23 October 2019 (UTC)[reply]

Just about all of them? I'm not aware of any programming language that can't add a number to itself.
Is there a specific problem you're having? ApLundell (talk) 17:35, 23 October 2019 (UTC)[reply]
In most languages you have to write assignment statements differently than a math equation, something like "a = 2" then "a = 2*a". So, there are three specific differences: only one variable can appear on the left side of the equals sign, a symbol like "*" is needed to denote multiplication, and the value of the variable is actually being set by the assignment, as opposed to just noting an equivalence for later use. For example, it would be nice if you could just enter the quadratic formula, have it ask you which variables are known, and solve for the unknowns, if possible (including complex number answers). But the reality is that it takes quite a bit of coding to make it do that in most languages. SinisterLefty (talk) 19:08, 23 October 2019 (UTC)[reply]
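The assignment-versus-equation distinction can be seen in two lines of Python:

```python
# Assignment, not equation: '=' stores a value rather than stating equality.
a = 2        # a is now 2
a = 2 * a    # the right side is evaluated first, then stored: a is now 4
print(a)     # 4
```

Read as a math equation, "a = 2*a" would only be true for a = 0; as an assignment, it simply doubles whatever a held.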
Perhaps you are looking for a symbolic parser, like Matlab/Octave, where you can create a symbol "a" and then it will parse "a+a" to produce "2a". In reality, the symbolic parser is normally used to handle very large, confusing equations and calculate derivatives. 199.164.8.1 (talk) 17:53, 23 October 2019 (UTC)[reply]
Also something like Wolfram Alpha. Bubba73 You talkin' to me? 21:11, 23 October 2019 (UTC)[reply]
Yes, symbolic parser is the concept I was searching for.
I have in mind to type directly 'a+a' and obtain 2a.
Wolfram Alpha can verify 'a + a = 2a'. Python not (unless you assign a value).C est moi anton (talk) 21:43, 23 October 2019 (UTC)[reply]
What you are probably looking for is a computer algebra system. Modern ones typically have some kind of programming language built-in. They are excellent for symbolic manipulation of formulas and sometimes even for prototyping algorithms, but typically less well suited to high-performance computing. Adding two numbers is a single machine instruction. Manipulating symbolic expressions is orders of magnitude slower. --Stephan Schulz (talk) 22:06, 23 October 2019 (UTC)[reply]
I use PARI/GP for number theory. There is an online version at https://pari.math.u-bordeaux.fr/gp.html. If you enter a+a then you get 2*a. (a+1)^3 gives a^3 + 3*a^2 + 3*a + 1. You can also go the other way in some cases. factor(a^3 + 3*a^2 + 3*a + 1) gives Mat([a + 1, 3]) which is PARI/GP's way to write (a+1)^3. PrimeHunter (talk) 22:48, 23 October 2019 (UTC)[reply]
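For comparison, the same manipulations can be done in Python with the third-party SymPy library (an addition beyond the thread, which only notes that bare Python can't do this):

```python
from sympy import symbols, expand, factor

# Declare 'a' as a symbol rather than a numeric variable.
a = symbols('a')

print(a + a)                  # 2*a
print(expand((a + 1)**3))     # a**3 + 3*a**2 + 3*a + 1
print(factor(a**3 + 3*a**2 + 3*a + 1))  # (a + 1)**3
```

This mirrors the PARI/GP session above: once a name is a symbol, arithmetic on it builds expressions instead of demanding a value.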
That website says that the online version runs at about 1/4 normal speed. Do you know what type of computer is it running on? Bubba73 You talkin' to me? 01:10, 24 October 2019 (UTC)[reply]
Your own. The code is translated to JavaScript which is sent to your browser and runs there. PrimeHunter (talk) 10:43, 24 October 2019 (UTC)[reply]
There's also an Android app on F-Droid. 93.136.155.134 (talk) 00:38, 29 October 2019 (UTC)[reply]

Is there an app for Android 8.0 that allows you to set max charge ?

So for example, I could set it to only charge to 70% or 80%, to protect the battery. SinisterLefty (talk) 21:03, 23 October 2019 (UTC)[reply]

I am not aware of "protection" accorded by undercharging batteries, particularly modern LiIon ones. Could you elaborate on the nature of this protection? Elizium23 (talk) 00:20, 24 October 2019 (UTC)[reply]
"Most Li-ions charge to 4.20V/cell, and every reduction in peak charge voltage of 0.10V/cell is said to double the cycle life. For example, a lithium-ion cell charged to 4.20V/cell typically delivers 300–500 cycles. If charged to only 4.10V/cell, the life can be prolonged to 600–1,000 cycles; 4.0V/cell should deliver 1,200–2,000 and 3.90V/cell should provide 2,400–4,000 cycles. ... In terms of longevity, the optimal charge voltage is 3.92V/cell." [batteryuniversity.com/index.php/en/learn/article/how_to_prolong_lithium_based_batteries] 01:42, 24 October 2019 (UTC)
From that source, it would help but there are other problems that you get. Voltage levels and charge level are not linear in Li-ion: [batteryuniversity.com/learn/article/charging_lithium_ion_batteries]
So at 3.9V you'd have 20% less charge as well, resulting in charging more often and deeper charge cycles which are both bad for the number of cycles. I think that optimizing battery life is less simple than just dropping the charge voltage. Rmvandijk (talk) 07:43, 24 October 2019 (UTC) (note: I am not an electrical engineer)[reply]
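To make the quoted doubling rule concrete, here is a rough Python illustration. The 400-cycle baseline is an assumed midpoint of the source's 300-500 range, not a measured figure, and as noted above real battery life depends on more than peak voltage:

```python
# Rough illustration of the rule quoted above: each 0.10 V cut from the
# 4.20 V/cell peak is said to double cycle life. Baseline of 400 cycles
# at 4.20 V is an assumed midpoint of the source's 300-500 range.
def estimated_cycles(peak_v, base=400.0):
    return base * 2 ** ((4.20 - peak_v) / 0.10)

print(round(estimated_cycles(4.20)))  # 400
print(round(estimated_cycles(4.10)))  # 800   (source says 600-1,000)
print(round(estimated_cycles(4.00)))  # 1600  (source says 1,200-2,000)
```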
In my case there's little risk of using up most of the charge. So, I'd like to charge it when it's at 20-30% and stop it charging at 70-80%. SinisterLefty (talk) 10:27, 24 October 2019 (UTC)[reply]
Yeah there are a lot of those. Try the search function on f-droid.org whose apps are less likely to put spyware on your phone. 173.228.123.207 (talk) 01:36, 27 October 2019 (UTC)[reply]
Thanks. I only found one there, named "Battery Charge Limit". Are there more ? That one didn't work because it said my phone must be "rooted". If this isn't something I can do, then I guess none of the apps which stop charging will work. The next best thing is one that will at least notify me when I hit the preset limit. I downloaded one of those (using Google Play): "Battery Charge Notifier". SinisterLefty (talk) 10:57, 27 October 2019 (UTC)[reply]

JS code

Why did:

while(1)((l=>(l.href=URL.createObjectURL(new Blob([Math.floor(256*Math.random())])))||l.click())(document.createElement("a")));

cause my browser to lag severely? 69.5.123.121 (talk) 23:28, 23 October 2019 (UTC)[reply]

while(1) is an infinite loop. RudolfRed (talk) 23:59, 23 October 2019 (UTC)[reply]
There are times when an infinite loop is OK, if you check for some condition that causes you to exit the loop. But even then, you want to ensure that you don't keep checking too often, as that can take up too much CPU time. In such cases, a WAIT statement or equivalent can slow it down to a reasonable portion of the CPU time available. SinisterLefty (talk) 02:20, 24 October 2019 (UTC)[reply]
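The exit-condition-plus-wait pattern described above can be sketched in Python (a stand-in for the JavaScript in question):

```python
import time

# A condition-checked loop that avoids hogging the CPU: the body sleeps
# between polls instead of spinning. A simple counter stands in for
# whatever the real exit test would be.
polls = 0
while True:              # same shape as while(1)
    polls += 1
    if polls >= 3:       # the exit condition makes this loop finite
        break
    time.sleep(0.01)     # the "WAIT" that keeps CPU usage reasonable

print(polls)  # 3
```

The original one-liner has neither an exit condition nor a wait, so it spins at full speed, which is why the browser lags.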
Just to clarify the terminology: it's only an infinite loop if there's no way for it to exit. Writing "while(1)" at the top of a loop may produce an infinite loop, but it certainly doesn't have to. --76.69.116.4 (talk) 05:08, 27 October 2019 (UTC)[reply]
blanked potentially unsafe code

also caused severe lag but also prints out something in Chinese over and over. What's going on? I don't see any while statement in this one. 69.5.123.43 (talk) 11:09, 27 October 2019 (UTC)[reply]

Why are you asking here and who is providing this code to you and why are you executing code you do not understand ? Note to other editors. Don't EVER execute any code that you don't know or don't understand, it might be a potential hacking attempt by someone trying to get access to your Wikipedia account. —TheDJ (talkcontribs) 12:07, 28 October 2019 (UTC)[reply]

October 24

Keyboard remapping in Win 7

I would like to remap a single key to produce multiple keystrokes, such as having "." on the numeric keypad produce ".com". Do any keyboard mapping programs allow this ? Also, could I remap a key to do that and click on the mouse, too ? SinisterLefty (talk) 02:51, 24 October 2019 (UTC)[reply]

AutoHotkey (documentation) lets you create hotkeys to type certain lines of text, run a program or batch file, or create shortcuts for specific programs. You can even have one hotkey perform multiple actions in a series.[1]. DroneB (talk) 09:29, 24 October 2019 (UTC)[reply]
Thanks. Any other options to consider ? SinisterLefty (talk) 05:55, 28 October 2019 (UTC)[reply]
Just learn AHK :) It's so versatile I can't imagine using Windows without it. 93.136.155.134 (talk) 00:28, 29 October 2019 (UTC)[reply]
Here's a working script that prints ".com" when pressing ".":
~.::
Send com
Return  — Preceding unsigned comment added by 93.136.155.134 (talk) 00:35, 29 October 2019 (UTC)[reply] 

October 26

Dual GPU and Dual CPU: Should I put the two GPUs on the same CPU or on different CPUs?

I am planning a PC for use as a CAD workstation.
The first priority is CAD, but It wouldn't break my heart if it played games well.
My question: Should I put the two GPUs on the same CPU or on different CPUs?
I have searched, but could not find anyone who has tried it both ways.

Specs:
Gigabyte C621-WD12 motherboard.
(Half the PCIE stots are on CPU1, the other half are on CPU2)
Two Xeon Gold 6244 processors.
Two Nvidia Quadro RTX 8000 GPUs.
192GB RAM.
--Guy Macon (talk) 01:34, 26 October 2019 (UTC)[reply]

I would guess that 1 GPU per CPU would work out better. But get those stoats out of there before they chew through any wires ! :-) SinisterLefty (talk) 02:02, 26 October 2019 (UTC)[reply]
It's hard to make an educated guess. If the part of the program that issues video commands runs on one CPU, I would guess that same CPU would win. If multiple threads are issuing video commands, I would guess that separate CPUs would win. Then there is the aspect of some programs offloading computation work to the GPUs, which might act differently from using the GPUs to generate video. --Guy Macon (talk) 04:03, 26 October 2019 (UTC)[reply]
I have experience with dual CPUs but not dual graphics cards. You might read this. It wouldn't be hard to try both setups and benchmark. Bubba73 You talkin' to me? 02:17, 26 October 2019 (UTC)[reply]
I had just read that before posting here. Unlike the case with gaming, we know that multiple Nvidia Quadro cards (the Quadro is optimized for CAD) are always worth having when you are doing high-end CAD work. I was surprised that nobody seems to have ever compared the dual-GPU-same-CPU and dual-GPU-different-CPU configurations for gaming or CAD. --Guy Macon (talk) 04:03, 26 October 2019 (UTC)[reply]
They mention 3D gaming, but not CAD, so I didn't know if the program could benefit from dual GPUs. Bubba73 You talkin' to me? 04:30, 26 October 2019 (UTC)[reply]
Presumably the same caveats apply as to games. That is, rotating a 3D shaded model could benefit from 2 GPUs if the CAD system is smart enough to have each GPU render every other frame, but if poorly coordinated, it could end up worse than one GPU. Thus, you would need to look at reviews for that particular CAD system to see how well it handles dual GPUs. Surprisingly, they didn't mention the problem of heat. At the 500 watts mentioned, that's a lot of heat to dissipate. The onboard fans also look like they would be less effective if the GPUs were in adjacent slots. SinisterLefty (talk) 13:14, 26 October 2019 (UTC)[reply]
Heat and CAD is an interesting trade-off. A lot of times you will look at a CAD card like the Quadro and see that there is a gaming card with similar specs for a third of the price. But the first time you tell your PC to spend all night rendering or autorouting you see the difference; the gaming GPU throttles as it overheats while the CAD GPU stays at full speed. Drivers are also an interesting aspect. There are drivers that are optimized for the major CAD GPUs included with most high-end mechanical CAD systems, and they invariably work well when you put in multiple cards. Alas, I work with electronic CAD, and driver support for autorouting PC Boards is often a bit spotty compared to driver software for rendering mechanical designs.
Once all the parts arrive and I put together the new system, I will try various configurations and see what performs best. Other trade-offs are crazy expensive to investigate. For example, consider these two CPU choices:
  • Xeon Gold 6244: $2925.00 ea., 8 cores/16 threads, 3.60 GHz all cores, 4.40 GHz single core
  • Xeon Gold 6240Y: $2726.00 ea., 18 cores/36 threads, 2.60 GHz all cores, 3.90 GHz single core
It would cost me an extra $6000 to try both in a dual CPU system. On my current system what I do usually ends up running on 10-20 cores, so 16 cores at 3.60 GHz seems like a better choice than 36 cores at 2.60 GHz. But that is just a guess. Maybe the CAD will use a lot more cores if they are available.
Granted, the money people are willing to pay for a good PC board design makes these kinds of high-end processors pay for themselves, but is whatever benefit I might get from knowing which CPU is best instead of guessing worth $6K? No. --Guy Macon (talk) 17:51, 26 October 2019 (UTC)[reply]
OK, so you don't already have the hardware. And the best number of cores to run on can be tricky. If it is CPU-bound, my guess is that the higher number of cores at a slower speed would be best, if it can use them all. If memory bandwidth is a bottleneck, then likely the reverse. I was testing programs on systems with two 8-core Xeons (hyperthreaded), running different numbers of threads. On CPU-intensive stuff, 32 threads was the best. But on memory-intensive stuff, performance went down with more than 16 threads because of non-uniform memory access from one Xeon to the memory of the other. Bubba73 You talkin' to me? 21:46, 26 October 2019 (UTC)[reply]
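The kind of thread-count experiment described above can be sketched in a few lines. This is a hypothetical Python outline (the task and sizes are arbitrary, not the actual benchmark referred to) that times a CPU-bound job at several worker counts; on a dual-socket box you would substitute a memory-bound task to expose the NUMA penalty:

```python
# Minimal core-scaling benchmark sketch: time the same batch of
# CPU-bound jobs at several worker counts and compare elapsed times.
import time
from concurrent.futures import ProcessPoolExecutor

def cpu_task(n):
    # Pure-CPU work with no memory pressure: sum of squares 0..n-1.
    return sum(i * i for i in range(n))

def run(workers, jobs=8, n=200_000):
    # Farm `jobs` identical tasks out to `workers` processes.
    start = time.perf_counter()
    with ProcessPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(cpu_task, [n] * jobs))
    return time.perf_counter() - start, results

if __name__ == "__main__":
    for w in (1, 2, 4):
        elapsed, _ = run(w)
        print(f"{w} workers: {elapsed:.2f}s")
```

The same harness can be pointed at a memory-bound task (e.g. streaming over a large array) to see the cross-socket falloff described above.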
At those prices, it's easy to see why testers don't try out every possible configuration and report the results.
Are you able to put the GPUs as far apart on the same CPU as on different ones ? If not, then the heat issue may well make the difference.
Also, if you want to do rendering while away from the PC, rather than in real time, you might do better with separate PCs entirely (2 or more, including your current system as one). You could go with lower-priced PCs and run multiple renderings at once (one per PC). We really don't know how well your CAD system handles multiple GPUs, whether on the same CPU or not, but we can guarantee that you can use two GPUs on two different PCs without any kind of "collision". Plus you wouldn't be completely dead in the water if a PC dies on you, and the heat from a pair of GPUs is less of an issue in two different boxes. However, you would need to check whether you would be required to buy extra licenses for the CAD system, or whether it allows a few copies at the same site for this purpose. You would also need a network that can transfer the completed renderings quickly, but I am guessing you already have that (presumably storing backups on a standalone hard drive). A KVM switch would eliminate the need to buy duplicate monitors, keyboards, and mice. SinisterLefty (talk) 19:52, 26 October 2019 (UTC)[reply]
I will look into that. I have been using "rendering" as a verbal shorthand for "stuff the GPU does at night while I sleep", but in reality what it is doing is constantly ripping up and retrying different designs for printed circuit boards. I often end up autorouting a four-layer version and a six layer version and then running electrical simulation software on each to see what my noise margins are. Two PCs would be great for doing that. Good suggestion. --Guy Macon (talk) 20:58, 26 October 2019 (UTC)[reply]
Guy Macon, I strongly recommend each GPU be driven by its own CPU. This allows both GPUs to be serviced at the same time, with each CPU handling its GPU independently.
Note: I am not aware whether Windows can actually take advantage of this. I do know that Linux can (and does so quite well with NVIDIA's proprietary drivers... when they work, anyway). MoonyTheDwarf (Braden N.) (talk) 18:12, 26 October 2019 (UTC)[reply]
Further note: This does depend on what software you're using. The software itself has to be smart enough to drive the GPUs independently. If it is not, then it will make minimal difference. MoonyTheDwarf (Braden N.) (talk) 18:13, 26 October 2019 (UTC)[reply]
Thanks! I was thinking the same thing. A lot of the time it turns out that the GPU ends up loafing because the CPU can't feed it work fast enough. I have used a lot of multicore PCs, but I haven't used multiple processors since the Pentium Pro days, and I have never used multiple GPUs on a system I own (they are common on engineering workstations in places I have worked). Hey, nothing like relearning everything all over again! If only I could erase what I know about 6502 assembly language on a Commodore 64 to make room for the new stuff... :)   BTW, I spend around 25% of my time in Windows 10 and 75% in Slackware Linux, except when I am on a job site in China, where I do everything using Tails Linux on a locally-purchased PC. --Guy Macon (talk) 20:46, 26 October 2019 (UTC)[reply]

The Quadro isn't that much different from the GTX/RTX aside from being a lot more expensive and omitting the bogus software license restrictions that supposedly forbid using the gaming versions in a data center. If you're building this workstation for home, that won't affect you. And with that much cash going to the CAD system, you can afford a separate box for gaming, which will make life simpler in terms of software hygiene etc. Finally, unless you're dead set on that specific hardware, check the new AMD stuff, including the forthcoming Threadrippers that should be announced in the next few weeks. The GPUs are another matter: the AMD hardware is getting competitive with Nvidia again, but the software isn't really there yet other than for gaming. 173.228.123.207 (talk) 01:35, 27 October 2019 (UTC)[reply]

A lot of people report a different experience. See https://forums.tomshardware.com/threads/solidworks-gaming-pc.789000/#post-6443286 as one example. I personally have experienced the "call the CAD vendor tech support, get told 'call us back when you are running on approved hardware and drivers'" effect. I am not a big gamer. A bit of Minecraft or Stockfish, maybe, when I am dead in the water waiting for something to happen on the job. But when I do electronic design, it pays well enough to make it worth my while to not only have the best CAD workstation I can get, but to have an identical spare system and good backups so I can switch over in less than half an hour. No, it is Nvidia Quadro, Intel Xeon, and a Gigabyte motherboard optimized for CAD instead of gaming for me. --Guy Macon (talk) 06:15, 27 October 2019 (UTC)[reply]
(EC, written before Guy Macon's latest reply) While it's true that most of the differences between the Nvidia Quadro and Nvidia GeForce lines arise from market segmentation, you can't get ECC except on Quadro. (Although I believe the RTX 8000 only has it on DRAM, see [2] and the later link.) Since the OP appears to be planning on using the cards for calculations where accuracy of results matters to them, rather than just using them for display or stuff where you don't care about a rare error (like mining), ECC is likely to be of interest, although I suspect the OP already knows this. The RTX 8000 also generally comes with 48 GiB, I believe. The most you can get with a GeForce card is 24 GiB (in the Titan RTX, which is almost the same thing as the RTX 6000 except for ECC and other market segmentation differences). Certain Quadros, although not the ones the OP is looking at, also have far better double-precision floating point performance [3]. Finally, AFAIK it's very difficult, if possible at all, to use the Quadro drivers with GeForce cards, even the Titan RTX. You used to be able to either hack the drivers or flash a Quadro BIOS, but I think this generally doesn't work nowadays, probably because there are enough differences, unlike in the past where the cards were often really the same thing. A fair amount of workstation software, including CAD software, is built around the professional cards and drivers, with minimal testing and support and definitely no certification for the GeForce drivers and cards. Nvidia may also artificially limit features on GeForce drivers (see [4] for one example that is no longer correct and is irrelevant to the OP). I'm not sure if this applies to the OP's current plans, but it's something they've brought up before, so I wouldn't be surprised if it does. Actually, there's a reasonable chance some AMD professional cards will be a better bet than GeForce ones for certain use cases (where CUDA support doesn't matter, obviously).
Don't get me wrong, I'm not saying you should never use GeForce cards in a non-gaming setting. There are plenty of cases where it makes sense, even more so if you're just doing something on the side. Also, if interacting directly with the card, the drivers probably aren't quite as important. But the OP's comments suggest this doesn't apply to them, and the savings from spending half as much (or whatever) on the GPUs probably aren't worth the risks and potential pitfalls. (I'm confident the OP already knows this, but I felt it may be helpful to explain why it probably doesn't make sense for them.) Nil Einne (talk) 06:51, 27 October 2019 (UTC)[reply]
The above describes my thinking pretty much exactly. I never really looked at the specs on the gaming cards -- I knew that the driver support isn't there -- and thus didn't notice the lack of ECC, which alone would be a deal breaker for me. --Guy Macon (talk) 07:14, 27 October 2019 (UTC)[reply]

October 28

Has Yahoo been hacked?

I've lately received strange e-mails. They look like they're coming from people who I'm connected with on Facebook or LinkedIn, but the contents consist of a suspicious-looking link to a URL shortening service, which I think leads to some malicious site. There's no personal message, just the link. These e-mails look like they're coming from people with yahoo.es addresses. Has Yahoo been hacked? Or why do I keep getting these e-mails, seemingly from people I know, with yahoo.es addresses? JIP | Talk 11:05, 28 October 2019 (UTC)[reply]

The sender address is one of the most easily spoofed parts of an e-mail message. IMHO it is more likely that somebody harvested FB for correlations between email addresses (e.g. from lists of friends and watchers). If you suspect the messages are malicious, you did exactly right by not following the links. You can also ask the apparent sender(s) whether they actually sent you those messages or not. But take care to use an independent channel, i.e. not the email address from the suspicious messages. --CiaPan (talk) 11:36, 28 October 2019 (UTC)[reply]
Facebook and LinkedIn do not, by default, protect your privacy. If I look at your public profile, it will list some of your friends, with their names. So I can easily use that to put a name you know in the sender field. As noted, "sender" is easy to fake because it isn't checked; whatever you put in there is what is sent. This is like sending a letter in the mail: I can put anything I like in the upper left corner, and the post office won't reject it. 135.84.167.41 (talk) 13:06, 28 October 2019 (UTC)[reply]
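As an illustration of how little the mail format itself checks, here is a minimal Python sketch (the addresses and link are made-up placeholders, not taken from the messages in question) showing that From: is just a string the author fills in; it only builds a message in memory, nothing is sent:

```python
# Build an e-mail message with a forged From: header.
# The header is plain text under the author's control; no part of
# message construction verifies it against the real sender.
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "Your Friend <friend@yahoo.es>"  # forged, arbitrary string
msg["To"] = "victim@example.com"               # placeholder recipient
msg["Subject"] = "hi"
msg.set_content("http://short.example/abc")    # placeholder shortened link

print(msg["From"])  # the recipient's client displays whatever was written here
```

A receiving SMTP server relays whatever headers it is handed, which is why anti-spoofing relies on mechanisms such as SPF and DKIM rather than on the From: line itself.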
I know the sender field can be faked. That's why people get all these "I'm the widow of the deceased Nigerian prince" e-mails. But I was curious about where the people sending these e-mails got the names of the people I'm connected with. JIP | Talk 13:22, 28 October 2019 (UTC)[reply]
Some websites (I am looking at you, Facebook) sell lists of who you are connected to. These sites (still looking at you, Facebook) often gather this info about you and sell it even if you have never personally accessed their website. --Guy Macon (talk) 15:32, 28 October 2019 (UTC)[reply]