Portal:Computer programming
The Computer Programming Portal
Computer programming or coding is the composition of sequences of instructions, called programs, that computers can follow to perform tasks. It involves designing and implementing algorithms, step-by-step specifications of procedures, by writing code in one or more programming languages. Programmers typically use high-level programming languages that are more easily intelligible to humans than machine code, which is directly executed by the central processing unit. Proficient programming usually requires expertise in several different subjects, including knowledge of the application domain, details of programming languages and generic code libraries, specialized algorithms, and formal logic.
Auxiliary tasks accompanying and related to programming include analyzing requirements, testing, debugging (investigating and fixing problems), implementation of build systems, and management of derived artifacts, such as programs' machine code. While these are sometimes considered programming, often the term software development is used for this larger overall process – with the terms programming, implementation, and coding reserved for the writing and editing of code per se. Sometimes software development is known as software engineering, especially when it employs formal methods or follows an engineering design process. (Full article...)
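As a concrete illustration of these ideas, here is a minimal sketch in Python of one classic algorithm, Euclid's method for the greatest common divisor, written as a sequence of instructions a computer can follow. The choice of algorithm and language is ours, for illustration only.

```python
# Euclid's algorithm: a step-by-step procedure (an algorithm)
# expressed as source code in a high-level language.
def gcd(a: int, b: int) -> int:
    """Return the greatest common divisor of a and b."""
    while b != 0:        # repeat until the remainder is zero
        a, b = b, a % b  # replace (a, b) with (b, a mod b)
    return a

print(gcd(48, 18))  # -> 6
```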
Selected articles
-
Image 1
A Blender screenshot displaying Suzanne, a 3D test model
Computer graphics deals with generating images and art with the aid of computers. Computer graphics is a core technology in digital photography, film, video games, digital art, cell phone and computer displays, and many specialized applications. A great deal of specialized hardware and software has been developed, with the displays of most devices being driven by computer graphics hardware. It is a vast and recently developed area of computer science. The phrase was coined in 1960 by computer graphics researchers Verne Hudson and William Fetter of Boeing. It is often abbreviated as CG, or typically in the context of film as computer generated imagery (CGI). The non-artistic aspects of computer graphics are the subject of computer science research.
Some topics in computer graphics include user interface design, sprite graphics, raster graphics, rendering, ray tracing, geometry processing, computer animation, vector graphics, 3D modeling, shaders, GPU design, implicit surfaces, visualization, scientific computing, image processing, computational photography, scientific visualization, computational geometry and computer vision, among others. The overall methodology depends heavily on the underlying sciences of geometry, optics, physics, and perception.
Computer graphics is responsible for displaying art and image data effectively and meaningfully to the consumer. It is also used for processing image data received from the physical world, such as photo and video content. The development of computer graphics has had a significant impact on many types of media and has revolutionized animation, movies, advertising, and video games. (Full article...)
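As a toy illustration of raster graphics (computing an image pixel by pixel), the Python sketch below writes a small color gradient in the plain-text PPM format. This is a minimal sketch only; real graphics pipelines rely on GPUs and dedicated graphics APIs.

```python
# Toy raster graphics: compute each pixel's color and write a
# small gradient image in the simple ASCII PPM format.
WIDTH, HEIGHT = 256, 128

with open("gradient.ppm", "w") as f:
    f.write(f"P3\n{WIDTH} {HEIGHT}\n255\n")  # header: ASCII RGB, max 255
    for y in range(HEIGHT):
        for x in range(WIDTH):
            r = x        # red grows left to right
            g = 2 * y    # green grows top to bottom
            b = 128      # constant blue component
            f.write(f"{r} {g} {b}\n")
```
-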
Image 2
MATLAB (an abbreviation of "MATrix LABoratory") is a proprietary multi-paradigm programming language and numeric computing environment developed by MathWorks. MATLAB allows matrix manipulations, plotting of functions and data, implementation of algorithms, creation of user interfaces, and interfacing with programs written in other languages.
Although MATLAB is intended primarily for numeric computing, an optional toolbox uses the MuPAD symbolic engine allowing access to symbolic computing abilities. An additional package, Simulink, adds graphical multi-domain simulation and model-based design for dynamic and embedded systems.
As of 2020, MATLAB has more than four million users worldwide, drawn from backgrounds in engineering, science, and economics. As of 2017, more than 5,000 colleges and universities around the world use MATLAB to support instruction and research. (Full article...)
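MATLAB syntax itself is not shown here; as a rough analogue in Python with NumPy (the single sketch language used on this page), the example below performs the kind of matrix manipulation the excerpt describes: building matrices and solving a linear system.

```python
# A rough analogue, in Python/NumPy rather than MATLAB syntax, of
# basic matrix work: construct a matrix and solve A @ x = b.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

x = np.linalg.solve(A, b)  # solve the linear system A @ x = b
print(x)                   # -> [0.8 1.4]
```
-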
Image 3
The Greek lowercase omega (ω) character is used to represent Null in database theory.
In SQL, null or NULL is a special marker used to indicate that a data value does not exist in the database. Introduced by the creator of the relational database model, E. F. Codd, SQL null serves to fulfill the requirement that all true relational database management systems (RDBMS) support a representation of "missing information and inapplicable information". Codd also introduced the use of the lowercase Greek omega (ω) symbol to represent null in database theory. In SQL, NULL is a reserved word used to identify this marker.
A null should not be confused with a value of 0. A null indicates a lack of a value, which is not the same as a zero value. For example, consider the question "How many books does Adam own?" The answer may be "zero" (we know that he owns none) or "null" (we do not know how many he owns). In a database table, the column reporting this answer would start with no value (marked by null), and it would not be updated with the value zero until it is ascertained that Adam owns no books.
In SQL, null is a marker, not a value. This usage differs from most programming languages, where a null value of a reference means it does not point to any object. (Full article...)
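The books example can be made concrete. In the sketch below, Python's None stands in for SQL's null marker; this is only an analogy, since SQL's three-valued logic has no exact counterpart in Python.

```python
# None as a stand-in for SQL NULL: "unknown" is not the same as zero.
books_owned = None          # we do not know how many books Adam owns

print(books_owned == 0)     # False: unknown is not zero
print(books_owned is None)  # True: the value is missing

books_owned = 0             # now we know he owns none
print(books_owned == 0)     # True
```
-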
Image 4
JavaScript (/ˈdʒɑːvəskrɪpt/), often abbreviated as JS, is a programming language and core technology of the World Wide Web, alongside HTML and CSS. Ninety-nine percent of websites use JavaScript on the client side for webpage behavior.
Web browsers have a dedicated JavaScript engine that executes the client code. These engines are also utilized in some servers and a variety of apps. The most popular runtime system for non-browser usage is Node.js.
JavaScript is a high-level, often just-in-time–compiled language that conforms to the ECMAScript standard. It has dynamic typing, prototype-based object-orientation, and first-class functions. It is multi-paradigm, supporting event-driven, functional, and imperative programming styles. It has application programming interfaces (APIs) for working with text, dates, regular expressions, standard data structures, and the Document Object Model (DOM). (Full article...)
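JavaScript itself is not shown here; the Python sketch below (Python being the single language used for sketches on this page) illustrates two of the listed features that the two languages share: first-class functions and dynamic typing.

```python
# First-class functions: a function is an ordinary value that can be
# created, stored in a variable, passed around, and returned.
def make_greeter(greeting):
    def greet(name):
        return f"{greeting}, {name}!"
    return greet

hello = make_greeter("Hello")  # a function held in a variable
print(hello("world"))          # -> Hello, world!

x = 42           # dynamic typing: the same name may later
x = "forty-two"  # refer to a value of a different type
```
-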
Image 5
Yoshua Bengio OC FRS FRSC (born March 5, 1964) is a Canadian-French computer scientist, and a pioneer of artificial neural networks and deep learning. He is a professor at the Université de Montréal and scientific director of the AI institute MILA.
Bengio received the 2018 ACM A.M. Turing Award, often referred to as the "Nobel Prize of Computing", together with Geoffrey Hinton and Yann LeCun, for their foundational work on deep learning. Bengio, Hinton, and LeCun are sometimes referred to as the "Godfathers of AI". Bengio is the most-cited computer scientist globally (by both total citations and by h-index), and the most-cited living scientist across all fields (by total citations). In 2024, TIME Magazine included Bengio in its yearly list of the world's 100 most influential people. (Full article...) -
Image 6
Allen at the Flying Heritage Collection in 2013
Paul Gardner Allen (January 21, 1953 – October 15, 2018) was an American businessman, computer programmer, and investor. He co-founded Microsoft Corporation with his childhood friend Bill Gates in 1975, which was followed by the microcomputer revolution of the 1970s and 1980s. Allen was ranked as one of the richest people in American history by Forbes with an estimated net worth of $20.3 billion at the time of his death in October 2018.
Allen stepped away from day-to-day work at Microsoft in early 1983 after a Hodgkin lymphoma diagnosis, remaining on its board as vice-chairman. He and his sister, Jody Allen, founded Vulcan Inc. in 1986, a privately held company that managed his business and philanthropic efforts. At the time of his death, he had a multi-billion-dollar investment portfolio, including technology and media companies, scientific research, real estate holdings, private space flight ventures, and stakes in other sectors. He owned the Seattle Seahawks of the National Football League and the Portland Trail Blazers of the National Basketball Association, and was part-owner of the Seattle Sounders FC of Major League Soccer. Under Allen's ownership, the Seahawks won Super Bowl XLVIII and reached two other Super Bowls (XL and XLIX). In 2000 he resigned from his position on Microsoft's board and assumed the post of senior strategy advisor to the company's management team.
Allen founded the Allen Institutes for Brain Science, Artificial Intelligence, and Cell Science, as well as companies like Stratolaunch Systems and Apex Learning. He gave more than $2 billion to causes such as education, wildlife and environmental conservation, the arts, healthcare, and community services. In 2004, he funded the first crewed private spaceplane with SpaceShipOne. He received numerous awards and honors, and was listed among the Time 100 Most Influential People in the World in 2007 and 2008. (Full article...) -
Image 7
Source code for a computer program written in the JavaScript language. It demonstrates the appendChild method. The method adds a new child node to an existing parent node. It is commonly used to dynamically modify the structure of an HTML document.
A computer program is a sequence or set of instructions in a programming language for a computer to execute. It is one component of software, which also includes documentation and other intangible components.
A computer program in its human-readable form is called source code. Source code needs another computer program to execute because computers can only execute their native machine instructions. Therefore, source code may be translated to machine instructions using a compiler written for the language. (Assembly language programs are translated using an assembler.) The resulting file is called an executable. Alternatively, source code may execute within an interpreter written for the language.
When an executable is requested for execution, the operating system loads it into memory and starts a process. The central processing unit will soon switch to this process so it can fetch, decode, and then execute each machine instruction. (Full article...)
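The path from source text to execution can be glimpsed even within a single language. The hedged Python sketch below uses the built-in compile() and exec() to turn source code (a string) into an executable form and run it; a native compiler works analogously but emits machine instructions.

```python
# Source code is only text until another program translates and runs it.
source = "print(sum(range(10)))"                 # human-readable source

code_object = compile(source, "<demo>", "exec")  # translate to bytecode
exec(code_object)                                # execute it -> prints 45
```
-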
Image 8
The history of artificial intelligence (AI) began in antiquity, with myths, stories, and rumors of artificial beings endowed with intelligence or consciousness by master craftsmen. The study of logic and formal reasoning from antiquity to the present led directly to the invention of the programmable digital computer in the 1940s, a machine based on abstract mathematical reasoning. This device and the ideas behind it inspired scientists to begin discussing the possibility of building an electronic brain.
The field of AI research was founded at a workshop held on the campus of Dartmouth College in 1956. Attendees of the workshop became the leaders of AI research for decades. Many of them predicted that machines as intelligent as humans would exist within a generation. The U.S. government provided millions of dollars with the hope of making this vision come true.
Eventually, it became obvious that researchers had grossly underestimated the difficulty of this feat. In 1974, criticism from James Lighthill and pressure from the U.S. Congress led the U.S. and British governments to stop funding undirected research into artificial intelligence. Seven years later, a visionary initiative by the Japanese government and the success of expert systems reinvigorated investment in AI, and by the late 1980s, the industry had grown into a billion-dollar enterprise. However, investors' enthusiasm waned in the 1990s, and the field was criticized in the press and avoided by industry (a period known as an "AI winter"). Nevertheless, research and funding continued to grow under other names. (Full article...) -
Image 9
Portion of the calculating machine with a printing mechanism of the analytical engine, built by Charles Babbage, as displayed at the Science Museum (London)
The analytical engine was a proposed digital mechanical general-purpose computer designed by English mathematician and computer pioneer Charles Babbage. It was first described in 1837 as the successor to Babbage's difference engine, which was a design for a simpler mechanical calculator.
The analytical engine incorporated an arithmetic logic unit, control flow in the form of conditional branching and loops, and integrated memory, making it the first design for a general-purpose computer that could be described in modern terms as Turing-complete. In other words, the structure of the analytical engine was essentially the same as that which has dominated computer design in the electronic era. The analytical engine is one of Charles Babbage's most celebrated achievements.
Babbage was never able to complete construction of any of his machines due to conflicts with his chief engineer and inadequate funding. It was not until 1941 that Konrad Zuse built the first general-purpose computer, Z3, more than a century after Babbage had proposed the pioneering analytical engine in 1837. (Full article...) -
Image 10
In computer systems, a loader is the part of an operating system that is responsible for loading programs and libraries. It is one of the essential stages in the process of starting a program, as it places programs into memory and prepares them for execution. Loading a program involves either memory-mapping or copying the contents of the executable file containing the program instructions into memory, and then carrying out other required preparatory tasks to prepare the executable for running. Once loading is complete, the operating system starts the program by passing control to the loaded program code.
All operating systems that support program loading have loaders, apart from highly specialized computer systems that only have a fixed set of specialized programs. Embedded systems typically do not have loaders, and instead, the code executes directly from ROM or similar. In order to load the operating system itself, as part of booting, a specialized boot loader is used. In many operating systems, the loader resides permanently in memory, though some operating systems that support virtual memory may allow the loader to be located in a region of memory that is pageable.
In the case of operating systems that support virtual memory, the loader may not actually copy the contents of executable files into memory, but rather may simply declare to the virtual memory subsystem that there is a mapping between a region of memory allocated to contain the running program's code and the contents of the associated executable file. (See memory-mapped file.) The virtual memory subsystem is then made aware that pages within that region of memory need to be filled on demand if and when program execution actually touches those areas of unfilled memory. This may mean parts of a program's code are not copied into memory until they are actually used, and unused code may never be loaded into memory at all. (Full article...)
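The memory-mapping idea in the last paragraph can be demonstrated with Python's mmap module: the file's bytes are not eagerly copied, and the operating system fills pages in as they are touched. A minimal sketch; the file name is invented for the example.

```python
# Map a file into memory instead of copying it up front; the operating
# system fills pages on demand as they are accessed.
import mmap

with open("example.bin", "wb") as f:    # create a small demo file
    f.write(b"hello, loader" + b"\x00" * 4096)

with open("example.bin", "rb") as f:
    with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
        print(mm[:13])  # touching this range faults its page into memory
```
-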
Image 11
Charles Babbage KH FRS (/ˈbæbɪdʒ/; 26 December 1791 – 18 October 1871) was an English polymath. A mathematician, philosopher, inventor and mechanical engineer, Babbage originated the concept of a digital programmable computer.
Babbage is considered by some to merit the title of "father of the computer". He is credited with inventing the first mechanical computer, the difference engine, which eventually led to more complex electronic designs, though all the essential ideas of modern computers are to be found in his analytical engine, programmed using a principle openly borrowed from the Jacquard loom. As part of his computer work, he also designed the first computer printers. In addition to his work on computers, he had a broad range of interests, reflected in his 1832 book On the Economy of Machinery and Manufactures. He was an important figure in the social scene in London, and is credited with importing the "scientific soirée" from France with his well-attended Saturday evening soirées. His varied work in other fields has led him to be described as "pre-eminent" among the many polymaths of his century.
Babbage, who died before many of his designs, including his Difference Engine and Analytical Engine, could be successfully engineered, remained a prominent figure in the conceptual development of computing. Parts of his incomplete mechanisms are on display in the Science Museum in London. In 1991, a functioning difference engine was constructed from the original plans. Built to tolerances achievable in the 19th century, the success of the finished engine indicated that Babbage's machine would have worked. (Full article...) -
Image 12
D, also known as dlang, is a multi-paradigm system programming language created by Walter Bright at Digital Mars and released in 2001. Andrei Alexandrescu joined the design and development effort in 2007. Though it originated as a re-engineering of C++, D is now a very different language. As it has developed, it has drawn inspiration from other high-level programming languages. Notably, it has been influenced by Java, Python, Ruby, C#, and Eiffel.
The D language reference describes it as follows: (Full article...) -
Image 13
COBOL (/ˈkoʊbɒl, -bɔːl/; an acronym for "common business-oriented language") is a compiled English-like computer programming language designed for business use. It is an imperative, procedural, and, since 2002, object-oriented language. COBOL is primarily used in business, finance, and administrative systems for companies and governments. COBOL is still widely used in applications deployed on mainframe computers, such as large-scale batch and transaction processing jobs. Many large financial institutions were developing new systems in the language as late as 2006, but most programming in COBOL today is purely to maintain existing applications. Programs are being moved to new platforms, rewritten in modern languages, or replaced with other software.
COBOL was designed in 1959 by CODASYL and was partly based on the programming language FLOW-MATIC, designed by Grace Hopper. It was created as part of a U.S. Department of Defense effort to create a portable programming language for data processing. It was originally seen as a stopgap, but the Defense Department promptly pressured computer manufacturers to provide it, resulting in its widespread adoption. It was standardized in 1968 and has been revised five times. Expansions include support for structured and object-oriented programming. The current standard is ISO/IEC 1989:2023.
COBOL statements have prose syntax such as MOVE x TO y, which was designed to be self-documenting and highly readable. However, it is verbose and uses over 300 reserved words, compared to the succinct and mathematically inspired syntax of other languages. (Full article...) -
Image 14
Computer class at Chkalovski Village School No. 2 in 1985–1986
The history of computing in the Soviet Union began in the late 1940s, when the country began to develop its Small Electronic Calculating Machine (MESM) at the Kiev Institute of Electrotechnology in Feofaniya. Initial ideological opposition to cybernetics in the Soviet Union was overcome by a Khrushchev era policy that encouraged computer production.
By the early 1970s, the uncoordinated work of competing government ministries had left the Soviet computer industry in disarray. Owing to a lack of common standards for peripherals and insufficient digital storage capacity, the Soviet Union's technology lagged significantly behind the Western semiconductor industry. The Soviet government decided to abandon development of original computer designs and encouraged cloning of existing Western systems (e.g. the 1801 CPU series was scrapped in favor of the PDP-11 ISA by the early 1980s).
Soviet industry was unable to mass-produce computers to acceptable quality standards, and locally manufactured copies of Western hardware were unreliable. As personal computers spread to industries and offices in the West, the Soviet Union's technological lag increased. (Full article...) -
Image 15
F# (pronounced F sharp) is a general-purpose, high-level, strongly typed, multi-paradigm programming language that encompasses functional, imperative, and object-oriented programming methods. It is most often used as a cross-platform Common Language Infrastructure (CLI) language on .NET, but can also generate JavaScript and graphics processing unit (GPU) code.
F# is developed by the F# Software Foundation, Microsoft and open contributors. An open source, cross-platform compiler for F# is available from the F# Software Foundation. F# is a fully supported language in Visual Studio and JetBrains Rider. Plug-ins supporting F# exist for many widely used editors including Visual Studio Code, Vim, and Emacs.
F# is a member of the ML language family and originated as a .NET Framework implementation of a core of the programming language OCaml. It has also been influenced by C#, Python, Haskell, Scala, and Erlang. (Full article...)
Selected images
-
Image 1
This image (when viewed in full size, 1000 pixels wide) contains 1 million pixels, each of a different color.
-
Image 2
Stephen Wolfram is a British-American computer scientist, physicist, and businessman. He is known for his work in computer science, mathematics, and theoretical physics.
-
Image 4
A head crash on a modern hard disk drive
-
Image 5
A lone house. An image made using Blender 3D.
-
Image 6
GNOME Shell, GNOME Clocks, Evince, gThumb and GNOME Files at version 3.30, in a dark theme
-
Image 9
A view of the GNU nano text editor, version 6.0
-
Image 10
Partial view of the Mandelbrot set. Step 1 of a zoom sequence: the gap between the "head" and the "body", also called the "seahorse valley".
-
Image 11
Ada Lovelace was an English mathematician and writer, chiefly known for her work on Charles Babbage's proposed mechanical general-purpose computer, the Analytical Engine. She was the first to recognize that the machine had applications beyond pure calculation, and she published the first algorithm intended to be carried out by such a machine. As a result, she is often regarded as the first computer programmer.
-
Image 12
Output from a (linearised) shallow water equation model of water in a bathtub. The water experiences five splashes, which generate surface gravity waves that propagate away from the splash locations and reflect off the bathtub walls.
-
Image 13
Partial map of the Internet based on the January 15, 2005 data found on opte.org. Each line is drawn between two nodes, representing two IP addresses. The length of the lines is indicative of the delay between those two nodes. This graph represents less than 30% of the Class C networks reachable by the data collection program in early 2005.
-
Image 14
An IBM Port-A-Punch punched card
-
Image 15
Grace Hopper at the UNIVAC keyboard, c. 1960. Grace Brewster Murray Hopper was an American mathematician and rear admiral in the U.S. Navy who was a pioneer in developing computer technology, helping to devise UNIVAC I, the first commercial electronic computer, and naval applications for COBOL (common business-oriented language).
-
Image 16
Margaret Hamilton standing next to the navigation software that she and her MIT team produced for the Apollo Project.
-
Image 17
Deep Blue was a chess-playing expert system run on a unique purpose-built IBM supercomputer. It was the first computer to win a game, and the first to win a match, against a reigning world champion under regular time controls. Photo taken at the Computer History Museum.
Did you know?
- ... that Phil Fletcher as Hacker T. Dog caused Lauren Layfield to make the "most famous snort" in the United Kingdom in 2016?
- ... that a "hacker" with blog posts written by ChatGPT was at the center of an online scavenger hunt promoting Avenged Sevenfold's album Life Is but a Dream...?
- ... that the programming language Acorn System BASIC was so non-standard that one commenter suggested that using it on the BBC Micro would be a disaster?
- ... that Cornell University's student-oriented programming language dialect was made available to other universities but required a "research grant" payment in exchange?
- ... that the 2024 psychological horror game Mouthwashing utilises non-diegetic scene transitions that mimic glitches and crashes?
- ... that the Gale–Shapley algorithm was used to assign medical students to residencies long before its publication by Gale and Shapley?
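The last entry above mentions the Gale–Shapley stable-matching algorithm. Below is a compact, hedged Python sketch of the proposer-optimal version; the function and example names are illustrative only.

```python
# Gale–Shapley stable matching: proposers propose in preference order;
# each reviewer tentatively keeps the best offer received so far.
def gale_shapley(proposer_prefs, reviewer_prefs):
    # rank[r][p]: how reviewer r ranks proposer p (lower is better)
    rank = {r: {p: i for i, p in enumerate(prefs)}
            for r, prefs in reviewer_prefs.items()}
    free = list(proposer_prefs)            # proposers not yet matched
    next_choice = {p: 0 for p in proposer_prefs}
    match = {}                             # reviewer -> proposer

    while free:
        p = free.pop()
        r = proposer_prefs[p][next_choice[p]]  # best reviewer not yet tried
        next_choice[p] += 1
        if r not in match:
            match[r] = p                   # first offer is accepted
        elif rank[r][p] < rank[r][match[r]]:
            free.append(match[r])          # reviewer trades up
            match[r] = p
        else:
            free.append(p)                 # rejected; p proposes again later

    return match

# Tiny example: students proposing to residency programs.
students = {"ann": ["x", "y"], "bob": ["x", "y"]}
programs = {"x": ["bob", "ann"], "y": ["ann", "bob"]}
print(gale_shapley(students, programs))  # -> {'x': 'bob', 'y': 'ann'}
```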
WikiProjects
- There are many users interested in computer programming; join them.
Computer programming news
- 23 May 2025 –
- Authorities from Canada, Denmark, France, Germany, the Netherlands, the United Kingdom, and the U.S. announce a joint operation to crack down on malware worldwide, which took down over 300 servers, neutralized 650 domains, and seized over €3.5 million (US$3.9 million) in cryptocurrency. (DW)
Associated Wikimedia
The following Wikimedia Foundation sister projects provide more on this subject:
- Commons – Free media repository
- Wikibooks – Free textbooks and manuals
- Wikidata – Free knowledge base
- Wikinews – Free-content news
- Wikiquote – Collection of quotations
- Wikisource – Free-content library
- Wikiversity – Free learning tools
- Wiktionary – Dictionary and thesaurus