What is a Daemon? and other pseudo-mystical internet nomenclature that you've never understood (or heard of)
In Unix and other multitasking computer operating systems, a daemon (pronounced /ˈdeɪmən/ or /ˈdiːmən/) is a computer program that runs in the background rather than under the direct control of a user; daemons are usually initiated as background processes.
Daemon is commonly, but mistakenly, believed to be an acronym for Disk And Execution MONitor. According to the original team that introduced the concept, "the use of the word daemon was inspired by the Maxwell's demon of physics and thermodynamics (an imaginary agent which helped sort molecules with differing velocities and worked tirelessly in the background)", seemingly evading the laws of thermodynamics. The earliest use appears to have been in the phrase "daemon of Socrates", which meant his "guiding or indwelling spirit; his genius", also a pre-Christian equivalent of the "Guardian Angel" or, alternatively, a demigod (which bears only an etymological connection to the word "demon").
Typically daemons have names that end with the letter "d" (e.g., syslogd, the daemon that handles the system log, or sshd, which handles incoming SSH connections).
Within a Unix environment, the parent process of a daemon is frequently—although not always—the init process (PID=1). Processes usually become daemons by forking a child process and then having their parent process immediately exit, thus causing init to adopt the child process. This is a somewhat simplified view of the process as other operations are generally performed, such as dissociating the daemon process from any controlling tty. Convenience routines such as daemon(3) exist in some UNIX systems for that purpose.
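The fork-and-exit step described above can be sketched in Python (a minimal illustration, not a production daemonizer; `fork_orphan` is a made-up helper name for this sketch):

```python
import os
import time

def fork_orphan():
    """Illustrate the adoption step of daemonization: fork a middle
    process that exits immediately, so its child (the would-be daemon)
    is re-parented to init (PID 1) or the nearest subreaper."""
    r, w = os.pipe()
    middle = os.fork()
    if middle == 0:
        if os.fork() == 0:
            # grandchild: give the middle process time to exit,
            # then report who our parent is now
            time.sleep(0.2)
            os.write(w, str(os.getppid()).encode())
            os._exit(0)
        os._exit(0)  # middle process exits at once, orphaning the grandchild
    os.close(w)
    os.waitpid(middle, 0)            # reap the middle process
    new_parent = int(os.read(r, 32)) # no longer the middle process
    os.close(r)
    return middle, new_parent
```

A real daemon would also call `os.setsid()` between the forks to dissociate from the controlling tty, change its working directory to `/`, and redirect the standard streams; this is roughly what the daemon(3) convenience routine wraps up.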
Systems often start (or "launch") daemons at boot time: they often serve the function of responding to network requests, hardware activity, or other programs by performing some task. Daemons can also configure hardware (like udevd on some GNU/Linux systems), run scheduled tasks (like cron), and perform a variety of other tasks.
2. The Darknet
The term darknet originally referred to any closed, private network of people communicating; since 2002, however, it has evolved to refer more specifically to file-sharing networks in general, whether private or readily accessible to the public. The phrase "the darknet" is used to refer collectively to all covert communication networks, and it usually carries a derogatory connotation, being associated with illegal hacking or file sharing.
3. Oracle Database
The Oracle Database (commonly referred to as Oracle RDBMS or simply as Oracle) is an object-relational database management system (ORDBMS) produced and marketed by Oracle Corporation.
Larry Ellison and his friends and former co-workers Bob Miner and Ed Oates started the consultancy Software Development Laboratories (SDL) in 1977. SDL developed the original version of the Oracle software. The name Oracle comes from the code-name of a CIA-funded project Ellison had worked on while previously employed by Ampex.
4. Web Spiders
A Web Spider is a computer program that browses the World Wide Web in a methodical, automated manner. Other terms for Web crawlers are ants, automatic indexers, bots, Web robots, or, especially in the FOAF community, Web scutters.
This process is called Web crawling or spidering. Many sites, in particular search engines, use spidering as a means of providing up-to-date data. Web crawlers are mainly used to create a copy of all the visited pages for later processing by a search engine that will index the downloaded pages to provide fast searches. Crawlers can also be used for automating maintenance tasks on a Web site, such as checking links or validating HTML code. Also, crawlers can be used to gather specific types of information from Web pages, such as harvesting e-mail addresses (usually for sending spam).
A Web crawler is one type of bot, or software agent. In general, it starts with a list of URLs to visit, called the seeds. As the crawler visits these URLs, it identifies all the hyperlinks in the page and adds them to the list of URLs to visit, called the crawl frontier. URLs from the frontier are recursively visited according to a set of policies.
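The seed-and-frontier loop just described can be sketched as a small breadth-first crawl over an in-memory link graph (the `get_links` callback and the toy URLs are stand-ins for fetching and parsing real pages):

```python
from collections import deque

def crawl(seeds, get_links, max_pages=100):
    """Breadth-first crawl: seeds feed the frontier; each visited URL's
    outgoing links join the frontier unless already seen (the simplest
    possible visit-once policy)."""
    frontier = deque(seeds)
    seen = set(seeds)
    visited = []
    while frontier and len(visited) < max_pages:
        url = frontier.popleft()
        visited.append(url)
        for link in get_links(url):
            if link not in seen:
                seen.add(link)
                frontier.append(link)
    return visited

# A toy link graph standing in for the Web
links = {
    "a": ["b", "c"],
    "b": ["c", "d"],
    "c": [],
    "d": ["a"],
}
order = crawl(["a"], lambda u: links.get(u, []))  # → ["a", "b", "c", "d"]
```

Real crawlers layer further policies onto this loop: politeness delays per host, robots.txt compliance, and re-visit scheduling to keep copies fresh.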
5. Zombie Computer
A zombie computer (often shortened to zombie) is a computer connected to the Internet that has been compromised by a cracker, computer virus or trojan horse and can be used to perform malicious tasks under remote direction. Botnets of zombie computers are often used to spread e-mail spam and launch denial-of-service attacks. Most owners of zombie computers are unaware that their system is being used in this way. Because the owner tends to be unaware, these computers are metaphorically compared to zombies.
6. Cookies
A cookie, also known as a web cookie, browser cookie, or HTTP cookie, is a piece of text stored on a user's computer by their web browser. A cookie can be used for authentication, storing site preferences, shopping cart contents, the identifier for a server-based session, or anything else that can be accomplished through storing text data.
A cookie consists of one or more name-value pairs containing bits of information, which may be encrypted for information privacy and data security purposes. The cookie is sent as a field in the header of the HTTP response by a web server to a web browser and then sent back unchanged by the browser each time it accesses that server.
Cookies may be set by the server with or without an expiration date. Cookies without an expiration date exist until the browser terminates, while cookies with an expiration date may be stored by the browser until the expiration date passes. Users may also manually delete cookies in order to save space or to address privacy issues.
Most modern browsers allow users to decide whether to accept cookies, and the time frame to keep them, but rejecting cookies makes some websites unusable.