The Common Vulnerability Scoring System (CVSS)

The Common Vulnerability Scoring System (CVSS) is a free and open industry standard for assessing the severity of computer system security vulnerabilities. CVSS attempts to assign severity scores to vulnerabilities, allowing responders to prioritize responses and resources according to threat. Scores are calculated with a formula that depends on several metrics approximating the ease of exploitation and the impact of an exploit. Scores range from 0 to 10, with 10 being the most severe. While many utilize only the CVSS Base score for determining severity, Temporal and Environmental scores also exist, to factor in the availability of mitigations and how widespread vulnerable systems are within an organization, respectively.
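To make the formula concrete, here is a minimal sketch of the CVSS v3.0 Base score calculation for the common "Scope: Unchanged" case, using the metric weights from the specification. The function and dictionary names are ours, not part of any official library.

```python
import math

# CVSS v3.0 metric weights (Scope "Unchanged" only).
AV = {"N": 0.85, "A": 0.62, "L": 0.55, "P": 0.2}   # Attack Vector
AC = {"L": 0.77, "H": 0.44}                         # Attack Complexity
PR = {"N": 0.85, "L": 0.62, "H": 0.27}              # Privileges Required
UI = {"N": 0.85, "R": 0.62}                         # User Interaction
CIA = {"H": 0.56, "L": 0.22, "N": 0.0}              # C/I/A impact

def roundup(x):
    """CVSS 'roundup': smallest one-decimal number >= x."""
    return math.ceil(x * 10) / 10

def base_score(av, ac, pr, ui, c, i, a):
    iss = 1 - (1 - CIA[c]) * (1 - CIA[i]) * (1 - CIA[a])
    impact = 6.42 * iss
    exploitability = 8.22 * AV[av] * AC[ac] * PR[pr] * UI[ui]
    if impact <= 0:
        return 0.0
    return roundup(min(impact + exploitability, 10))

# A network-reachable, low-complexity flaw with high C/I/A impact:
print(base_score("N", "L", "N", "N", "H", "H", "H"))  # → 9.8
```

A vector like AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H scores 9.8 (Critical), which is why remotely exploitable, unauthenticated flaws dominate response queues.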

The current version of CVSS is CVSSv3.0, released in June 2015.


Tor (anonymity network)

Tor is free software for enabling anonymous communication. The name is an acronym derived from the original software project name The Onion Router,[7] however the correct spelling is “Tor”, capitalizing only the first letter.[8] Tor directs Internet traffic through a free, worldwide, volunteer network consisting of more than seven thousand relays[9] to conceal a user’s location and usage from anyone conducting network surveillance or traffic analysis. Using Tor makes it more difficult for Internet activity to be traced back to the user: this includes “visits to Web sites, online posts, instant messages, and other communication forms”.[10] Tor’s use is intended to protect the personal privacy of users, as well as their freedom and ability to conduct confidential communication by keeping their Internet activities from being monitored.

Onion routing is implemented by encryption in the application layer of a communication protocol stack, nested like the layers of an onion. Tor encrypts the data, including the destination IP address, multiple times and sends it through a virtual circuit comprising successive, randomly selected Tor relays. Each relay decrypts a layer of encryption to reveal only the next relay in the circuit in order to pass the remaining encrypted data on to it. The final relay decrypts the innermost layer of encryption and sends the original data to its destination without revealing, or even knowing, the source IP address. Because the routing of the communication is partly concealed at every hop in the Tor circuit, this method eliminates any single point at which the communicating peers can be determined through network surveillance that relies upon knowing its source and destination.
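The layered wrap-and-peel idea can be sketched as follows. This is a toy illustration, not real cryptography: each relay holds a symmetric key, the sender wraps the payload once per relay, and each relay peels exactly one layer. The relay keys and the XOR keystream scheme are invented for the demonstration.

```python
import hashlib
from itertools import count

def keystream(key, n):
    # Derive a deterministic keystream of n bytes from a key (toy only).
    out = b""
    for i in count():
        if len(out) >= n:
            return out[:n]
        out += hashlib.sha256(key + i.to_bytes(4, "big")).digest()

def xor(data, key):
    # XOR is symmetric: applying the same key twice restores the data.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

relay_keys = [b"entry-key", b"middle-key", b"exit-key"]

# Sender: wrap innermost-first, so the entry relay's layer is outermost.
payload = b"GET /index.html"
for key in reversed(relay_keys):
    payload = xor(payload, key)

# Each relay peels one layer; only the exit relay sees the plaintext.
for key in relay_keys:
    payload = xor(payload, key)
print(payload)  # → b'GET /index.html'
```

Real onion routing additionally authenticates each layer and negotiates the per-relay keys with Diffie-Hellman, so no relay learns both endpoints.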

An adversary might try to de-anonymize the user by some means. One way this may be achieved is by exploiting vulnerable software on the user’s computer.[11] The NSA has a technique that targets outdated Firefox browsers codenamed EgotisticalGiraffe,[12] and targets Tor users in general for close monitoring under its XKeyscore program.[13] Attacks against Tor are an active area of academic research,[14][15] which is welcomed by the Tor Project itself.[16]

Computer emergency response teams (CERT)

Computer emergency response teams (CERT) are expert groups that handle computer security incidents. Alternative names for such groups include computer emergency readiness team and computer security incident response team (CSIRT).

The name “Computer Emergency Response Team” was first used by the CERT Coordination Center (CERT-CC) at Carnegie Mellon University (CMU). The abbreviation CERT of the historic name was picked up by other teams around the world. Some teams took on the more specific name of CSIRT to point out the task of handling computer security incidents instead of other tech support work, and because CMU was threatening to take legal action against individuals or organisations who referred to any team other than CERT-CC as a CERT. After the turn of the century, CMU relaxed its position, and the terms CERT and CSIRT are now used interchangeably.

The history of CERTs is linked to the existence of malware, especially computer worms and viruses. Whenever a new technology arrives, its misuse is not long in following. The first worm in the IBM VNET was covered up. Shortly after, a worm hit the Internet on 3 November 1988, when the so-called Morris Worm paralysed a good percentage of it. This led to the formation of the first computer emergency response team at Carnegie Mellon University under U.S. Government contract. With the massive growth in the use of information and communications technologies over the subsequent years, the now-generic term ‘CERT’/’CSIRT’ refers to an essential part of most large organisations’ structures. In many organisations the CERT evolves into an information security operations center.


An Unprecedented Look at Stuxnet, the World’s First Digital Weapon

Stuxnet is a malicious computer worm believed to be a jointly built American-Israeli cyber weapon.[1] Although neither state has confirmed this openly,[2] anonymous US officials speaking to the Washington Post claimed the worm was developed during the Obama administration to sabotage Iran’s nuclear program with what would seem like a long series of unfortunate accidents.[3]

Stuxnet specifically targets PLCs, which allow the automation of electromechanical processes such as those used to control machinery on factory assembly lines, amusement rides, or centrifuges for separating nuclear material. Exploiting four zero-day flaws,[4] Stuxnet functions by targeting machines using the Microsoft Windows operating system and networks, then seeking out Siemens Step7 software. Stuxnet reportedly compromised Iranian PLCs, collecting information on industrial systems and causing the fast-spinning centrifuges to tear themselves apart.[5] Stuxnet’s design and architecture are not domain-specific and it could be tailored as a platform for attacking modern SCADA and PLC systems (e.g., in automobile or power plants), the majority of which reside in Europe, Japan and the US.[6] Stuxnet reportedly ruined almost one-fifth of Iran’s nuclear centrifuges.[7]

Stuxnet has three modules: a worm that executes all routines related to the main payload of the attack; a link file that automatically executes the propagated copies of the worm; and a rootkit component responsible for hiding all malicious files and processes, preventing detection of the presence of Stuxnet.[8]

Stuxnet is typically introduced to the target environment via an infected USB flash drive. The worm then propagates across the network, scanning for Siemens Step7 software on computers controlling a PLC. In the absence of either criterion, Stuxnet becomes dormant inside the computer. If both conditions are fulfilled, Stuxnet introduces the infected rootkit onto the PLC and Step7 software, modifying the code and giving unexpected commands to the PLC while feeding a loop of normal operating-system values back to the users.[9][10]
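The targeting logic described above — infect broadly, but activate the payload only when both conditions hold — can be sketched as follows. The function and field names are purely illustrative and are not taken from the actual malware.

```python
# Hedged sketch: Stuxnet-style activation gate. A host is "interesting"
# only if it runs Step7 AND is connected to a PLC; otherwise the worm
# stays dormant and merely keeps propagating.
def decide_action(host):
    has_step7 = host.get("step7_installed", False)
    controls_plc = host.get("plc_connected", False)
    if has_step7 and controls_plc:
        return "activate"   # modify PLC code, replay normal-looking values
    return "dormant"        # do nothing destructive on this machine

print(decide_action({"step7_installed": True, "plc_connected": True}))   # → activate
print(decide_action({"step7_installed": True, "plc_connected": False}))  # → dormant
```

This gating is what let the worm spread widely while damaging only its intended industrial targets.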

In 2015, Kaspersky Lab’s research findings on another highly sophisticated espionage platform, created by what they called the Equation Group, noted that the group had used two of the same zero-day attacks used by Stuxnet, before they were used in Stuxnet, and that their use in both programs was similar. The researchers reported that “the similar type of usage of both exploits together in different computer worms, at around the same time, indicates that the EQUATION group and the Stuxnet developers are either the same or working closely together”.[11]:13


Chinese Hackers

Published on Nov 16, 2015

China has been accused of stealing the fifth-generation US F-35 fighter jet design. US defense and strategic affairs experts claim that the design for China’s FC-31 “Gyrfalcon”, or J-31, was stolen during a hack into Lockheed Martin’s American defense network. Similarities between the two jets include tracking mirrors, a flat-faceted optical window, and bottom fuselage placement. Experts currently believe, however, that China’s F-35 cannot compete with the American version, and the craft will not reach operational capacity until 2024. Nik Zecevic and Jose Marcelino examine China’s reputation for knock-offs on The Lip News.

A confused deputy attack

A confused deputy is a computer program that is innocently fooled by some other party into misusing its authority. It is a specific type of privilege escalation. In information security, the confused deputy problem is often cited as an example of why capability-based security is important, as capability systems protect against this whereas ACL-based systems do not.

Scams based on confidence tricks work by gaining the trust of a victim so that the attacker can use them as a confused deputy. For example, in salting, an attacker presents a victim with what appears to be a mineral-rich mine; the attacker exploits the victim’s greed to persuade them to perform an action they would not normally take.

When checking out at a grocery store, the cashier will scan the barcode of each item to determine the total cost. A thief could replace the barcodes on his items with those of cheaper items. In this attack, the cashier is a confused deputy that uses seemingly valid barcodes to determine the total cost.
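A toy model of the checkout example makes the pattern visible: the deputy trusts the barcode it is shown, not the item itself. The barcodes and prices below are invented.

```python
# The deputy (cashier) has authority to ring up prices, but it exercises
# that authority based on a designation (the barcode) supplied by the
# customer -- who may be the attacker.
PRICES = {"0001": 2.50, "0002": 45.00}  # 0001 = gum, 0002 = whisky

def checkout(scanned_barcodes):
    """The deputy: totals whatever barcodes it is shown."""
    return sum(PRICES[code] for code in scanned_barcodes)

honest = ["0002"]    # the whisky with its real barcode
swapped = ["0001"]   # the same whisky, relabelled with the gum's barcode
print(checkout(honest), checkout(swapped))  # → 45.0 2.5
```

The cashier never misbehaves; the attacker simply controls the designation the deputy acts on, which is the essence of the confused deputy problem.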

A cross-site request forgery (CSRF) is an example of a confused deputy attack that uses the web browser to perform sensitive actions against a web application. A common form of this attack occurs when a web application uses a cookie to authenticate all requests transmitted by a browser. Using JavaScript, an attacker can force a browser into transmitting authenticated HTTP requests.
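A minimal simulation shows why cookie-only authentication creates a confused deputy: the browser attaches the session cookie to every request for the site, so a request triggered by attacker-controlled script is indistinguishable from a legitimate one. All names, cookies, and balances here are illustrative.

```python
# Toy CSRF simulation: the server authenticates by cookie alone, and the
# "browser" attaches the cookie automatically, whoever triggered the request.
sessions = {"cookie-abc": "alice"}
balances = {"alice": 100, "mallory": 0}

def server_transfer(cookie, to, amount):
    """Vulnerable endpoint: trusts the cookie, ignores who initiated it."""
    user = sessions.get(cookie)
    if user is None:
        return "rejected"
    balances[user] -= amount
    balances[to] += amount
    return "ok"

class Browser:
    def __init__(self, cookie):
        self.cookie = cookie  # set when the user logged in
    def request(self, to, amount):
        # The cookie rides along even for attacker-forged requests.
        return server_transfer(self.cookie, to, amount)

browser = Browser("cookie-abc")
browser.request("mallory", 25)  # forged by attacker script, still succeeds
print(balances)                 # → {'alice': 75, 'mallory': 25}
```

The standard defense is to require a secret the attacker cannot supply, such as a per-session anti-CSRF token, in addition to the ambient cookie.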

The Samy computer worm used Cross-Site Scripting (XSS) to turn the browser’s authenticated MySpace session into a confused deputy. Using XSS the worm forced the browser into posting an executable copy of the worm as a MySpace message which was then viewed and executed by friends of the infected user.

Clickjacking is an attack where the user acts as the confused deputy. In this attack a user thinks they are harmlessly browsing a website (an attacker-controlled website) but they are in fact tricked into performing sensitive actions on another website.[3]

An FTP bounce attack can allow an attacker to indirectly connect to TCP ports that the attacker’s machine has no access to, using a remote FTP server as the confused deputy.
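The mechanism behind the bounce is the FTP PORT command, by which the client tells the server an arbitrary IP address and port to connect to for the data channel. The sketch below only formats that command; the target address is illustrative.

```python
# FTP PORT encodes the target as "h1,h2,h3,h4,p1,p2", where the port is
# split into a high byte (p1) and low byte (p2). By supplying a victim's
# address here, the attacker turns the FTP server into a confused deputy
# that opens connections on the attacker's behalf.
def ftp_port_command(ip, port):
    h1, h2, h3, h4 = ip.split(".")
    p1, p2 = divmod(port, 256)
    return f"PORT {h1},{h2},{h3},{h4},{p1},{p2}"

# Ask the FTP server to connect to 10.0.0.5:8080 -- a host the attacker
# may not be able to reach directly.
print(ftp_port_command("10.0.0.5", 8080))  # → PORT 10,0,0,5,31,144
```

Modern FTP servers mitigate this by refusing PORT targets that differ from the control-connection's source address.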

Another example relates to personal firewall software. It can restrict internet access for specific applications. Some applications circumvent this by starting a browser with a specific URL. The browser has authority to open a network connection, even though the application does not. Firewall software can attempt to address this by prompting the user in cases where one program starts another which then accesses the network. However, the user frequently does not have sufficient information to determine whether such an access is legitimate—false positives are common, and there is a substantial risk that even sophisticated users will become habituated to clicking ‘OK’ to these prompts.[4]

Not every program that misuses authority is a confused deputy. Sometimes misuse of authority is simply a result of a program error. The confused deputy problem occurs when the designation of an object is passed from one program to another, and the associated permission changes unintentionally, without any explicit action by either party. It is insidious because neither party did anything explicit to change the authority.

Education system

“The world economy no longer pays for what people know but for what they can do with what they know.”
– Andreas Schleicher, OECD deputy director for education

Sir Ken Robinson makes an entertaining and profoundly moving case for creating an education system that nurtures (rather than undermines) creativity.

  • East Asian nations continue to outperform others. South Korea tops the rankings, followed by Japan (2nd), Singapore (3rd) and Hong Kong (4th). All these countries’ education systems prize effort above inherited ‘smartness’, have clear learning outcomes and goalposts, and have a strong culture of accountability and engagement among a broad community of stakeholders.
  • Scandinavian countries, traditionally strong performers, are showing signs of losing their edge. Finland, the 2012 Index leader, has fallen to 5th place; and Sweden is down from 21st to 24th.
  • Notable improvers include Israel (up 12 places to 17th), Russia (up seven places to 13th) and Poland (up four places to 10th).
  • Developing countries populate the lower half of the Index, with Indonesia again ranking last of the 40 nations covered, preceded by Mexico (39th) and Brazil (38th).

South Korea demonstrates the interplay between adult skills and the demands of employers. In South Korea young people score above average for numeracy and problem-solving skills, but are below average over the age of 30. According to Randall S Jones of the OECD, this skills decline is explained by many graduates “training for white-collar jobs that don’t exist”. This leads to a higher than average proportion failing to secure employment, and a quicker diminishing of their skills.

Developing countries must teach basic skills more effectively before they start to consider the wider skills agenda. There is little point in investing in pedagogies and technologies to foster 21st century skills, when the basics of numeracy and literacy aren’t in place.

Technology can provide new pathways into adult education, particularly in the developing world, but is no panacea. There is little evidence that technology alone helps individuals actually develop new skills.

Lifelong learning, even simple reading at home and number crunching at work, helps to slow the rate of age-related skill decline; but mainly for those who are highly skilled already. Teaching adults does very little to make up for a poor school system.

Making sure people are taught the right skills early in their childhood is much more effective than trying to improve skills in adulthood for people who were let down by their school system. But even when primary education is of a high quality, skills decline in adulthood if they are not used regularly.

In recent years it has become increasingly clear that basic reading, writing and arithmetic are not enough.
The importance of 21st century non-cognitive skills – broadly defined as abilities important for social interaction – is pronounced.

The OECD estimates that half of the economic growth in developed countries in the last decade came from improved skills.


Virtualization

Virtualization, in computing, refers to the various techniques, methods or approaches of creating a virtual (rather than actual) version of something, such as a virtual hardware platform, operating system (OS), storage device, or network resources.

Hardware virtualization or platform virtualization refers to the creation of a virtual machine that acts like a real computer with an operating system. Software executed on these virtual machines is separated from the underlying hardware resources. For example, a computer that is running Microsoft Windows may host a virtual machine that looks like a computer with the Ubuntu Linux operating system; Ubuntu-based software can be run on the virtual machine.[1][2]

In hardware virtualization, the host machine is the actual machine on which the virtualization takes place, and the guest machine is the virtual machine. The words host and guest are used to distinguish the software that runs on the physical machine from the software that runs on the virtual machine. The software or firmware that creates a virtual machine on the host hardware is called a hypervisor or Virtual Machine Manager.

Different types of hardware virtualization include:

  1. Full virtualization: Almost complete simulation of the actual hardware to allow software, which typically consists of a guest operating system, to run unmodified.
  2. Partial virtualization: Some but not all of the target environment is simulated. Some guest programs, therefore, may need modifications to run in this virtual environment.
  3. Paravirtualization: A hardware environment is not simulated; however, the guest programs are executed in their own isolated domains, as if they are running on a separate system. Guest programs need to be specifically modified to run in this environment.

Hardware-assisted virtualization is a way of improving the efficiency of hardware virtualization. It involves employing specially designed CPUs and hardware components that help improve the performance of a guest environment.
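Whether a CPU offers such hardware assistance can be checked from software. The sketch below is Linux-specific: /proc/cpuinfo advertises the capability as the "vmx" (Intel VT-x) or "svm" (AMD-V) CPU flag; on other systems the function simply returns None.

```python
# Hedged sketch: detect hardware-assisted virtualization support on Linux
# by parsing the "flags" line of /proc/cpuinfo.
def hw_virt_flag():
    try:
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("flags"):
                    flags = line.split(":", 1)[1].split()
                    if "vmx" in flags:
                        return "vmx"   # Intel VT-x
                    if "svm" in flags:
                        return "svm"   # AMD-V
    except OSError:
        pass                           # not Linux, or /proc unavailable
    return None

print(hw_virt_flag())  # e.g. 'vmx' on a VT-x capable Intel CPU
```

Note that the flag only indicates CPU capability; the feature must still be enabled in the BIOS/UEFI firmware, which is the setting discussed in the Q&A below.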

Hardware virtualization can be viewed as part of an overall trend in enterprise IT that includes autonomic computing, a scenario in which the IT environment will be able to manage itself based on perceived activity, and utility computing, in which computer processing power is seen as a utility that clients can pay for only as needed. The usual goal of virtualization is to centralize administrative tasks while improving scalability and overall hardware-resource utilization. With virtualization, several operating systems can be run in parallel on a single central processing unit (CPU). This parallelism tends to reduce overhead costs and differs from multitasking, which involves running several programs on the same OS. Using virtualization, an enterprise can better manage updates and rapid changes to the operating system and applications without disrupting the user. “Ultimately, virtualization dramatically improves the efficiency and availability of resources and applications in an organization. Instead of relying on the old model of ‘one server, one application’ that leads to under-utilized resources, virtual resources are dynamically applied to meet business needs without any excess fat” (ConsonusTech).

Hardware virtualization is not the same as hardware emulation. In hardware emulation, a piece of hardware imitates another, while in hardware virtualization, a hypervisor (a piece of software) imitates a particular piece of computer hardware or the entire computer. Furthermore, a hypervisor is not the same as an emulator; both are computer programs that imitate hardware, but their domains of use differ.

VirtualBox is a general-purpose full virtualizer for x86 hardware, targeted at server, desktop and embedded use.

For a thorough introduction to virtualization and VirtualBox, please refer to the online version of the VirtualBox User Manual’s first chapter.

Why does HP recommend that I keep Hardware Virtualization off?

There are several attack vectors from bad drivers that can utilize VT extensions to do potentially bad things. That’s why the setting is usually in the “security” section of your BIOS UI.

Additionally, the smaller your instruction set, the more efficiently the CPU runs at a very low level (hence last decade’s interest in RISC chips). Having it disabled allows the CPU to cache fewer instructions and search the cache faster.

So is there a security risk to enabling AMD-V? – Rocket Hazmat Feb 1 at 16:21
Yes. Installing drivers and other very-low-level software is always risky, so it’s probably no more risky than grabbing a driver off a non-official download site. The big difference is that a blue-pill exploit could allow a guest to affect the host and vice versa, which should really never be true. – Frank Thomas Feb 1 at 16:37
I disagree that there is a security risk in enabling AMD-V. A quick search on “AMD-V security” returns no results on the first page about a security vulnerability, which says a great deal. – Ramhound Feb 1 at 16:46
So, it’s off by default because there are rootkits that pretend to be hypervisors? Guess I just gotta be careful what I download! 🙂 – Rocket Hazmat Feb 1 at 16:49

Blue Pill is the codename for a rootkit based on x86 virtualization. Blue Pill originally required AMD-V (Pacifica) virtualization support, but was later ported to support Intel VT-x (Vanderpool) as well. It was designed by Joanna Rutkowska and originally demonstrated at the Black Hat Briefings on August 3, 2006, with a reference implementation for the Microsoft Windows Vista kernel.

Digital Forensics

What is odessa?

It’s an acronym for “Open Digital Evidence Search and Seizure Architecture”
The intent of this project is to provide a completely open and extensible suite of tools for performing digital evidence analysis, as well as a means of generating a usable report detailing the analysis and any findings. The odessa tool suite currently represents more than seven man-years of labor, and consists of three highly modular cross-platform tools for the acquisition, analysis, and documentation of digital evidence.

In addition to the odessa tool suite, the project hosts other applications and information related to digital forensics. At this time, the list of additional tools includes a set of whitepapers and utilities authored by Keith J. Jones including Galleta, a tool for analyzing Internet Explorer cookies, Pasco, a tool for analyzing the Microsoft Windows index.dat file, and Rifiuti, a tool for investigating the Microsoft Windows recycle bin info2 file.

CAINE (Computer Aided INvestigative Environment) is an Italian GNU/Linux live distribution created as a digital forensics project.
Currently the project manager is Nanni Bassetti.
CAINE offers a complete forensic environment that is organized to integrate existing software tools as software modules and to provide a friendly graphical interface.
The main design objectives that CAINE aims to guarantee are the following:

  • an interoperable environment that supports the digital investigator during the four phases of the digital investigation
  • a user friendly graphical interface
  • a semi-automated compilation of the final report

We recommend that you read the page on the CAINE policies carefully.
CAINE fully represents the spirit of the Open Source philosophy: the project is completely open, and anyone can take up the legacy of the previous developer or project manager. The distro is open source, the Windows side (Wintaylor) is open source and, last but not least, the distro is installable, giving the opportunity to rebuild it in a brand new version and so giving a long life to this project.