Billion-Dollar Hacker Gang Now Using Google Services to Control Its Banking Malware

Carbanak – One of the most successful cybercriminal gangs ever that’s known for the theft of one billion dollars from over 100 banks across 30 countries back in 2015 – is back with a BANG!

The Carbanak cyber gang has been found abusing various Google services to issue command and control (C&C) communications for monitoring and controlling the machines of unsuspecting malware victims.

THN Deal — Become A Certified Ethical Hacker With This Online Training Course

Hacking is not a trivial process, but it does not take long to learn. If you want to learn Ethical Hacking and Penetration Testing, you are in the right place. We frequently receive emails from our readers asking how to learn hacking, how to become an ethical hacker, how to break into computers, how to penetrate networks like a professional, how to secure computer systems and networks, and so on.

You Can Crash Anyone’s iPhone Or iPad With A Simple Emoji Text Message

A newly discovered bug in Apple’s iOS mobile operating system is being exploited in a prank that lets anyone crash your iPhone or iPad by just sending an emoji-filled iMessage, according to several reports. YouTube star EverythingApplePro published a video highlighting a sequence of characters that temporarily freezes and restarts an iPhone, which pranksters can send to their friends’ iPhones.

Newly Discovered Mac Malware with Ancient Code Spying on Biotech Firms

Security researchers have discovered a rare piece of Mac-based espionage malware that relies on outdated coding practices but has been used in previous real-world attacks to spy on biomedical research center computers. Dubbed Fruitfly, the malware has remained undetected for years on macOS systems despite using unsophisticated and "antiquated" code. Infosec firm Malwarebytes discovered the malware.

Upcoming European Internet of Things rules: time for the industry to show the right route!

The Internet of Things (IoT) is getting regulated through the draft European ePrivacy regulation and the revised database and product liability directives, but is this good news?

I am generally of the opinion that “no rules are better than bad rules”.

Regulations can help to foster a market, but if they just create additional obligations, they are likely to damage it. There is no doubt that the current scenario of uncertainty as to the applicable rights and obligations might be great for lawyers, but companies that need to invest in the IoT would rather have a scenario that is “crystal clear”. The challenge is to see whether European regulators will be able to strike the right balance between regulating and over-regulating.

The broad approach of the draft European ePrivacy regulation to the Internet of Things

I had anticipated that the European Commission was starting a review process of the ePrivacy Regulation, which complements the European Data Protection Regulation governing the processing of personal data in electronic communications.

The proposal for the ePrivacy Regulation has now been published, and when it comes to Internet of Things technologies, the principle is very broad:

In order to ensure full protection of the rights to privacy and confidentiality of communications, and to promote a trusted and secure Internet of Things in the digital single market, it is necessary to clarify that this Regulation should apply to the transmission of machine-to-machine communications. Therefore, the principle of confidentiality enshrined in this Regulation should also apply to the transmission of machine-to-machine communications.

The wording of the draft regulation is not fully clear, as it does not specify whether it refers only to M2M communications containing personal data. This might be implied, given that the regulation is meant to govern privacy-related issues, but it is not expressly stated. A more literal interpretation of the provision would at least require compliance with the principle of confidentiality set out in the draft ePrivacy Regulation, under which

Electronic communications data shall be confidential. Any interference with electronic communications data, such as by listening, tapping, storing, monitoring, scanning or other kinds of interception, surveillance or processing of electronic communications data, by persons other than the end-users, shall be prohibited, except when permitted by this Regulation.

The extension of the principle of confidentiality to IoT communications means that, in the event of a breach, fines of up to €20 million or, in the case of an undertaking, up to 4% of the total worldwide annual turnover of the preceding financial year (whichever is higher) will be applicable!
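The "whichever is higher" cap can be sketched in a few lines. The figures are those quoted in the draft Regulation; the function name and sample turnover are ours, purely for illustration:

```python
# Illustrative sketch: the draft ePrivacy Regulation caps fines at EUR 20
# million or 4% of total worldwide annual turnover, whichever is HIGHER.
def max_fine(annual_turnover_eur: float) -> float:
    """Return the maximum applicable fine in EUR under the draft rules."""
    return max(20_000_000, 0.04 * annual_turnover_eur)

# An undertaking with EUR 1 billion turnover faces a cap of EUR 40 million,
# while one with EUR 100 million turnover still faces the EUR 20 million floor.
print(max_fine(1_000_000_000))  # 40000000.0
print(max_fine(100_000_000))    # 20000000
```

Note that for any undertaking with turnover above €500 million, the percentage-based cap dominates.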

The above also means that data protection compliance obligations will become relevant in Industrial Internet of Things projects, where privacy issues are usually not the main priority since only machine-related data is processed. This is because the draft regulation refers to the applicability of all of its contents, including, for instance, the need to put in place a privacy-by-design approach.

Is the project on building a European data economy the right route?

The European Commission published a communication announcing its plan to build a “European data economy”. The plan includes not only the upcoming European General Data Protection Regulation and the above-mentioned draft ePrivacy Regulation, but also the following principles:

1. The removal of unjustified restrictions to the free movement of data

This would be achieved, for instance, by challenging the “data location restrictions” provided by the laws of EU Member States, such as those requiring data to be kept on servers in a specific country.

These restrictions should be replaced by security obligations such as those provided by the NIS Directive, together with a general principle of free movement of data within the EU.

2. The setting of rules on access to and transmission of IoT data generated by Industry 4.0 machines or processes

The goal of the European Commission is to identify an approach that shall:

  • Improve access to anonymous machine-generated data through open data rules enabling public authorities to access data considered to be of public interest. As with any open data regulation, the risk is that it will discourage private companies from investing in sectors where they know the data they generate will be shared with third parties;
  • Facilitate and incentivise the sharing of data, which the European Commission is considering achieving by (i) setting default clauses and prohibiting unfair terms that considerably deviate from them, (ii) creating technical standards to trace and share data (e.g. standards for APIs) and (iii) reforming the European Database Directive. These initiatives are interesting, but if the technical standards differ from those set by the market, they will lead to additional costs for market players. Likewise, default clauses might limit investment in Europe if they lead to additional risks/liabilities for operators;
  • Protect investments and assets by introducing a so-called “data producer’s right”, which would clarify the device user’s ownership of anonymous data generated by machines. The issue with such a right is how it will “live” with existing intellectual property rights. If it results in just an additional layer of rights, and consequently of consents necessary for the exploitation of data, it risks failure;
  • Avoid disclosure of confidential data; and
  • Minimise lock-in effects where, de facto, the manufacturers of IoT machines become the owners of the generated data because they control it. This would be achieved by obliging manufacturers, service providers or other parties to provide access to the data they hold, against remuneration, after anonymisation. Such a provision sounds like a compulsory licence, which would entail the same issues mentioned above in relation to open data rules.

3. The amendment of rules on product liability, data portability and interoperability

The rules on liability for damages resulting from a fault in a connected IoT device or a robot need to be amended. This has to occur through changes to the principles set forth in the EU Product Liability Directive, since strict liability rules are hard to apply to Internet of Things technologies, where a malfunction can arise from several connected sources.

According to the European Commission, a possible approach would be to either (i) allocate more responsibilities on market players that generate more risks or have more control on them or (ii) introduce mandatory or voluntary insurance schemes.

Also, rules on the portability of data generated by Internet of Things devices (which shall be extended to non-personal data) and on the interoperability of such data might be implemented through recommended contractual terms or technical standards facilitating switching between suppliers.

Such initiatives sound interesting, but if operators need to bear additional costs and risks to market their technologies in Europe, those costs are likely to be passed on to customers, resulting in potential damage, rather than advantage, for them.

The European Consultation on the new IoT rules

Given the very large number of open issues, the European Commission launched a consultation on the topic, which will close on 26 April 2017. This seems a good opportunity to identify the right approach on an issue which might have a significant impact on the future of companies investing in Europe.

If you found this article interesting, please share it on your favourite social media!

@GiulioCoraggio

Smile! Hackers Can Remotely Access Your Samsung SmartCam Security Cameras

It’s not necessary to break into your computer or smartphone to spy on you. Today, all the devices in our home are becoming more connected to networks than ever to make our lives easier. But what’s worrisome is that these connected devices can be turned against us at any time, due to the lack of stringent security measures and the insecure encryption mechanisms implemented in these Internet of Things (IoT) devices.

The Tele2/Watson case: What are the key takeaways? …and what is to become of the new Investigatory Powers Act?

The CJEU’s recent decision in the Tele2/Watson case contains very interesting guidance on the rules around the retention of communications data and the safeguards that must be in place to protect it. It may also call the viability of the new Investigatory Powers Act into question.

The key issue in the case was whether legislation in Sweden and the UK, which imposed an obligation on public communications providers to retain traffic and location data, was compatible with EU law. The UK legislation (i.e. s.1 of the now expired DRIPA 2014) required public telecommunications operators to retain all such communications data for a maximum of 12 months where required to by the Secretary of State.

The CJEU gave guidance on the aspects of national legislation that would be deemed unlawful under EU law. Here are the most important takeaways from the judgment:

1. The intrusiveness of traffic and location data

The CJEU held that traffic and location data were liable to allow “very precise conclusions” to be drawn about the private lives of the persons affected, including their everyday habits, permanent or temporary places of residence, daily movements, activities carried out, and social relationships and environments, which together can establish a profile of the person concerned.

The Court emphasized that traffic data was “no less sensitive… than the actual content of communications” and that the interference posed by such legislation was thus “particularly serious”.

2. The purpose for retention must be limited to fighting serious crime

The CJEU made clear that only the objective of fighting serious crime is capable of justifying such a serious interference. No other objectives are permissible.

3. Retention must be targeted to what is “strictly necessary” to fight serious crime

The CJEU stated that even the objective of fighting serious crime cannot itself justify the “general and indiscriminate” retention of data. However, the Court made clear that “targeted” retention of data for the purpose of fighting serious crime was justified, provided that such retention of data is limited – with respect to the categories of data to be retained, the means of communications affected, the persons concerned and the retention period adopted – to what is “strictly necessary“.

The Court stated that as a general rule, access can only be granted to data about individuals actually suspected of or implicated in a serious crime. However, in particular situations, like terrorism investigations, access to the data of others might be granted where there is objective evidence to deduce that it might make an “effective contribution” to combating such activities.

4. Access to the data must be subject to prior review by a court or independent authority

The CJEU further stated that it is “essential” that access to retained data should, except in cases of clear urgency, be subject to prior review by either a court or an independent body.

5. Data subjects must be informed as soon as possible

The CJEU commented that the fact that the data is retained without the users being informed of the fact was likely to cause people to feel that “their private lives were the subject of constant surveillance“.

To counteract this, the Court stated that the national authorities (to whom access to retained data has been granted), must notify the persons affected as soon as such notice is no longer liable to jeopardize the investigation. This would enable individuals to exercise their right to a legal remedy where their rights have been infringed.

6. Retained data must stay within the EU

Given the quantity of retained data, the sensitivity of the data and the risk of unlawful access to it, the CJEU held that national legislation must make provision for the data to be retained within the EU and for the irreversible destruction of the data at the end of the retention period.

Although the CJEU gave the guidance above, it did not make findings in relation to the Swedish and UK legislation in question. It is now down to the domestic courts to rule on the actual lawfulness of the specific legislation – though, given the guidance above, the inevitable answer must be that DRIPA is incompatible with EU law. 


… so what is to become of the new Investigatory Powers Act 2016?

With DRIPA 2014 having already expired at the end of 2016, you’d be forgiven for thinking that the guidance in this case is now moot. However, the judgment has potentially major ramifications for the Investigatory Powers Act 2016 (IPA), the new UK legislation that came into force on 30 December 2016 to replace DRIPA.

It is clear that many aspects of the new IPA still fall short of satisfying the CJEU’s criteria above. Here are some of the reasons:

  • The purposes of retention are not limited to “fighting serious crime”: The warrants and notices under the IPA can be granted on various non-crime related grounds, including to safeguard the economic well-being of the UK, in the interests of public safety, public health, to collect taxes or other government levies, to prevent death, injury or damage to health, to assist in the identification of a deceased person, for the regulation of financial markets, financial stability, and so on (see s.61(7)). This is much too wide a range of purposes, according to the CJEU judgment.   
  • Data retention is not targeted to what is “strictly necessary”: Firstly, there are several categories of “bulk warrants” that can be issued under the IPA (e.g. bulk interception warrants, bulk acquisition warrants, bulk personal dataset warrants). These are inherently not targeted in nature, and do not need to be limited to particular persons/times/premises. For example, “all communications transmitted on a particular route or cable, or carried by a particular telecommunications operator could, in principle, be lawfully authorised” (see this Code of Practice, para 6.6). Secondly, even in respect of the targeted warrants, there is no express requirement that such warrants be limited to that which is strictly necessary for the permitted purposes. 
  • Prior independent review not required in all cases: Although many of the orders are subject to prior review by a Judicial Commissioner, there is no need for such review for “Authorisations for Obtaining Communications Data” under Part 3 of the Act. These empower numerous public authorities to obtain communications data directly from any person, telecommunications system or operator without need for independent review.  
  • There is no provision for informing affected individuals of any orders made
  • There is no provision for keeping the retained data within the EU

Given the above, the Tele2/Watson judgment is likely to threaten the viability of many parts of the IPA, leaving the Act in a further precarious and uncertain state. The IPA was already a controversial piece of legislation in the UK and, as a result of this judgment, it is now even more exposed to successful legal challenge.

The UK will need to consider carefully what amendments, if any, it will make to the IPA to bring it into conformity with EU law. In the meantime, electronic communications providers can expect even longer delays in the implementation of these new rules.

 


Bella Thorne Has Nothing But Love For Her Ex Tyler Posey

Bella Thorne is rising above the post-breakup drama with her ex Tyler Posey.
The 19-year-old actress wrote a heartwarming tweet about her former flame...

Read more: Celebrities, Online Privacy, Gen Z Lab, Bella Thorne, Tyler Posey, Entertainment News

How EaseUS Partition Master Can Easily Manage Your Hard Disk

If you want to get the most out of your computer, you need to get the most out of your hard drive, where all your data is stored. Today hard drives are larger than ever, so it makes sense to partition your hard disk to use all of its space effectively and manage all your important information. Partitioning is also useful if you intend to install and use more than one operating system.

How To Stop Larry From Hacking Your WiFi in 2017

It’s 2017, and we’re not any further along with Wi-Fi security than we were 10 years ago. There are Intrusion Detection Systems and second-generation antivirus apps to protect us from some vulnerabilities, but the simple fact is that some people and businesses still don’t set their networks up well in the first place.

Installing WiFi is like running Ethernet to your parking lot. It’s a cliché, but it holds true.

How China is poised for marine fisheries reform

China has introduced an unprecedented policy platform for stewarding its fisheries and other marine resources. In order to achieve a true paradigm shift, a team of international scientists from within and outside of China recommend major institutional reform.

Court Documents Reveal How Feds Spied On Connected Cars For 15 Years

It’s not always necessary to break into your computer or smartphone to spy on you. Today, our day-to-day devices are becoming more connected to networks than ever to add convenience and ease to daily activities. But here’s what we forget: these connected devices can be turned against us, because we are giving companies, hackers, and law enforcement a large number of entry points to break into them.

The new e-Privacy Regulation – What you need to know

Following on from my previous post about the leaked draft of the new e-Privacy Regulation, the European Commission has now published its official draft (press release here; draft legislation here). So what’s changed since the leaked draft, and what are the key takeaway points to know?

What’s not changed?

The following elements have not materially changed from the Commission’s earlier leaked draft at the end of 2016:

1. Extra-territoriality and 4% fines: The proposed Regulation applies to entities anywhere in the world who provide publicly-available “electronic communications services” to, or gather data from the devices of, users in the European Union. Breaches of the new e-Privacy Regulation can attract fines of up to 4% of annual worldwide turnover, just like the GDPR.

2. Application to OTT, IOT, M2M and lots of other acronyms: The Regulation applies to traditional telcos and ISPs, but also to Over The Top (OTT) providers too – i.e. providers of messaging apps, e-mail platforms, VOIP services and the like. In addition, anyone using cookies or similar tracking technologies (like device fingerprinting) will also be caught by the new rules. IOT and machine-to-machine communications also fall within the scope of some of its rules.

3. New rules for communications data: The proposed Regulation introduces new rules for processing communications content (i.e. what was said) and communications metadata (i.e. who said it, when, where, and other related information about the communication). The term ‘metadata’ replaces the current definition of ‘traffic data’ under the current e-Privacy Directive. The Regulation allows slightly wider uses of content and metadata than is the case under current law. My simple graphic summarising the new communications data rules is available here.

4. E-marketing rules: The official draft, like the leaked version, does not materially change today’s e-marketing rules. E-marketing still requires opt-in, save where an individual’s contact details have been obtained in the context of a sale – in which case opt-out is possible. There are, however, slight new transparency requirements for direct marketing calls as compared with current law.

5. Exemption for analytics cookies: Like the leaked draft, the Commission’s proposal retains an exemption from the cookie consent requirement for analytics. However, the exemption applies only for first party analytics, not third party analytics – so websites and apps using third party analytics platforms like Google Analytics etc. will still need consent (even if, for the techies amongst you, the cookie is technically served from a first party domain – third party here refers to the provider of the analytics service, not the domain from which the cookie is served).

What has changed?

If the above points haven’t materially changed from the leaked draft, then what has?

1. No blocking of cookies by default: Cookies generally still require consent under the Commission’s official draft, and the Commission still wants providers of browsers and similar software to provide their users with cookie and tracking controls. However, rather than insisting browser providers block cookies by default, the Commission has now struck a more moderated tone – instead requiring that, as part of the browser software set-up, users must be provided with cookie consent choices. The overall aim seems to be to move the consent requirement away from websites (and their cookie banners) to the browser providers – in a move that could spell the end for cookie banners.

2. Effective date: The leaked draft suggested that, once adopted, the Regulation would have a 6 month lead-in period. The Commission’s official draft instead says that it will apply from 25 May 2018 – the date that the GDPR also comes into effect.

The scope of the e-Privacy Regulation is very wide, and will broadly apply to any business that provides any form of online communication service, that utilises online tracking technologies, or that engages in electronic direct marketing – in today’s digital age, just about everyone.

The Commission’s goal of getting this all done, dusted and in force by 25 May therefore seems very ambitious – remember that it took over four years to get the GDPR agreed. While the e-Privacy Regulation is a somewhat simpler document, many of its provisions (especially around communications data and tracking technologies) will be highly contentious for both industry and civil liberties groups, and a lot will inevitably evolve as the draft law passes through the legislative process.

Nevertheless, if the Commission does meet its aim, then May 2018 is set to be something of a regulatory “Big Bang” for data processing businesses – in a single month, they will need to ensure they have everything necessary in place to comply with the new GDPR, NIS Directive and e-Privacy Regulation! Best get planning now…


Student Faces 10 Years In Prison For Creating And Selling Limitless Keylogger

A 21-year-old former Langley High School student, who won a Programmer of the Year Award in high school, pleaded guilty on Friday to charges of developing and selling custom key-logging malware that infected thousands of victims. Zachary Shames from Virginia pleaded guilty in a federal district court and now faces a maximum penalty of up to 10 years in prison for his past deeds. Shames was

Explained — What’s Up With the WhatsApp ‘Backdoor’ Story? Feature or Bug!

What is a backdoor?

By definition: "A backdoor is a feature or defect of a computer system that allows surreptitious unauthorized access to data." The backdoor may be in an encryption algorithm, a server or an implementation, and it doesn't matter whether it has previously been used or not. Yesterday, we published a story based on findings reported by security researcher Tobias Boelter that

Sophos Endpoint Protection scores high in SE Labs Q4 2016 testing

SE Labs has just released its Q4 2016 testing results, and we’re pleased to report that Sophos Endpoint Protection scored high. The results are a testament to SophosLabs’ diligence in protecting customers against real-time malware threats that are constantly evolving.

This week Sophos’ Bill Brenner and Matt Cooke conducted a Facebook Live chat to discuss the results.

The charts below, taken from the report, show how Sophos fared in the various SMB and Enterprise categories.

SMB category: (three charts from the report omitted)

Enterprise category: (three charts from the report omitted)

Find out how our next-gen protection can secure your business at Sophos.com.



RWC 2017 – Secure MPC at Google

This talk was given by Ben Kreuter and its focus was on the apparent disparity between what we research in academia versus what is required in the real world, specifically in the field of multi-party computation (MPC). MPC is the idea of allowing multiple parties to compute some function on their combined input without any party revealing anything about their input to the other parties (other than what can be learnt from the output alone).

While significant work has been done on making MPC efficient in practice (for example, the work of Yehuda Lindell et al. on high-throughput MPC which was presented by Lindell in the preceding talk), the focus tends to be on generic protocols (e.g. general logic circuits) with strong security guarantees (e.g. malicious security), which invariably leads to large computational overhead. In practice, we usually require only specific protocols, which can therefore be optimised, and comparatively weak security guarantees.

In the real world, network cost is the salient factor, rather than the speed of the protocol, since the parties involved in a computation often have to use networks (such as the Internet) that are shared with many other users at the same time, and so cannot rely on the network's full capabilities. The MPC at Google runs amongst, for example, mobile phones, laptops and servers, which introduces issues like battery constraints and the possibility of the computation not completing. These considerations, firmly grounded in the real world, are important when developing MPC techniques in research.


Business applications

A large portion of Google's revenue is generated by advertising: the tech giant, well known for its aptitude for accurately determining users' desired search results even when queries are expressed ineloquently, specialises in creating personalised adverts for its wide spectrum of users. The efficacy of an advert is generally measured by the proportion of its viewers who later become customers. Clearly this could be computed by businesses comparing their databases of customers' transactions with Google's databases of who has been shown which adverts. This, however, would be an invasion of privacy; instead, Google and the business can use MPC, more specifically a private set intersection protocol.

In a private set intersection protocol, the parties compute how large the intersection is between the sets input by each party, or even some function of the elements in the intersection. So if the business and Google run a private set intersection protocol on their data, they can determine how well the advertising went.

Roughly speaking, the MPC Google does in the real world is as follows: Google has a set $\{g_1,g_2,...,g_n\}$ of field elements which encodes a set of people who have been shown an advert for a certain product, and a business has a set $\{b_1,b_2,...,b_m\}$ of field elements which encodes a set of people who have been sold the product in question. Google raises each of its elements to a power $G$ and sends the set $\{g_1^G,g_2^G,...,g_n^G\}$ to the business. The business does the same with its elements for some exponent $B$ to get $\{b_1^B,b_2^B,...,b_m^B\}$, encrypts a set of binary vectors under Paillier encryption (which is additively homomorphic), one corresponding to each element in its set, encoding some other property of the sales (like the amount paid), and also computes the set $\{g_1^{GB},g_2^{GB},...,g_n^{GB}\}$. The business sends Google the set of pairs $\{(b_1^B,P(v_1)),(b_2^B,P(v_2)),...,(b_m^B,P(v_m))\}$ along with $\{g_1^{GB},g_2^{GB},...,g_n^{GB}\}$, and Google computes $\{b_1^{GB},b_2^{GB},...,b_m^{GB}\}$ and adds together all encrypted vectors $P(v_i)$ for which there exists some $j$ such that $b_i^{GB}=g_j^{GB}$. It sends this ciphertext back to the business, which decrypts and interprets the result.
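The commutative "blind twice" step at the heart of this exchange can be sketched in a few lines. The toy below computes only the size of the intersection, omitting the Paillier-encrypted value vectors, and works in the multiplicative group modulo the Mersenne prime $2^{127}-1$ purely for readability; the parameters and names are illustrative choices, not Google's actual ones, and the result is only passively secure, as discussed.

```python
import hashlib
import secrets

# Illustrative group: multiplicative group modulo a Mersenne prime.
# A real deployment would use a standard prime-order group instead.
P = 2**127 - 1

def hash_to_group(item: str) -> int:
    """Map an item into the group by hashing (toy hash-to-group)."""
    return int.from_bytes(hashlib.sha256(item.encode()).digest(), "big") % P

def psi_cardinality(google_items, business_items):
    """Passively secure private set intersection cardinality."""
    g_key = secrets.randbelow(P - 3) + 2   # Google's secret exponent G
    b_key = secrets.randbelow(P - 3) + 2   # the business's secret exponent B

    # Round 1: each party blinds its own hashed items with its key.
    g_once = [pow(hash_to_group(x), g_key, P) for x in google_items]
    b_once = [pow(hash_to_group(x), b_key, P) for x in business_items]

    # Round 2: each party raises the other's values to its own key,
    # so common items end up as identical values h^(G*B).
    g_twice = {pow(v, b_key, P) for v in g_once}
    b_twice = {pow(v, g_key, P) for v in b_once}
    return len(g_twice & b_twice)

shown = ["alice", "bob", "carol"]          # shown the advert
bought = ["bob", "carol", "dave"]          # bought the product
print(psi_cardinality(shown, bought))      # 2
```

In the full protocol described above, the business would additionally attach a Paillier ciphertext to each of its blinded items, and Google would homomorphically add the ciphertexts of the matching ones before returning the sum.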

This protocol is very simple, and it is only passively secure (in which players are assumed to execute the protocol faithfully but will possibly try to learn things by inspecting their communication transcripts). An interesting, perhaps somewhat orthogonal, concern to how we approach research from an academic point of view is that it is important to be able to convey the security and efficiency of our protocols to the lawyers, managers and software engineers who will eventually be sanctioning, authorising or implementing them. "The lawyers are interesting because you can show them a proof, and two plus two equals four is a negotiable statement here... managers usually trust your expertise...and software engineers are the worst because they already assume [the protocol] is impossible."

An alternative solution using garbled circuits was explored in the recent past, but it turned out that their use required some subtle assumptions regarding the computation and communication which would have made the protocol impractical.

Future work would involve getting a (not too much more expensive) maliciously secure protocol and developing the use of the homomorphic encryption to allow different functions to be computed on the data in the intersection.

Consumer applications

The Android keyboard app by Google, Gboard, logs what a user types so that it can guess words for auto-completion in the future. This data could be used for training machine learning models, and merging results from many local models would enable the formation of guessing algorithms that work well for everyone. However, to do this, the server would need to receive a large dataset of typed words from each phone so that this processing could be done. Clearly there is an issue of privacy here; moreover, there is also potentially a differential privacy issue.

This is clearly a good situation in which to use MPC. Each party masks its data using a basic additive secret-sharing scheme: if each party has a vector to input, then for every coordinate, every pair of parties agrees on some random field element, which one party adds to that coordinate of its vector and the other subtracts. When the parties send their masked vectors to Google, the masks therefore cancel when the vectors are added together.

In practice, they use a PRG and perform a key exchange at the beginning (one key for each pair of parties) to achieve the same effect with much smaller communication overhead. They also have a trick for dealing with device failures (which is important given the application).
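The basic pairwise-masking idea from the previous two paragraphs can be sketched as follows. This omits the PRG/key-exchange optimisation and the failure-recovery trick, both of which the deployed protocol needs; the field size and all names are my own illustrative choices.

```python
import secrets

FIELD = 2**31 - 1  # a prime modulus, chosen only for illustration

def mask_inputs(vectors):
    """Add cancelling pairwise masks to each party's input vector.

    For every pair of parties (i, j) with i < j and every coordinate,
    the pair agrees on a random field element: party i adds it, party j
    subtracts it. Each masked vector alone looks uniformly random, but
    summing all of them cancels every mask.
    """
    n, dim = len(vectors), len(vectors[0])
    masked = [list(v) for v in vectors]
    for i in range(n):
        for j in range(i + 1, n):
            for k in range(dim):
                r = secrets.randbelow(FIELD)
                masked[i][k] = (masked[i][k] + r) % FIELD
                masked[j][k] = (masked[j][k] - r) % FIELD
    return masked

parties = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
masked = mask_inputs(parties)
aggregate = [sum(col) % FIELD for col in zip(*masked)]
print(aggregate)  # [12, 15, 18] -- the same as summing the raw vectors
```

The PRG refinement replaces the per-coordinate random element with a stream expanded from a single shared key per pair, shrinking the communication to one key exchange.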


This talk provided helpful and relevant insight into the importance of matching what we research with what we require in the real world, which is, after all, one of the main reasons for having conferences such as Real World Crypto. Many of the talks are available to watch online here, and I would highly recommend doing so if interested.

WhatsApp Backdoor allows Hackers to Intercept and Read Your Encrypted Messages

Important Update — Most Security Experts argued, “It’s not a backdoor, rather it’s a feature,” but none of them denied the fact that, if required, WhatsApp or a hacker can intercept your end-to-end encrypted chats. Read detailed explanation on arguments in my latest article.

Most people believe that end-to-end encryption is the ultimate way to protect your secret communication from snooping, and

Sophos XG Firewall featured in CRN, Network World

Sophos XG Firewall is getting some great mentions in the media.

CRN has deemed it one of the 10 coolest network security products of 2016, while Network World featured it in a weekly roundup of “intriguing new products.”

CRN described the new features, support and availability:

“Sophos ramped up the capabilities in its XG Firewall line in December, adding Sophos Sandstorm for zero-day ransomware and targeted threat protection, Secure Web Gateway policy enforcement, and dynamic application traffic identification. The security vendor also extended its Security Heartbeat connection with the XG Firewall, restricting traffic to and from endpoints that it detects have irregular activity, and blocks infected endpoints from communicating with other devices or servers. Sophos also added Microsoft Azure support and updated its firewall rule screen for a better user experience. The next-gen firewall offering is available on-premise or in the cloud.”

Network World’s review said the following:

“Sophos XG Firewall with Sophos Sandstorm cloud sandboxing provides protection from zero-day threats like ransomware. New Synchronized Security and Security Heartbeat features strengthen protection and response to advanced threats, automatically identifying and isolating compromised endpoints on networks.”

Network World: new product of the week

Learn more about XG Firewall at Sophos.com, or to try the product yourself, sign up for a free trial of XG Firewall here.



Donald Trump appoints a CyberSecurity Advisor Whose Own Site is Damn Vulnerable

Former New York City Mayor Rudolph W. Giuliani has been appointed as a cyber security advisor for the President-elect Donald Trump, but it appears that he never actually checked the security defenses of his own company's website. Giuliani is going to head a new Cybersecurity Working group for the President-elect, and "will be sharing his expertise and insight as a trusted friend concerning

Phone-Hacking Firm Cellebrite Got Hacked; 900GB Of Data Stolen

The company that sells digital forensics and mobile hacking tools to others has itself been hacked. Israeli firm Cellebrite, the popular company that provides digital forensics tools and software to help law enforcement access mobile phones in investigations, has had 900 GB of its data stolen by an unknown hacker. But the hacker has not yet publicly released anything from the stolen data

Trump’s CIA Pick Is So Eager To Run The CIA That He’s Willing To Contradict Himself

WASHINGTON ― There aren’t enough Democrats in the Senate to block Rep. Mike Pompeo (R-Kan.) from being confirmed as CIA director, and no Republica…

Read more: Donald Trump, Terrorism, Trump Administration, Law, Political Science, Cia, Torture, National Security, Online Privacy, Dianne Feinstein, Mike Pompeo, Cia Director, Politics News

RWC 2017 – Is Password Insecurity Inevitable?

Fresh back from an enlightening trip across the pond, I wanted to write about one of my favourite talks, all about password (in)security, from this year's Real World Cryptography conference.

As we know:
  1. Passwords protect everything.
  2. Passwords are terrible.
But happily, Hugo Krawczyk from IBM Research spoke about some great new work to resolve these two seemingly incompatible statements. There were a lot of details in the talk that I'll have to miss out here (slides are available online). In particular, I'm going to focus on 'Part I: Take the burden of choosing and memorising passwords off humans'.

The basic idea - this isn't new - is to have the user memorise a single master password that they use to access a password store. The store then derives unique pseudorandom passwords for each service the user wants to access (Google, Facebook, etc.). The problem with this solution is that the password store becomes a single point of failure: if it is compromised, then an offline dictionary attack to find the master password will compromise all of the user's accounts at once.
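As a concrete (hypothetical) illustration of this baseline design, the store could derive each service password as a PRF of the master password and the service name. The function and encoding below are arbitrary choices for the sketch, and the weakness is exactly as described: anyone who compromises the store and sees one derived password can dictionary-attack the master password offline.

```python
import base64
import hashlib
import hmac

def site_password(master: str, service: str) -> str:
    """Derive a per-service password as PRF(master_password, service)."""
    tag = hmac.new(master.encode(), service.encode(), hashlib.sha256).digest()
    # Truncate and encode the tag into something a website will accept.
    return base64.urlsafe_b64encode(tag)[:16].decode()

# Deterministic per (master, service) pair; different across services.
print(site_password("correct horse", "google.com"))
print(site_password("correct horse", "facebook.com"))
```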

Krawczyk et al. suggest an improvement: SPHINX, which amusingly stands for "a password Store that Perfectly Hides from Itself (No eXaggeration)". The first idea is for the password store to not keep hold of (even a hash of) the master password - instead it has an independent secret key $k$, and any time the user wants to log in to a service $S$, they send the master password $pwd$ to the store, the store computes a PRF $PRF(k, pwd | S)$ and this will be sent to $S$ as the user's password for $S$. This means that if the store is compromised, the master password and the account passwords can't be learned unless the user communicates with the store. So this works well if the store is in local, offline hardware, where the user is unlikely to use the store after it is compromised by an attacker.

However, the authors go further and replace the PRF with an oblivious PRF. This means the store computes an "encrypted" version of $PRF(k, pwd | S)$ from an "encrypted" $pwd|S$, so doesn't learn the plaintext values of the master password or the service password. In practice this can be achieved by the user (i.e. the user's machine) hashing the string $pwd | S$ into an element $g$ of a Diffie-Hellman group, then computing $h = g^r$, where $r$ is a fresh, random exponent, and sending $h$ to the password store. The store's secret key is an exponent $a$, so it computes $h^a$ and sends this back to the user. The user removes the blinding exponent $r$ (i.e. computes $(h^a)^{r^{-1}} = g^a$) and the result is the unique password for $S$. Now even when the password store is compromised and even if the user communicates with the store, the master password and the account passwords can't be learned.
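Putting the two rounds together, the blinded exchange might look like the sketch below. The tiny safe-prime group is for readability only (a real implementation would use a large standard group such as an elliptic curve), and the class and function names are mine, not taken from the SPHINX paper.

```python
import hashlib
import secrets

# Tiny safe-prime group, for illustration only.
P = 2039   # safe prime: P = 2*Q + 1
Q = 1019   # prime order of the quadratic-residue subgroup

def hash_to_group(s: str) -> int:
    """Hash a string into the order-Q subgroup (hash, then square mod P)."""
    t = int.from_bytes(hashlib.sha256(s.encode()).digest(), "big") % P
    return pow(t, 2, P)

class SphinxStore:
    """The password store: holds only an independent secret exponent a."""
    def __init__(self):
        self.a = secrets.randbelow(Q - 1) + 1

    def evaluate(self, blinded: int) -> int:
        # The store sees only a blinded group element, learning nothing
        # about the master password or the service name.
        return pow(blinded, self.a, P)

def derive_password(store: SphinxStore, master: str, service: str) -> int:
    g = hash_to_group(master + "|" + service)
    r = secrets.randbelow(Q - 1) + 1        # fresh blinding exponent
    blinded = pow(g, r, P)
    response = store.evaluate(blinded)      # (g^r)^a
    r_inv = pow(r, -1, Q)                   # unblind: ((g^r)^a)^(1/r) = g^a
    return pow(response, r_inv, P)

store = SphinxStore()
pw1 = derive_password(store, "hunter2", "example.com")
pw2 = derive_password(store, "hunter2", "example.com")
print(pw1 == pw2)   # True: deterministic despite fresh blinding each time
```

Note that `evaluate` never sees the master password or the service name, only a random-looking group element, which is exactly the oblivious property described above.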

In principle an attacker could recover all the account passwords by compromising both the password store and a service $S$, learning the secret key $a$ and the service password $g^a$, computing $g = H(pwd|S)$ and performing an offline dictionary attack to find $pwd|S$. Then for any other service $S'$, the password can be computed via $H(pwd|S')^a$. But as long as $S$ follows good practice and only stores a hash $H'(g^a)$ of the service password, this attack fails: an offline dictionary attack to recover $g^a$ is infeasible, as it's essentially a random group element.

There are no particularly expensive computations involved in using SPHINX, the communication between the user and SPHINX does not need to be secure (so it could be online somewhere) and the store will work regardless of what password protocol is used by the service, so it's extremely flexible. SPHINX therefore strikes me as both useful and practical, which is surely the definition of Real World Cryptography.

Belgian Privacy Commission launches public consultation on its draft data protection impact assessment guidance

The Belgian Data Protection Authority (the “Privacy Commission“) has launched a public consultation about its draft recommendation regarding data protection impact assessments (“DPIA“).

The purpose of the Privacy Commission’s recommendation is to provide companies with answers to practical questions raised by DPIAs. Note that the Article 29 Working Party (WP29) will also publish a recommendation regarding DPIAs in the coming months, the elements of which will be integrated into the final version of the Privacy Commission’s recommendation.

Stakeholders are invited to submit their comments before 28 February 2017 at commission@privacycommission.be

In its draft recommendation, the Privacy Commission sets out the following requirements that must be met in order to accomplish a DPIA that is compliant with the requirements of article 35(7) of the GDPR:

  • a systematic description of the envisaged processing operations and the purposes of the processing;
  • an assessment of the necessity and proportionality of the processing;
  • an assessment of the risks;
  • the measures envisaged to address the risks.

A systematic description of the envisaged processing operations and the purposes of the processing

The purposes as well as the processing have to be described in a complete, coherent and clear way. This means that general purposes such as “enhancing users’ experience” or “IT security” should be avoided. According to the Privacy Commission, this description has to be drafted in the light of the obligation to keep a record of the processing activities also found in the GDPR.

In addition to the description of the processing activities and the purposes, the DPIA should also cover:

  • the categories of data subjects;
  • the categories of data recipients including recipients outside the EU;
  • data transfers to organisations or countries outside the EU, including the documents providing adequate safeguards;
  • the retention period for each category of data, if feasible.

An assessment of the necessity and proportionality of the processing

The DPIA has to assess the proportionality and necessity of the processing activity by specifying the reasons why the processing is necessary in itself and why each processing activity is necessary in light of its purpose. If the purpose of a processing activity can be achieved in different ways, the Privacy Commission expects the data controller to choose the least privacy-intrusive one. Moreover, the efficiency of the processing has to be assessed.

An assessment of the risks

The Privacy Commission defines risk as ‘the probability that a threat arises and creates a specific impact’.

According to ISO Guide 73:2009, referred to by the Privacy Commission, a risk assessment is the overall process of risk identification, risk analysis and risk evaluation. ‘Risk identification’ refers to the process of finding, recognising and describing risks, while ‘risk analysis’ refers to the process of understanding the nature of the risks and determining the level of risk. Finally, ‘risk evaluation’ refers to the process of comparing the results of risk analysis with risk criteria to determine whether the risk and/or its magnitude is acceptable or tolerable.

When performing a risk assessment, the controller must differentiate between the inherent risk that can be circumscribed and the residual risk that cannot be avoided.

The risks to assess are the high risks to the rights and freedoms of natural persons. According to the WP29, these include the right to privacy as well as other fundamental rights and freedoms established by other legal instruments.

Recital 75 of the GDPR refers to specific types of processing that could create a risk, citing among others processing that can lead to discrimination, the processing of sensitive personal data, and the processing of minors’ data.

Following Recital 76 of the GDPR, the risks assessment has to be a fact-based analysis that assesses the risks in the context of each specific processing.

The data controller is free to choose a methodology to assess the risks, as long as it is objective and safeguards the confidentiality of data. In this context, the Privacy Commission has detailed the minimum requirements of a DPIA in an annex to its guidance. Such requirements include the fact that it must be carried out on the basis of a methodology, in a structured and understandable manner, tailor-made to the specific context and under management supervision. The Privacy Commission recommends using pre-existing proven methodologies instead of creating new ones each time.

The measures envisaged to address the risks

Furthermore, the DPIA should not only assess the risks but must also explain the risk mitigation measures such as security measures, compliance measures or data safeguard measures.

When is a DPIA required?

As per article 35 of the GDPR, a DPIA is only required when the processing could create high risks to the rights and freedoms of data subjects.  According to the Privacy Commission a “high risks processing” has to be understood as a processing activity that is likely to have significant adverse consequences to the rights and freedoms of data subjects if such data were not correctly processed. This only concerns inherent risks, not residual risks.

Article 35(3) of the GDPR establishes when a DPIA is mandatory no matter the risks the processing activity may create (e.g. in case of ‘profiling’ or the processing on a large scale of sensitive data).

In accordance with article 35 (4) and (5) of the GDPR, the Privacy Commission has also adopted both a white list of processing activities for which no DPIA is needed (Annex 3) and a black list of processing activities that always require a DPIA (Annex 2).

Unsurprisingly, the black list (annex 2) covers processing of biometric data, genetic data and data that allows the profiling of data subjects. However, it also covers other types of processing activities such as:

  • data collected via third party that could lead to a refusal or a termination of the services;
  • data that could compromise the physical health of data subjects in case of a data breach;
  • the processing of financial or sensitive data for secondary purposes for processing that is not based on consent or a legal obligation;
  • data that are publicly disclosed.

The white list (Annex 3) covers notably the following data processing activities:

  • payroll administration;
  • staff management when no health data, sensitive data or data related to criminal offenses are processed;
  • accounting purposes;
  • shareholders and associates management;
  • visitors access control.

 

The Belgian Data Protection Authority (the "Privacy Commission") has launched a public consultation about its draft recommendation regarding data protection impact assessments ("DPIA").

The purpose of the Privacy Commission's recommendation is to provide companies with answers to practical questions raised by DPIAs. Note that the Article 29 Working Party (WP29) will also publish its own recommendation on DPIAs in the coming months; its elements will be integrated into the final version of the Privacy Commission's recommendation.

Stakeholders are invited to submit their comments to commission@privacycommission.be before 28 February 2017.

In its draft recommendation, the Privacy Commission sets out the following requirements that must be met for a DPIA to comply with article 35(7) of the GDPR:

  • a systematic description of the envisaged processing operations and the purposes of the processing;
  • an assessment of the necessity and proportionality of the processing;
  • an assessment of the risks;
  • the measures envisaged to address the risks.
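These four elements could be captured in a simple record structure. The sketch below is purely illustrative: the field names and the completeness check are assumptions, not anything prescribed by the GDPR or the Privacy Commission.

```python
from dataclasses import dataclass, field

@dataclass
class DPIARecord:
    """Illustrative container for the four elements required by article 35(7) GDPR."""
    processing_description: str   # systematic description of operations and purposes
    necessity_assessment: str     # why the processing is necessary and proportionate
    risks: list = field(default_factory=list)        # identified risks to data subjects
    mitigations: list = field(default_factory=list)  # measures envisaged to address the risks

    def is_complete(self) -> bool:
        """A DPIA must address all four elements before it can be considered complete."""
        return bool(self.processing_description and self.necessity_assessment
                    and self.risks and self.mitigations)
```

A record missing any of the four elements would fail the completeness check, mirroring the cumulative nature of the article 35(7) requirements.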

A systematic description of the envisaged processing operations and the purposes of the processing

The purposes as well as the processing have to be described in a complete, coherent and clear way. This means that general purposes such as "enhancing users' experience" or "IT security" should be avoided. According to the Privacy Commission, this description has to be drafted in light of the obligation, also found in the GDPR, to keep a record of processing activities.

In addition to the description of the processing activities and the purposes, the DPIA should also cover:

  • the categories of data subjects;
  • the categories of data recipients including recipients outside the EU;
  • data transfers to organisations or countries outside the EU, including the documentation providing adequate safeguards;
  • the retention period for each category of data, if feasible.

An assessment of the necessity and proportionality of the processing

The DPIA has to assess the proportionality and necessity of the processing activity by specifying the reasons why the processing is necessary in itself and why each processing operation is necessary in light of its purpose. If the purpose of the processing activity can be achieved in different ways, the Privacy Commission expects the data controller to choose the least privacy-intrusive one. Moreover, the efficiency of the processing has to be assessed.

An assessment of the risks

The Privacy Commission defines risk as 'the probability that a threat arises and creates a specific impact'.

According to ISO Guide 73:2009, referred to by the Privacy Commission, a risk assessment is the overall process of risk identification, risk analysis and risk evaluation. 'Risk identification' refers to the process of finding, recognizing and describing risks, while 'risk analysis' refers to the process of understanding the nature of the risks and determining the level of risk. Finally, 'risk evaluation' refers to the process of comparing the results of risk analysis with risk criteria to determine whether the risk and/or its magnitude is acceptable or tolerable.
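As a rough sketch, the three-step process, combined with the Commission's probability-and-impact definition of risk, could be modelled as follows. The 1–5 scales, the multiplicative scoring and the acceptance threshold are purely illustrative assumptions; the guidance itself leaves the methodology to the controller.

```python
def risk_level(probability: int, impact: int) -> int:
    """Risk analysis: determine the level of risk from likelihood and impact (1-5 scales assumed)."""
    return probability * impact

def evaluate(level: int, acceptance_threshold: int = 6) -> str:
    """Risk evaluation: compare the analysed level of risk against a risk criterion."""
    return "acceptable" if level <= acceptance_threshold else "needs mitigation"

# Risk identification: finding and describing the risks for a given processing operation,
# here as a mapping of risk description -> (probability, impact).
identified = {
    "data breach exposing health data": (2, 5),
    "excessive retention of contact data": (3, 2),
}
results = {name: evaluate(risk_level(p, i)) for name, (p, i) in identified.items()}
```

Under these assumed scales, a low-probability but high-impact breach still exceeds the threshold, while a moderate retention risk does not.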

When performing a risk assessment, the controller must differentiate between the inherent risk that can be circumscribed and the residual risk that cannot be avoided.

The risks to assess are the high risks to the rights and freedoms of natural persons. According to the WP29, these rights and freedoms include the right to privacy as well as other fundamental rights and freedoms established by other legal instruments.

Recital 75 of the GDPR refers to specific processing operations that could create a risk, citing among others processing that can lead to discrimination, the processing of sensitive personal data, and the processing of minors' data.

Following Recital 76 of the GDPR, the risk assessment has to be a fact-based analysis that assesses the risks in the context of each specific processing operation.

The data controller is free to choose a methodology to assess the risks, as long as it is objective and safeguards the confidentiality of data. In this context, the Privacy Commission has detailed the minimum requirements of a DPIA in an annex to its guidance. Such requirements include the fact that it must be carried out on the basis of a methodology, in a structured and understandable manner, tailor-made to the specific context and under management supervision. The Privacy Commission recommends using pre-existing proven methodologies instead of creating new ones each time.

The measures envisaged to address the risks

The DPIA should not only assess the risks but must also explain the risk mitigation measures, such as security measures, compliance measures or data safeguard measures.

When is a DPIA required?

As per article 35 of the GDPR, a DPIA is only required when the processing could create high risks to the rights and freedoms of data subjects. According to the Privacy Commission, "high-risk processing" has to be understood as a processing activity that is likely to have significant adverse consequences for the rights and freedoms of data subjects if the data were not correctly processed. This concerns only inherent risks, not residual risks.

Article 35(3) of the GDPR establishes when a DPIA is mandatory no matter the risks the processing activity may create (e.g. in case of 'profiling' or the processing on a large scale of sensitive data).

In accordance with article 35(4) and (5) of the GDPR, the Privacy Commission has also adopted both a white list of processing activities for which no DPIA is needed (Annex 3) and a black list of processing activities that always require a DPIA (Annex 2).

Unsurprisingly, the black list (Annex 2) covers the processing of biometric data, genetic data and data that allows the profiling of data subjects. However, it also covers other types of processing activities, such as:

  • data collected via a third party that could lead to a refusal or termination of services;
  • data that could compromise the physical health of data subjects in case of a data breach;
  • the processing of financial or sensitive data for secondary purposes, where the processing is not based on consent or a legal obligation;
  • data that are publicly disclosed.

The white list (Annex 3) covers notably the following data processing activities:

  • payroll administration;
  • staff management when no health data, sensitive data or data related to criminal offenses are processed;
  • accounting purposes;
  • shareholders and associates management;
  • visitor access control.
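The decision logic described above can be sketched as a simple check. The list contents below are a small, illustrative subset of Annexes 2 and 3, and the fallback test for activities on neither list stands in for the controller's own high-risk assessment under article 35(1).

```python
# Illustrative subsets of the Privacy Commission's Annex 2 (black list) and Annex 3 (white list).
BLACK_LIST = {"biometric data processing", "genetic data processing", "profiling"}
WHITE_LIST = {"payroll administration", "accounting", "visitor access control"}

def dpia_required(activity: str, high_inherent_risk: bool) -> bool:
    """Black-listed activities always require a DPIA; white-listed ones never do.
    Otherwise, it depends on whether the processing creates a high inherent risk."""
    if activity in BLACK_LIST:
        return True
    if activity in WHITE_LIST:
        return False
    return high_inherent_risk
```

Note that only the inherent risk feeds the decision, consistent with the Commission's position that residual risks are not what triggers the DPIA obligation.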

 
