Privacy Engineering
Brings together the disciplines of privacy professionals and software engineers. It is a specialty discipline of systems engineering focused on achieving freedom from conditions that can create problems for individuals, with unacceptable consequences, arising from the system as it processes PII. There are three objectives:
Data Governance
This concept covers understanding personal and non-personal data, how each is used, and the associated privacy risks; putting safeguards in place that align with privacy objectives; creating a common taxonomy for data; identifying business objectives for data; knowing the applicable laws and policies; and implementing supporting technology.
Technological Controls
Technology-centric privacy governance links or translates internal controls into technology; privacy engineering is a result of this. Translated technological controls include access controls that limit users, minimizing data, and deleting older data.
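The translated controls above can be sketched in code. This is a minimal illustration, not a real standard: the field names, roles, and the 90-day retention window are all assumptions.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical policy; field names, roles, and the 90-day retention
# window are illustrative assumptions.
RETENTION = timedelta(days=90)
ALLOWED_FIELDS = {"user_id", "email", "created"}   # data minimization
AUTHORIZED_ROLES = {"privacy_officer", "admin"}    # access limiting

def minimize(record: dict) -> dict:
    """Drop any field not explicitly allowed (e.g., an unneeded SSN)."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}

def can_access(role: str) -> bool:
    """Limit which users may read PII."""
    return role in AUTHORIZED_ROLES

def purge_expired(records: list[dict], now: datetime) -> list[dict]:
    """Delete records older than the retention window."""
    return [r for r in records if now - r["created"] <= RETENTION]

now = datetime.now(timezone.utc)
records = [
    {"user_id": 1, "email": "a@example.com", "ssn": "123-45-6789",
     "created": now - timedelta(days=10)},
    {"user_id": 2, "email": "b@example.com", "ssn": "987-65-4321",
     "created": now - timedelta(days=200)},
]
fresh = purge_expired(records, now)        # the 200-day-old record is deleted
cleaned = [minimize(r) for r in fresh]     # the "ssn" field is stripped
```

Each function maps to one translated control: `can_access` limits users, `minimize` keeps only needed data, and `purge_expired` deletes older data.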
Engineering life cycle
The software development life cycle (SDLC) is a process used by engineers to design, develop, test, and maintain a system. The stages of the SDLC include Planning (goals are defined and decisions are made on what to build), Design (create a blueprint), Development (coding), Testing, Deployment, and Maintenance.
For a privacy expert, the goal is to embed privacy-protective solutions into the engineering life cycle. This includes translating privacy into the engineering culture and enforcing privacy safeguards naturally through technology solutions. In simple terms, data privacy should be considered at every stage.
Design Pattern
Describes shared solutions to recurring problems. There are four elements of a design pattern:
o Pattern Name: References the pattern
o Problem Description: Describes what is intended to be solved
o Solution: Describes the elements of the design, their relationships, roles and interactions
o Consequences: Results from applying the pattern and any trade-offs that occur.
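The four elements can be made concrete with a small, hypothetical, privacy-flavored example: a decorator that redacts email addresses before they reach a log. The pattern name, regex, and function names here are illustrative assumptions, not an established pattern.

```python
import re

# Pattern Name: PII-masking logger (hypothetical, for illustration).
# Problem Description: log messages may leak email addresses.
# Solution: a decorator wraps the logging function and redacts emails first.
# Consequences: safer logs; the trade-off is a small per-message overhead.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def mask_pii(func):
    def wrapper(message: str) -> str:
        return func(EMAIL_RE.sub("[REDACTED]", message))
    return wrapper

@mask_pii
def log(message: str) -> str:
    return message  # stand-in for a real logging backend

out = log("Password reset requested by a@example.com")
```

Mapping the four elements onto a concrete sketch like this is one way to check that a proposed pattern is actually complete.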
Dark Pattern
Solutions that manipulate individuals into giving up information.
Roach Motel
A Dark Pattern where a user easily gets into a situation but then cannot get out. An example would be a streaming service that allows users to sign up for a free trial with just a few clicks and minimal information. However, when the user wants to cancel the subscription before being charged, the service buries the cancellation option behind multiple menus or requires a phone call during limited business hours.
Privacy Zukering
A Dark Pattern where privacy settings are made complex for the end user by poorly presenting the available options, encouraging users to reveal more information than intended. An example would be a social media platform that scatters its privacy settings across many confusing menus, so users end up sharing more publicly than they realize.
Sneak into Basket
A Dark Pattern where, when a user makes a purchase online, the site sneaks an additional item into the basket. An example of this would be:
A travel booking website allows a user to select a flight and proceed to the checkout page. However, during the booking process, travel insurance is pre-selected and added to the basket, and the user must notice and remove it to avoid the extra charge.
Trick questions
Users are presented with misleading, confusing, or ambiguous language in forms or settings, often causing them to unintentionally agree to something they don’t want or understand. An example of this is a newsletter sign-up form on an e-commerce website that includes the following checkbox:
[ ] I do not want to receive promotional emails.
This is confusing because users may quickly glance at the checkbox and think they are agreeing not to receive emails, but leaving it unchecked means they are actually agreeing to receive promotional content. The negative phrasing tricks the user into inadvertently opting in to marketing emails.
Alternatively, there could be two checkboxes:
[ ] Yes, I want to receive offers and promotions.
[ ] No, I don’t want to miss out on offers and promotions.
Both options are framed to pressure the user into agreeing to receive emails, making it unclear how to opt out entirely, thus tricking users into subscribing.
Price Comparison Prevention
Where a website or service deliberately makes it difficult for users to compare prices between similar products or services, preventing them from finding the best deal. An example of this is an online electronics store that sells multiple models of smartphones but lists each model with different units, bundles, and naming, omits key specifications, and hides fees until checkout.
As a result, users cannot easily assess which option is the best deal and may end up paying more for a product that appears similar but is actually less valuable or more expensive due to hidden costs.
Misdirection
Where a website or service deliberately focuses the user’s attention on one thing to distract them from something else, often leading them to take actions they might not want or intend. An example of this is when a user tries to unsubscribe from an email service and clicks the “Unsubscribe” link. This link takes them to a page with two large buttons:
*“Stay Subscribed” (in bold, attention-grabbing colors)
*“Unsubscribe” (small, dull, or hidden in a less obvious part of the page)
The design and layout are meant to draw the user’s eye toward the “Stay Subscribed” button, making it easy to accidentally stay signed up while the actual “Unsubscribe” button is hard to find or seems less appealing. This misdirects the user’s attention, influencing them to make a choice that benefits the service rather than what they intended to do.
Hidden Costs
Where unexpected fees or charges are added late in the purchasing process, often just before completing a transaction, surprising the user and potentially increasing the final cost. An example of this would be an online retailer that advertises a product at a very competitive price. However, when the user proceeds to checkout, shipping, handling, and service fees are added just before payment.
As a result, the user feels committed to completing the purchase despite the unexpected extra costs because they have already invested time in the process, or they may not realize they were misled until the very end.
Bait and switch
Where a user is lured into taking a specific action with the promise of one result, but is then presented with something different, often less desirable or more costly. An example of this would be an e-commerce site that advertises a popular product, like a laptop, at a deep discount to attract users. When a user clicks on the offer and adds the product to their cart, they receive a notification that the laptop is “out of stock.” However, the website then immediately offers a similar but more expensive model as an alternative, pushing the user to buy it instead.
The original offer (the “bait”) was never truly intended to be fulfilled, and the user is switched to a higher-priced item, feeling pressured or misled into spending more.
Confirmshaming
Where users are guilt-tripped or shamed into taking an action, often by phrasing the alternative or opt-out option in a way that makes the user feel bad for not agreeing. An example of this is a website pop-up that asks users to sign up for a newsletter with the following options:
[ ] Yes, I want to stay updated with the latest news and offers!
[ ] No, I don’t care about saving money.
By framing the opt-out option in a way that makes the user feel irresponsible or foolish, the website tries to pressure users into subscribing. The intention is to make the user feel guilty for saying “no” by implying they are missing out or being indifferent to valuable opportunities.
Disguised Ads
Where advertisements are made to look like regular content, such as articles, product listings, or user-generated content, in order to trick users into clicking on them, believing they are something else. An example of this is a news website that displays a list of articles on its homepage. Mixed in with the real articles are sponsored posts or ads styled exactly like the articles, with only a small, hard-to-notice label like “Sponsored” or “Ad.”
Users might click on these thinking they are reading a genuine article, but instead, they are taken to a promotional page or third-party site. The ad is disguised to appear like regular content, making it harder for users to distinguish between genuine information and advertisements.
Forced continuity
Where users are signed up for a service with a free trial or discounted period, but after the trial ends they are automatically charged without being clearly reminded or given an easy way to cancel before billing begins. An example of this is a streaming service that offers a 30-day free trial where users sign up by entering their credit card information. However, before the trial ends, there is no reminder that billing is about to begin and no simple cancellation path, so the card is charged automatically.
This forces users into paid subscriptions, often without clear consent or easy cancellation options, trapping them in the service longer than intended.
Friend Spam
Where a service tricks or pressures users into granting access to their contact list, and then uses that access to send promotional messages or invites to the user’s friends without clear consent. An example of this would be a social media app that encourages users to “Find friends” by prompting them to connect their email or phone contacts. However, the app doesn’t make clear that by doing so, it will send invitations or promotional messages to every contact on the user’s behalf.
As a result, friends of the user receive unsolicited messages, potentially causing embarrassment or frustration for the user, as it appears they intentionally spammed their contacts.