Tag: general

  • The end of the privacy shield: what next for international data transfers?

    Earlier this month, the Court of Justice of the European Union issued a judgment that will have major implications for all businesses which transfer personal data internationally.

    This isn’t just a matter for multinationals or tech companies; international transfers are crucial for all sorts of businesses, large and small. They can happen when businesses store data in the cloud, send data to other organisations or engage suppliers based outside of Europe.
    The latest decision came in the long-running legal battle between Austrian privacy campaigner Max Schrems and social media giant Facebook, which has already had a huge impact on international transfers of personal information. Back in 2013, while he was still a student, Mr Schrems made a complaint against Facebook.
    His complaint arose from the disclosures of whistle-blower Edward Snowden, which revealed that US authorities routinely intercepted and retained information from social media companies. A case was brought in Ireland, where Facebook has its EU headquarters, and related cases have been proceeding through the courts ever since.
    The complaint revolves around the validity of transfers of personal data from the EU to the US. The General Data Protection Regulation, like its predecessor the 1995 Data Protection Directive, contains a broad prohibition on the transfers of personal data outside the EU. However, this prohibition can be overcome in various ways.
    The most popular of these are where the transfer is to a country which the European Commission has decided gives adequate protection to personal data (a so-called ‘adequacy decision’), or where the data exporter and the data importer agree to a contract containing European Commission-approved standard contract clauses. Both of these methods were under scrutiny in this case.
    Mr Schrems’ original case led to a ruling in 2015 that the previous ‘Safe Harbor’ framework for data transfers to the US did not offer adequate protection for individuals in Europe.
    The latest case has moved on to consider the validity of both the standard contractual clauses and the replacement for Safe Harbor, the EU/US Privacy Shield, which in reality is a partial adequacy decision for certain companies in the US. Mr Schrems argued that neither the EU/US Privacy Shield nor the standard contractual clauses offered adequate protection to his data once it had been transferred to the US, because of the wide powers of US authorities over the personal data of non-US citizens.
    In the most eye-catching part of the judgment, the Court ruled that the EU/US Privacy Shield does not offer appropriate safeguards for data protection, because of the US government’s wide powers to collect and review personal data held in its jurisdiction. Accordingly, the Court annulled the adequacy decision in respect of the EU/US Privacy Shield.
    Data transfers under that framework will no longer be valid. As with the similar ruling in 2015 in respect of Safe Harbor, the EU Commission and US authorities may try again to find a replacement scheme, but this appears increasingly difficult, particularly in light of the current US administration’s protectionist agenda.
    Perhaps more importantly, however, the Court also ruled on the use of standard contractual clauses, which can be used to transfer data anywhere in the world, not just to the US. To the huge relief of many businesses, the Court upheld the use of standard contractual clauses as a means of validating transfers outside the EU.
    But in doing so, the Court emphasised that putting in place standard contractual clauses alone is not enough to ensure adequate protection. Instead, data exporters must also consider the legal context in the recipient country. Where the laws of the recipient do not provide adequate protection, the use of standard contractual clauses is not enough, and the data exporter must not transfer the data.
    So what does all of this mean for businesses? In some ways, we’ve been here before. In respect of the Privacy Shield, the current situation is almost identical to 2015, when the earlier judgment annulled the Safe Harbor framework. At that time, European regulators urged a cautious approach and emphasised that businesses should not immediately stop transferring data, which could itself have a negative impact on individuals.
    But that was under the old regime, before the General Data Protection Regulation and the significant strengthening of data protection rules.
    The UK regulator, the Information Commissioner’s Office, has again taken a cautious approach and stated that, at least for now, businesses can continue existing transfer arrangements using Privacy Shield, but should not start new transfers under the now-defunct framework. Other European regulators have taken a stronger approach and recommended businesses switch now to an alternative method of transfer or stop exporting data altogether.
    Any businesses that transfer personal data to the US using the Privacy Shield framework would be wise to immediately take stock. They should assess the situation to understand the scale of the issue and consider what steps to take to remove any data protection risk.
    This may involve using another method to validate those data transfers or considering whether alternative solutions exist. But they should be careful not to simply stop data transfers on the basis of this judgment, without taking into account all of the potential wider consequences.
    The use of standard contractual clauses should also be reviewed. This decision means that international data transfers are likely to become subject to much greater scrutiny and will potentially become more difficult. And with the post-Brexit transition period ending on 31 December 2020, data transfers between the EU and the UK will become subject to these strict rules from next year. Now really is the time for businesses to be reviewing all of their international data flows.

  • How Long Does Rehab Take? Understanding Recovery Timeframes in Australia – Health Cages

    If you’re considering rehab for alcohol or drug use, one of the first questions you might ask is: how long will it take? While there’s no universal answer, having a general idea of program lengths and the factors that influence them can help you prepare for the journey ahead.

    This guide explains the common timeframes for rehab in Australia, what influences treatment duration, and why flexibility is key to long-term recovery.

    Typical Rehab Timeframes in Australia

    Rehabilitation programs are designed to meet individual needs. Some people benefit from shorter stays, while others require more time and support. Below is a general overview of the most common treatment durations:

    • Detox Programs: 5 to 10 days
      Often the first step, detox supports safe withdrawal from substances such as alcohol, benzodiazepines or opioids. It is typically conducted in a supervised environment.
    • Short-Term Rehab: 2 to 4 weeks
      This may suit individuals in the early stages of addiction or those unable to step away from family or work commitments for extended periods.
    • Medium-Term Rehab: 4 to 8 weeks
      Provides more time for therapeutic work, skill-building and self-reflection. A good fit for people needing deeper support but not requiring long-term care.
    • Long-Term Rehab: 2 to 3 months or more
      Recommended for individuals with long-standing addiction, complex trauma or co-occurring mental health conditions. This format allows for more comprehensive care and relapse prevention planning.

    What Affects How Long Rehab Should Last?

    There are several factors that influence how long someone should stay in rehab:

    • Type of substance and length of use
    • Whether there are underlying mental health concerns
    • How quickly the person responds to treatment
    • Support available at home or in the community
    • The individual’s goals and motivation

    It’s also important to note that completing a shorter program fully can be more beneficial than starting a longer one and not following through.

    The Value of Flexibility in Recovery

    Everyone’s recovery journey is different. Some people benefit from focused, shorter programs that offer clarity and structure, while others need time and space to gradually work through deeper issues.

    What matters most is choosing a rehab facility that tailors treatment to the person — not one that prescribes a fixed timeframe regardless of the situation.

    Flexible Programs at Southern Highlands Addiction Retreat

    Southern Highlands Addiction Retreat offers private, customised treatment plans designed to meet you where you are. Located in Canyonleigh, in the Southern Highlands of NSW, the retreat is a peaceful and supportive environment just two hours from both Sydney and Canberra.

    Program durations are flexible, ranging from short-term detox and rehab to extended stays for deeper healing. Treatment may include:

    • Medically supervised detox
    • One-on-one and group therapy
    • Holistic and evidence-based treatment approaches
    • Support with superannuation access to fund care (learn more about costs)

    If you’re unsure how much time you’ll need, the team can help assess your goals and recommend the right path.

    Visit the contact page or call 0485 880 816 to speak with the team.

    Recovery doesn’t start with all the answers. It starts with one honest question, and the willingness to ask for support.

  • Startup founders are busy, but they still need to prioritise privacy

    Privacy is a human right. Of all the reasons startups fail, make sure privacy isn’t one of them.

    Nigel Jones, ex-Google lawyer and now co-founder of the Privacy Compliance Hub, says the early days of a growing startup are the perfect time to get your ducks in a row when it comes to privacy compliance.
    It’s easy to get caught up in the fast-paced world of building a startup. Most entrepreneurs start with the spark of an idea, or a problem they think they can solve. They hire engineers to build a product. They start attracting customers and/or clients and make plans for future growth, investment and features on the product roadmap. And they begin to collect data. But it’s rare that they’ve stopped for a second to think about privacy.

    That’s a problem. Because privacy really matters.

    And it’s down to businesses of all sizes and from all sectors to protect this fundamental human right.
    Making a public commitment to privacy is good for business too. A whopping 92% of the British public say they feel uncomfortable about the number of businesses that collect data about them, and 41% say they’ll never return to a business after a breach. Investors are interested in whether companies are complying with privacy laws, and place emphasis on this factor when doing due diligence. Falling foul of the UK General Data Protection Regulation (GDPR) can, after all, cause significant reputational damage and come with heavy penalties. The Information Commissioner’s Office (ICO) has the power to fine a company up to 4% of its worldwide turnover, or £17.5m (whichever is higher), for breaches of the UK GDPR. It has also recently gone public with its intention to name and shame companies moving forward.

    With that in mind, here’s how startup leaders can prioritise privacy from day one.

    Focus on your people

    It’s easier to build an effective privacy culture when you’re overseeing a team of 50 rather than 500. And with 88% of data breaches down to human error, it makes sense to centre your privacy programme on the people within your organisation. Make sure there’s a comprehensive training strategy in place, with frequent refresher sessions. That’s particularly important with the shift to hybrid working, as phishing attacks are increasing in frequency and complexity. Once your team is big enough, appoint privacy champions in every department to keep compliance on the agenda. When employees understand privacy, they care about it and are willing to play their part in protecting it every day.

    Consider your processes

    The influx of Big Data has opened up endless opportunities for innovation in the startup world. But it does become problematic when it comes to privacy, not least because many businesses quickly become overrun by information. Start with an audit of what personal data the business collects, how it’s processed, where it is kept (and for how long), and what happens to it when it’s no longer needed. This exercise will help you streamline workflows to ensure that data is being processed in accordance with the law at every stage. You’ll also have the information you need to be transparent with customers about the data you’re collecting and why – a key requirement of the UK GDPR.

    Be careful about sharing data externally

    It’s a fact of modern day business that organisations increasingly share data with each other. But the UK GDPR requires you to only share personal information with companies that take privacy as seriously as you do. If one of your partners has a sloppy approach to compliance, which leads to a data breach that affects your customers, you risk a hefty fine and reputational damage. Ask the question whether it’s necessary for personal information to be shared externally at all. If it is, make sure your team is doing the necessary due diligence and that there’s an appropriate agreement in place before you start sharing data with another organisation. The buck always stops with you, even if a breach is solely down to your partner’s actions.

    Get the executive team on board

    Too often, privacy is seen as the responsibility of an IT or legal lead and not something that involves the entire organisation. Employees are more likely to follow your lead if you make it clear that this is something you and the rest of the leadership team care about. Give privacy a seat at the top table by adding it to the agenda of board meetings, and appoint a key person to take ownership of driving progress forward. Someone needs to be able to look ahead and ask, what are the implications of what we’re building – in the short, medium and long term? It’s always better to build well in the first instance, rather than try to mend the dam after it’s sprung a leak. In fact, building in privacy from the start – data protection by design – is a legal requirement under the UK GDPR.

    Commit to developing a culture of continuous privacy compliance in the long term

    Privacy isn’t a tick-box exercise that’s over before it’s begun. It’s an ongoing effort that will become part of your startup’s culture. Getting privacy right in the early days means customer data will be kept safe and treated with the respect it deserves as the business grows and adapts. That boosts innovation – when employees know exactly what they can and can’t do with data, they feel empowered to act. It builds your reputation as an ethical company, among customers and your future talent pool. And it puts you in the best place to expand into new markets or services, and scale faster than your competitors.
    Keen to make sure you’re compliant? Take your free 10-minute GDPR health check here.

  • North Korea's Kim Jong-un to attend Beijing military parade with Putin

    It will be the first time that Kim has appeared at a multilateral event since assuming power in 2011.

    North Korean ruler Kim Jong-un will attend his first multilateral event since taking power in 2011 when he appears beside international leaders including Russian President Vladimir Putin at a military parade in Beijing next week.
    China’s foreign ministry confirmed on Thursday that Kim would travel to the Chinese capital for celebrations on 3 September to mark the 80th anniversary of Japan’s surrender at the end of World War II.

    “We warmly welcome General Secretary Kim Jong-un to China to attend the commemorative events,” said Hong Lei, China’s assistant minister of foreign affairs.
    KCNA, North Korea’s state news agency, also said that Kim would be in attendance at the “Victory Day” parade, which is set to include a procession of tens of thousands of Chinese soldiers and a display of the country’s latest weaponry.
    Although Kim has held bilateral meetings with world leaders including US President Donald Trump since assuming power in 2011, he has never taken part in multilateral proceedings before.
    In total, 26 foreign leaders are expected at the event in Beijing, including Cuban President Miguel Díaz-Canel, Indonesian President Prabowo Subianto and Myanmar’s military ruler Min Aung Hlaing.
    Given the EU’s opposition to Putin’s war in Ukraine, there will be little European representation at the parade.

    Slovak Prime Minister Robert Fico, who unlike most of his European peers has maintained a relationship with Russia, will be the only leader from the bloc in attendance.
    Kim’s trip to Beijing will mark his first in-person meeting with Chinese President Xi Jinping since June 2019, when the pair met in Pyongyang. Earlier that year, the North Korean leader travelled to China for talks.

    China is one of Pyongyang’s principal allies, with North Korea relying on Beijing for the majority of its external trade.

    The isolated country’s other chief ally is Russia. Since the start of Moscow’s all-out war in Ukraine, relations between the Kremlin and Pyongyang have deepened, with the pair reviving a Cold War defence treaty last June.
    North Korea has sent thousands of its troops to Russia’s Kursk province to fight off a Ukrainian incursion there. In exchange, experts believe Moscow is helping its ally with food, oil and weapons technology.
    Leif-Eric Easley, professor of international studies at Ewha Womans University in Seoul, said that North Korean ties with China have been affected by the move.
    “Pyongyang’s illicit cooperation with Moscow has strained ties with Beijing, even as China’s political and economic support remains vital for the North Korean regime,” Easley said.

  • EU’s AI Act Fails to Safeguard Artists’ Copyright, Creatives Claim

    AI Act Hits the Real World – Creative Minds Hold Their Breath

    As the new AI Act makes its grand entrance this weekend, the creative community is holding its collective breath. They’re hoping the fresh legislation will finally put a stop to the wild and wacky ways AI folks have been training on their copyrighted content.

    What’s the Big Deal?

    Picture this: an AI system whacking through millions of songs, movies, and paintings to learn a few tricks and then, boom! It spits out a brand‑new track or a masterpiece that looks eerily similar to a famed artist’s work. The creative folks are up in arms.

    The Roadblocks Still Standing

    • Copyright & Consent Jumbles: The law isn’t crystal clear yet on what counts as “fair use” when it comes to training data.
    • Informed Consent Snafu: Companies aren’t always sure if the content they use has the green light from rights holders.
    • Licensing Labyrinth: Securing solid licenses for all that content is a maze that’s not exactly beginner-friendly.
    • Enforcement Woes: Even if the rules are set, keeping tabs on every AI model worldwide is a tall order.

    Why It Matters to Creators

    When AI copies a slice of a copyrighted work, the original creator gets less recognition and potentially less revenue. That’s not just a legal issue—it’s a compassion issue. Everyone wants their creativity to be respected and rewarded.

    Bottom Line: A Call for Clear Rules

    While the AI Act is a good step forward, the creative circle still craves bullet‑proof protections. They’re pushing for laws that give them a voice, a say, and a fair paycheck when their art is used in the training of AI.

    Artists Raising the Alarm: The AI Act Still Leaves a Few Ticks Unchecked

    What the AI Act is Trying to Do

    With the European Artificial Intelligence Act finally kicking off, lawmakers aimed for a sweeping safety net that could stand against a world of AI that’s growing faster than you can say “algorithm.”

    Why the Creative Community Is Still Frowning

    • Loopholes everywhere – musicians, writers, filmmakers, and visual artists feel they’re walking through a minefield.
    • No opt‑out or pay‑out guarantees – the act doesn’t make it easy for creators to say “No thanks” or to get a paycheck when their work gets fed into AI training models.
    • Opaque use of art – AI models that learn from music, books, and movies can do so without clear transparency or consent. That’s a recipe for creative infringement.

    Voices from the Front Lines

    Marc du Moulin, the Secretary‑General of the European Composer and Songwriter Alliance (ECSA), sums it up: “The work of our members should not be used without transparency, consent, and remuneration. We see that the implementation of the AI Act does not give us that.”

    What Happens If the Act Doesn’t Get Fixed?

    Without a solid framework, artists risk having their creative output turned into training data for generative AI without ever seeing a royalty check. The art world might become a playground for AI, but at what cost to the creators who built the playground in the first place?

    In Plain English

    Simply put, the current AI Act is a promising start, but it still misses a few targets. Artists demand clearer rules that protect their art, let them opt out when they want to, and ensure they’re paid when their work is used to train robots. Until those gaps are closed, the creative community will keep pushing for a more concrete solution.

    ‘Putting the cart before the horse’

    EU’s AI Act: A New Playbook for Safer, Fairer, and More “Vanilla” Tech

    The European Commission has drafted the AI Act to keep artificial intelligence safe, transparent, traceable, non‑discriminatory, and environmentally friendly. Think of it as the EU’s way of putting a stern hand on the wild rabbit of rapid tech growth.

    Risk Levels: From “Practice‑worthy” to “Wait, That’s a Bad Idea!”

    • Minimal risk – Most chatbots and image generators fall here. But even if you’re in the low‑risk lane, you still need to publish a quick rundown of the copyrighted data you used to train your AI.
    • Limited risk – A step up, needing more stringent safeguards.
    • High risk – Technologies that could influence people’s decisions or safety (think election bots).
    • Unacceptable risk – Already banned. Examples: manipulative AIs or those that do social scoring—ranking people by their behavior or economic status.

    Why the “Minimal” Category Isn’t a Free Pass

    Even if your AI is deemed minimal risk, you’re not completely off the hook. The EU still wants you to maintain a “copyright policy”: a promise that you’ll respect creators’ rights and have a safe space for complaints. If your tech loves the public domain, that’s great; just make sure those creators can opt out.

    The Copyright Conundrum

    Under EU law, companies can harvest text and data for AI training unless a creator has “reserved their rights.” But how can an artist actually say, “I’m not giving my art to the AI crowd?” Du Moulin, an expert on the matter, says the process is murky:

    “This whole conversation is putting the cart before the horse. You don’t know how to opt out, but your work is already being used.”

    So if you’re an artist concerned about your works ending up in a GPT‑trained dataset, you might feel a bit “legally blindsided.”

    Voluntary Code of Practice: A Gentle Nugget of Self‑Governance

    The EU introduced the AI Code of Practice for General‑Purpose AI (GPAI), a voluntary yet pretty heavy‑handed agreement. Its key points:

    • Commit to a well‑drafted copyright policy.
    • Install safeguards to avoid rights violations.
    • Set up a dedicated area to receive and process complaints.

    Who’s Signing Up?

    So far, the signatories include tech giants like Amazon, Google, Microsoft, and even OpenAI. It turns out that even the biggest players find it cheaper to cooperate than to fight the regulator every time a new AI comes online.

    How the AI Act’s New Transparency Rules Leave Artists Feeling Left Behind

    Under the AI Act, there’s now a “who‑scraped‑my‑work” tracker that promises to give artists a better idea of where their art ends up. But for artists actually hoping for a slice of royalties, the reality hits harder than a stale baguette.

    What the Act Brings to the Table

    “With these new transparency requirements, artists can finally see whose turn it was to shout ‘mine!’,” du Moulin says. He concedes, however, that the Act is a future‑only safeguard. In plain English: whatever got captured before the law took effect is still free for anyone to feed into an AI model.

    Artists’ Struggle to Get Paid

    • Adriana Moscono, the general manager at GESAC, tried to get a license by sending letters and emails to the big names in AI. The outcome? A polite shrug or, worse, complete silence.
    • “There was no answer,” Moscono told Euronews Next. “The big guns not only shut the door on copyright recognition but also ignored our pleas for fair compensation.”

    In short, many artists feel like they’ve been given a free lunch that’s hard to crunch.

    Response from the European Commission

    • Thomas Regnier, spokesperson for the Commission, weighed in. He noted that AI firms must respect rights holders when they pull data for training. If an infringement happened, it can be settled privately.
    • Regnier added, “The AI Act does not override existing EU copyright laws.”

    Why the Rulebook Isn’t Enough

    Despite the transparency push, artists find themselves at the mercy of AI companies that claim the market is still a “ground zero.” By avoiding the back‑door licensing route, they’re forced into the grim reality that many AI models have a history of using art without asking.

    Looking Ahead

    As the AI Act rolls out, artists remain hopeful that the future can bring retrievable rights and actual payments. But for now, the transparency tools might be a fancy badge rather than a tangible benefit.

    Mandate licence negotiations, groups ask

    When Robots Take the Stage: A Copyright Showdown

    Picture this: a bunch of music‑law heavy‑weights—Du Moulin and Ms. Moscono—stamp their boots on the Commission’s doorstep, demanding a clearer play‑by‑play on how artists can opt out of the AI frenzy and get the copyright protection they deserve.

    “We’re Not Getting a Ticket to the AI‑Act Fair Anyway!”

    “The code of practice, the template and the guidelines, they don’t give us even a map,” Moscono told reporters, sounding as if the AI dance floor were bumping against a wall that never lets anyone see the exit. “They don’t guarantee a proper application of the AI Act,” she added.

    Collective Licenses: A New Strategy?

    • Option A: The Commission could obligate AI firms to negotiate blanket licenses with a coalition of artists.
    • Option B: They might require “collective” agreements—imagine a concert where every guitarist signs on at once.

    GEMA Goes Full‑Berserker on OpenAI and Suno AI

    While the music‑rights giant GEMA’s two lawsuits against OpenAI (the squad behind ChatGPT) and Suno AI (the app that cooks up tunes out of thin air) are technically “outside the AI Act,” Du Moulin insists the verdict will set a precedent for how hard AI companies can be pushed by copyright laws.

    EU’s Compliance Deadline Clock

    New AI outfits have a ticking clock: by 2026 they must be compliant with the AI Act’s regulations. Already active in the EU? They get an extra year, so until 2027.

    But What About the Text & Data Mining Clause?

    The Commission and the EU’s high court—the European Court of Justice—have already hinted that they’re going to take a fresh look at the 2019 copyright law’s text and data mining exemption. This could either loosen or tighten the rules for AI nestled in the data‑driven ecosystem.

    Take‑away: Artists, the Commission, and AI companies are in a tug‑of‑war—only the legal lasso will decide who holds the winner’s trophy.