Legal Archive

“TCF” cookie consent popups violate GDPR; OSNews wants to stop using cookie popups too once we get enough Patreons

You may not have heard of the “Transparency & Consent Framework”, but you’ve most likely interacted with it, probably on a daily basis. The TCF is used by 80% of the internet to obtain “consent” from users to collect their data and share it among advertisers – you know, the cookie popups. In a landmark EU ruling yesterday, the TCF was declared to violate the GDPR, making it illegal.

For seven years, the tracking industry has used the TCF as a legal cover for Real-Time Bidding (RTB), the vast advertising auction system that operates behind the scenes on websites and apps. RTB tracks what Internet users look at and where they go in the real world. It then continuously broadcasts this data to a host of companies, enabling them to keep dossiers on every Internet user. Because there is no security in the RTB system it is impossible to know what then happens to the data. As a result, it is also impossible to provide the necessary information that must accompany a consent request.
↫ Irish Council for Civil Liberties

It’s no secret that cookie consent popups do not actually comply with the GDPR, and that they are not even necessary if you simply don’t do any cross-site sharing of personal information. It seems that this ruling confirms this in a legal sense, forcing the advertising industry to come up with a new, better system. On top of that, every individual company that participated in this scheme is now liable for fines and damages. Complaints coordinated by Johnny Ryan, Director of Enforce at the Irish Council for Civil Liberties, prompted the ruling. He said:

Today’s court’s decision shows that the consent system used by Google, Amazon, X, Microsoft, deceives hundreds of millions of Europeans. The tech industry has sought to hide its vast data breach behind sham consent popups. Tech companies turned the GDPR into a daily nuisance rather than a shield for people.
↫ Irish Council for Civil Liberties

The problem here is not so much the clarity of applicable laws and regulations, but the cost and effectiveness of enforcement. If it takes years of expensive and complex legal proceedings to bring a company that violates the GDPR to heel, is it really an effective legal framework? Especially when you take into account just how many companies, big and small, there are that violate the GDPR?

OSNews uses a cookie popup and displays advertising, something we have to do to gain a little bit of extra income – but I’m not happy about it. Our ads don’t provide us with much income, perhaps about €150-200, but that’s still a decent enough chunk of our income pie that we need it. I would greatly prefer we turn off these ads altogether, but in order to be able to afford that, we’d need to up our Patreon income. OSNews Patreons get an ad-free version of OSNews. That’s a long and slow process, especially with the current economic uncertainty making people reconsider their expenses. Disabling our ads altogether for everyone once we’re fully reader-funded is still my end goal, but until the world around us settles down a bit, that’s a little while off. If you want to speed this process up – you can become an OSNews Patreon and enjoy an ad-free OSNews today.

Silicon Valley developers need to unionise

I don’t know anything about hiring processes in Silicon Valley, or about hiring processes in general since I’ve always worked for myself (and still do, running OSNews, relying on your generous Patreon and Ko-Fi support), so when I ran into this horror story of applying for a position at a Silicon Valley startup, I was horrified. Apparently it’s not unheard of – it might even be common? – to ask applicants for a coding position to develop a complex application, for free, without much guidance beyond some vague, generic instructions.

In this case, the applicant, Jose Vargas, was applying for a position at Kagi, the search startup with the, shall we say, somewhat evangelical fanbase. After applying, he was asked to develop a complete e-mail client – either as a TUI/CLI or a web application – that can view and send e-mails using a fake or a real backend, and that can display at least plaintext e-mails. None of this was going to be paid labour, of course. Vargas started out by sending in a detailed proposal of what he was planning to create, ending with the obvious question of what kind of response he’d get if he actually implemented the detailed proposal. He got a generic response in return, without an answer to that question, but he set out to work regardless. In the end, it took him about a week to complete the project and send it in. He eventually received a canned rejection notice in response, and after asking for clarification the hiring manager told him they wanted something “simpler and stronger”, so he didn’t make the cut.

I’m not interested in debating whether or not Vargas was suited for the position, or if the unpaid work he sent in was any good. What I do want to talk about, though, is the insane amount of unpaid labour applicants are apparently asked to do in Silicon Valley, the utter lack of clear and detailed instructions, and how the hiring manager didn’t answer the question Vargas sent in alongside his detailed proposal. After all, the hiring manager could’ve saved everyone a ton of time by letting Vargas know upfront the proposal wasn’t what Kagi was looking for. Everything about this feels completely asinine to me.

As a (former) translator, I’m no stranger to having to do some work to give a potential client an idea of what my work looks like, but more than half a page of text to translate was incredibly rare. Only on a few rare occasions did a prospective client want me to translate more than that, and in those cases it was always paid labour, at the normal, regular rate. For context, half a page of text is less than half an hour of work – a far cry from a week’s worth of unpaid labour.

I’ve read a ton of online discourse about this particular story, and there’s no clear consensus on whether or not Vargas’ feelings are justified. Personally, I find the instructions given by Kagi overly broad and vague, the task of creating an e-mail client overly demanding, and the canned (“AI”?) responses by the hiring manager insulting – after sending in such a detailed proposal, it should’ve been easy for a halfway decent hiring manager to realise Vargas might not be a good fit for the role, and tell him so before he started doing any work. Kagi is fully within its right to determine who is and is not a good fit for the company, and who they hire is entirely up to them. If such stringent, demanding hiring practices are par for the course in Silicon Valley, I also can’t really fault them for toeing the industry line.
The hiring manager’s behaviour seems problematic, but everyone makes mistakes and nobody’s perfect. In short, I’m not even really mad at Kagi specifically here. However, if such hiring practices are indeed the norm, can I, as an outsider, just state the obvious? What on earth are you people doing to each other over there in Silicon Valley? Is this really how you want to treat potential applicants, and how you, yourself, want to be treated? Imagine if someone applies to be a retail clerk at a local supermarket, and the supermarket’s hiring manager asks the applicant to work an entire week in the store, stocking shelves and helping shoppers, without paying the person any wages, only to deny their application after the week of free labour is over. You all realise how insane that sounds, right?

Why not look at a person’s previous work, hosted on GitHub or any of its alternatives? Why not contact their previous employers and ask about their performance there, as happens in so many other industries? Why, instead of asking someone to craft an entire e-mail client, don’t you just give them a few interesting bugs to look at that won’t take an entire week of work? Why not, you know, pay for their labour if you demand a week’s worth of work? I’m so utterly baffled by all of this. Y’all developers need a union.

EU fines TikTok a token amount of €530 million for gross privacy violations

A European Union privacy watchdog fined TikTok 530 million euros ($600 million) on Friday after a four-year investigation found that the video sharing app’s data transfers to China put users at risk of spying, in breach of strict EU data privacy rules. Ireland’s Data Protection Commission also sanctioned TikTok for not being transparent with users about where their personal data was being sent and ordered the company to comply with the rules within six months.
↫ Kelvin Chan for AP News

In case you’re wondering what Ireland’s specific role in this case is, TikTok’s European headquarters are located in Ireland, which means that any EU-wide privacy violations by TikTok are handled by Ireland’s privacy watchdog. Anyway, sounds like a big fine, right? Let’s do some math. TikTok’s global revenue last year is estimated at €20 billion. This means that a €530 million fine is 2.65% of TikTok’s global yearly revenue. Now let’s make this more relatable for us normal people. The yearly median income in Sweden is €34,365 (pre-tax), which means that if a median-income Swede had to pay a fine with the same impact as the TikTok fine, they’d have to pay €910. That’s how utterly bullshit this fine is. €910 isn’t nothing if you make €34,000 per year, but would you call this a true punishment for TikTok?

Any time you read about any of these corporate fines, you should do math like this to get an idea of what the true impact of the fine really amounts to. You’ll be surprised to learn just how utterly toothless they are.
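If you want to repeat that comparison for other fines, here is a minimal sketch of the back-of-the-envelope arithmetic above. The figures are simply the estimates quoted in this post, not authoritative numbers, and you can swap in whatever fine, revenue, and income you’re curious about.

```python
# Back-of-the-envelope "how big is this fine, really?" calculator.
# All figures are the estimates quoted in the post above (assumptions, not audited numbers).

fine = 530_000_000                 # EUR, TikTok's GDPR fine
company_revenue = 20_000_000_000   # EUR, estimated global yearly revenue
median_income = 34_365             # EUR, pre-tax yearly median income in Sweden

share_of_revenue = fine / company_revenue                     # 0.0265, i.e. 2.65%
equivalent_personal_fine = share_of_revenue * median_income   # roughly EUR 911; the post rounds to EUR 910

print(f"Fine as share of yearly revenue: {share_of_revenue:.2%}")
print(f"Equivalent fine for a median-income Swede: €{equivalent_personal_fine:.0f}")
```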

Apple fined €500 million by EC, Facebook €200 million

The European Commission has levied fines against both Apple and Facebook for violating the Digital Markets Act. Apple has to pay a €500 million fine, and Facebook a €200 million fine. Apple is breaking EU law by not allowing application developers to inform users of other offers outside the App Store.

The Commission found that Apple fails to comply with this obligation. Due to a number of restrictions imposed by Apple, app developers cannot fully benefit from the advantages of alternative distribution channels outside the App Store. Similarly, consumers cannot fully benefit from alternative and cheaper offers as Apple prevents app developers from directly informing consumers of such offers. The company has failed to demonstrate that these restrictions are objectively necessary and proportionate.
↫ European Commission press release

Not only is Apple ordered to pay the €500 million fine, they also have to remove any and all of the illegal restrictions they put in place. Facebook, meanwhile, was fined for not offering an equally functional service that does not combine user data from different services. The company did offer a choice between paying and not paying – whereby the latter involved data collection and combination – but this model violated the DMA.

The Commission found that this model is not compliant with the DMA, as it did not give users the required specific choice to opt for a service that uses less of their personal data but is otherwise equivalent to the ‘personalised ads’ service. Meta’s model also did not allow users to exercise their right to freely consent to the combination of their personal data.
↫ European Commission press release

Facebook did later amend their model to make it compliant with the DMA, and so the fine only covers the few months Facebook was violating EU law. Fun additional note: the EC also mentions that the Facebook Marketplace is no longer a gatekeeper service under the DMA, since its user numbers have dropped below the threshold. Facebook seems to be having some engagement issues in Europe, and you love to hear it. Both companies are required to pay and comply within 60 days, or further periodic penalty payments will be levied.

Google must crack open Android for third-party stores, rules Epic judge

Late last year, Google’s Play Store was ruled to be a monopoly in the US, and today the judge in that case has set out what Google must do to address this situation.

Today, Judge James Donato issued his final ruling in Epic v. Google, ordering Google to effectively open up the Google Play app store to competition for three whole years. Google will have to distribute rival third-party app stores within Google Play, and it must give rival third-party app stores access to the full catalog of Google Play apps, unless developers opt out individually.
↫ Sean Hollister at The Verge

On top of these rather big changes, Google also cannot mandate the use of Google’s own billing solution, nor can it prohibit developers from informing users of other ways to download and/or pay for an application. Furthermore, Google can’t make sweetheart deals with device makers to entice them to install the Play Store or to block them from installing other stores, and Google can’t pay developers to only use the Play Store or not use other stores. It’s a rather comprehensive set of remedies that will remain in force for three years. Many of these remedies are taken straight from the European Union’s Digital Markets Act, but they will be far less effective since they’re only applied to one company, and only for three years. On top of that, Google can appeal, and the company has already stated that it’s going to ask for an immediate stay on these remedies, and if it gets that stay, the remedies won’t have to be implemented any time soon.

This legal tussling is far from over, and does very little to protect consumer choice. A clear law that simply prohibits this kind of market abuse, like the DMA, is much fairer to everyone involved, and creates a consistent level playing field for everyone, instead of only affecting random companies based on the whims of something as unpredictable as juries. In other words, I don’t think much is going to change in the United States after this ruling, and we’ll likely be hearing more back and forths in the court room for years to come, all while US consumers are being harmed. It’s better than nothing in the absence of a working Congress actually doing, well, anything, but that’s not saying much.

California’s new law forces digital stores to admit you’re just licensing content, not buying it

California Governor Gavin Newsom has signed a law (AB 2426) to combat “disappearing” purchases of digital games, movies, music, and ebooks. The legislation will force digital storefronts to tell customers they’re just getting a license to use the digital media, rather than suggesting they actually own it. When the law comes into effect next year, it will ban digital storefronts from using terms like “buy” or “purchase,” unless they inform customers that they’re not getting unrestricted access to whatever they’re buying. Storefronts will have to tell customers they’re getting a license that can be revoked as well as provide a list of all the restrictions that come along with it. Companies that break the rule could be fined for false advertising.
↫ Emma Roth at The Verge

A step in the right direction, but a lot more is definitely needed. This law in particular seems to leave a lot of wiggle room for companies to keep using the “purchase” term while hiding the disclosure somewhere in the very, very small fine print. I would much rather a law like this just straight up ban the use of the term “purchase” and similar terms when all you’re getting is a license. Why allow them to keep lying about the nature of the transaction in exchange for some fine print somewhere?

The software industry in particular has been enjoying a free ride when it comes to consumer protection laws, and the kind of malpractice, lack of accountability, and laughable quality control it gets away with would have any other industry shut down in weeks for severe negligence. We’re taking baby steps, but it seems we’re finally arriving at a point where basic consumer protection laws and rights are being applied to software, too. Several decades too late, but at least it’s something.

The Internet Archive just lost its appeal over ebook lending

The Internet Archive has lost its appeal in a fight to lend out scanned ebooks without the approval of publishers. In a decision on Wednesday, the Second Circuit Court of Appeals ruled that permitting the Internet Archive’s digital library would “allow for widescale copying that deprives creators of compensation and diminishes the incentive to produce new works.” The decision is another blow to the nonprofit in the Hachette v. Internet Archive case. In 2020, four major publishers — Hachette, Penguin Random House, Wiley, and HarperCollins — sued the Internet Archive over claims its digital library constitutes “willful digital piracy on an industrial scale.”
↫ Emma Roth

If you’re a library and scan books and offer a lending service, you’re committing “willful digital piracy on an industrial scale”. If you scan the entire goddamn internet without any regard for licensing or copyright and regurgitate chunks of it on command, you’re a visionary, a revolutionary, a genius. Make it make sense.

Apple helped nix part of a child safety bill. More fights are expected.

Kim Carver, a legislator in the US state of Louisiana, added a provision to a child safety bill forcing Apple and Google to enforce age restrictions on downloads in their application stores. In other words, it would force Apple to make sure minors could not download gambling and casino applications – i.e., 99% of mobile games – that make up the vast majority of Apple’s services revenue. It would also make application stores play a role in enforcing age restrictions on social media applications, which makes sense because Apple and Google know the age of every one of their users.

Well, it turns out Apple was not happy. They sent out an absolute army of lobbyists – including a guy known for lobbying on behalf of truck-stop casinos, in case you were wondering about the type of people Apple uses for lobbying – to kill this specific provision. Carver’s provision would have breezed through the Louisiana senate, but it needed a key committee approval before being put up for a vote. And it’s this committee that Apple started heavily influencing and pressuring.

Carver began hearing rumblings that Apple was making inroads with the committee—his amended bill might be in trouble. Uncertain on how to proceed, he approached the chairwoman of the committee, Sen. Beth Mizell, for advice. He declined to describe the substance of the conversation to The Wall Street Journal, but in the end, he promised not to object if she removed the app store provisions or support restoring them on the Senate floor. “I made the choice to take the win that we could get,” Carver said.
↫ Jeff Horwitz and Aaron Tilley at The Wall Street Journal

This is not the first time Apple has pressured legislatures to drop bills it didn’t like. A famous case is the state of Georgia, which intended to pass a number of application store bills to open up the App Store in much the same way the European Union did with the DMA. Apple went absolutely mental in Georgia, including threatening to cancel “a $25 million investment in a historically Black college in Atlanta”. Apple won.

The way these sleazebag companies get away with such blatant corruption is by using third-party lobbyists, which technically are not employed by the companies in question, so no matter how low and sleazy these lobbyists go, the companies they lobby for can wash their hands in innocence and absolve themselves of any responsibility for the various financial and legal threats levied at underfunded, understaffed local legislatures. Spending a few million on a local development project or whatever is peanuts for Apple, but a massive boon for a small community somewhere, so pulling out means nothing to Apple, but would massively affect such a community. It’s not surprising local legislatures fold.

Circling back to the age restriction provision itself – telling stores what they can and cannot sell is an entirely normal thing to do, and happens all the time all over the world. It’s why in, say, The Netherlands, supermarkets are only allowed to sell “light” alcohol like beer and wine, with hard alcohol moved to separate liquor stores that have to be separate from the supermarket, so age restrictions are easier to enforce. There’s also just an infinite number of things you’re just not allowed to sell, period. As always, Silicon Valley believes it’s a very special snowflake to whom regular, normal, widely accepted rules do not apply.
Why shouldn’t a store selling gambling applications and similarly addictive and damaging applications have to do the absolute bare minimum to protect minors? Imagine the massive outcry if a Costco or Walmart were found to sell large amounts of hard liquor to children – why should Silicon Valley companies be treated any differently?

Here are 22 examples of Google employees trying to avoid creating evidence for court

In its antitrust case against Google, the Federal Government filed a list of chats it had obtained that show Google employees explicitly asking each other to turn off a chat history feature to discuss sensitive subjects, showing repeatedly that Google workers understood they should try to avoid creating a paper trail of some of their activities. The filing came following a hearing in which judge Leonie Brinkema ripped Google for “destroyed” evidence while considering a filing from the Department of Justice asking the court to find “adverse inference” against Google, which would allow the court to assume it purposefully destroyed evidence. Previous filings, including in the Epic Games v Google lawsuit and this current antitrust case, have also shown Google employees purposefully turning history off.
↫ Seamus Hughes

The fact that corporations break the law, and lie, cheat, and scam their way to the top is not something particularly shocking, nor will it surprise anyone. I can barely even get angry about it anymore – birds gotta eat, fish gotta swim, corpos gotta break the law, that sort of thing. It’s just an inevitability of reality, a law of nature. You know it, I know it, the whole world knows it.

No, what really upsets me is just how easily they get away with it, and even if they do get punished, any fines or other forms of punishment are so utterly disproportionately mild compared to the crimes committed. It’s incredibly rare for anyone responsible for corporate crime to ever face any serious punishment, let alone jail time, and even in the rare cases where they do, they usually have some stock options or whatever left over from their employment contract that will ensure a lavishly wealthy lifestyle. Fines levied against corporations as a whole are usually so low they’re just a minor cost of doing business, to the point where one has to wonder why they’re even being levied at all.

Compare this to us normal folks, and the differences couldn’t be more stark. Whenever we’re accidentally late on some small bill, we get fined automatically, with very little recourse. We get a speeding ticket automatically in the mail because we drove 5 km/h over the speed limit. Our tax agencies are stupidly effective and efficient at screwing you over for that small side hustle selling crap on eBay. And rarely do we have any effective, efficient recourse. And these things can quickly spiral out of control when you’re already living paycheck to paycheck – being poor is really, really expensive. And let’s not even get into how much worse any of this is if you’re part of a minority, like being black in the US, or of North-African descent in Europe.

In this case, the illegal activities of Google and its executives and employees are on such clear display, and yet, few, if any, will suffer any consequences for them. If you ever wonder why so many regular people flock to political extremes, it’s exactly this kind of deep unfairness and inequality that lies at its roots. It’s dispiriting, demoralising, and disheartening, and primes the pump for disenchantment with society, and thus the search for alternatives, upon which extremists prey. We either stop our continual slide into corporatism, or our societies will fall.

Popular AI “nudify” sites sued amid shocking rise in victims globally

San Francisco’s city attorney David Chiu is suing to shut down 16 of the most popular websites and apps allowing users to “nudify” or “undress” photos of mostly women and girls who have been increasingly harassed and exploited by bad actors online. These sites, Chiu’s suit claimed, are “intentionally” designed to “create fake, nude images of women and girls without their consent,” boasting that any users can upload any photo to “see anyone naked” by using tech that realistically swaps the faces of real victims onto AI-generated explicit images.
↫ Ashley Belanger at Ars Technica

This is an incredibly uncomfortable topic to talk about, but with the advent of ML and AI making it so incredibly easy to do this, it’s only going to get more popular. The ease with which you can generate a fake nude image of someone is completely and utterly out of whack with the permanent damage it can do to the person involved – infinitely so when it involves minors, of course – and with these technologies getting better by the day, it’s only going to get worse.

So, how do you deal with this? I have no idea. I don’t think anyone has any idea. I’m pretty sure all of us would like to just have a magic ban button to remove this filth from the web, but we know such buttons don’t exist, and trying to blast this nonsense out of existence is a game of digital whack-a-mole where there are millions of moles and only one tiny hammer that explodes after one use. It’s just not going to work. The best we can hope for is to get a few of the people responsible behind bars to send a message and create some deterrent effect, but how much that would help is debatable, at best.

As a side note, I don’t want to hang this on AI and ML alone. People – men – were doing this to other people – women – even before the current crop of AI and ML tools, using Photoshop and similar tools, but of course it takes a lot more work to do it manually. I don’t think we should focus too much on the role ML and AI plays, and focus more on finding real solutions – no matter how hard, or impossible, that’s going to be.

European Commission shoots down Facebook’s “pay or consent” model

The European Union’s Digital Markets Act is the gift that keeps on giving. This time, it’s Facebook’s turn to be slapped on the fingers with a ruler – a metric ruler, of course – because of its malicious compliance with the DMA.

Today, the Commission has informed Meta of its preliminary findings that its “pay or consent” advertising model fails to comply with the Digital Markets Act (DMA). In the Commission’s preliminary view, this binary choice forces users to consent to the combination of their personal data and fails to provide them a less personalised but equivalent version of Meta’s social networks.
↫ European Commission press release

The European Commission’s preliminary conclusion takes issue with Facebook’s binary choice between “pay for zero ads” and “full-on tracking and all the ads”. According to the DMA, Facebook must offer users the option of an equivalent experience with less tracking, and the company doesn’t offer such an option to users. In addition, Facebook’s proposal does not allow users to “exercise their right to freely consent to the combination of their personal data”.

It’s important to note that this is not some sort of definitive ruling or finding; it’s preliminary, and Facebook now has the opportunity to state its case and formulate its arguments. If the eventual ruling is that Facebook does not comply, the company is liable for fines of up to 10% of its yearly worldwide turnover, which can rise up to 20% for repeated infractions.

Adobe’s hidden cancellation fee is unlawful, FTC suit says

To lock subscribers into recurring monthly payments, Adobe would typically pre-select by default its most popular “annual paid monthly” plan, the FTC alleged. That subscription option locked users into an annual plan despite paying month to month. If they canceled after a two-week period, they’d owe Adobe an early termination fee (ETF) that costs 50 percent of their remaining annual subscription. The “material terms” of this fee are hidden during enrollment, the FTC claimed, only appearing in “disclosures that are designed to go unnoticed and that most consumers never see.”
↫ Ashley Belanger at Ars Technica

There’s a sucker for every corporation, but I highly doubt there’s anyone out there who would consider this a fair business practice. This is so obviously designed to hide costs during sign-up, and then unveil them when the user considers quitting. If this is deemed legal or allowed, you can expect everyone to jump on this bandwagon to scam users out of their money.

It goes further than this, though. According to the FTC, Adobe knew this practice was shady, but continued it anyway because altering it would negatively affect the bottom line. The FTC is actually targeting two Adobe executives directly, which is always nice to hear – it’s usually management that pushes such illegal practices through, leaving the lower ranks little choice but to comply or lose their job. Stuff like this is exactly why confidence in the major technology companies is at an all-time low.
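For a sense of how that fee structure plays out, here is a small, purely hypothetical sketch of the “50 percent of the remaining annual subscription” maths described in the FTC’s allegation. The monthly price and cancellation month are made-up example values, not Adobe’s actual pricing.

```python
# Hypothetical illustration of an "annual, paid monthly" early termination fee:
# 50% of whatever is still owed on the annual commitment at cancellation time.
# The price below is an invented example, not Adobe's real pricing.

MONTHLY_PRICE = 60.00   # example monthly price in USD (assumption)
PLAN_LENGTH = 12        # months in the annual commitment

def early_termination_fee(months_paid: int) -> float:
    """Fee owed when cancelling after `months_paid` months: half the remaining balance."""
    months_remaining = max(PLAN_LENGTH - months_paid, 0)
    return 0.5 * months_remaining * MONTHLY_PRICE

# Cancelling three months in still costs half of the nine remaining payments:
print(early_termination_fee(3))   # 270.0
```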

Meta halts plans to train machine learning on Facebook, Instagram posts in EU

It seems that if you want to steer clear of having Facebook use your Facebook, WhatsApp, Instagram, etc. data for machine learning training, you might want to consider moving to the European Union.

Meta has apparently paused plans to process mounds of user data to bring new AI experiences to Europe. The decision comes after data regulators rebuffed the tech giant’s claims that it had “legitimate interests” in processing European Union- and European Economic Area (EEA)-based Facebook and Instagram users’ data—including personal posts and pictures—to train future AI tools.
↫ Ashley Belanger

These are just the opening salvos of the legal war that’s brewing here, so who knows how it’s going to turn out. For now, though, European Union Facebook users are safe from Facebook’s machine learning training.

Apple set to be first big tech group to face charges under EU digital law

Brussels is set to charge Apple over allegedly stifling competition on its mobile app store, the first time EU regulators have used new digital rules to target a Big Tech group. The European Commission has determined that the iPhone maker is not complying with obligations to allow app developers to “steer” users to offers outside its App Store without imposing fees on them, according to three people with close knowledge of its investigation.
↫ Javier Espinoza and Michael Acton

This was always going to happen for as long as Apple’s malicious compliance kept dragging on. The rules in the Digital Markets Act are quite clear and simple, and despite the kind of close cooperation with EU lawmakers no normal EU citizen is ever going to get, Apple has been breaking this law from day one without any intent to comply. European Union regulators have given Apple far, far more leeway and assistance than any regular citizen or small business would get, and that has to stop.

The possible fines under the DMA are massive. If Apple is found guilty, it could be fined up to 10% of its global revenue, or 20% for repeated violations. This is no laughing matter, and this is not one of those cases where a company like Apple could calculate fines as a mere cost of doing business – this would have a material impact on the company’s numbers, and shareholders are definitely not going to like it if Apple gets fined such percentages. As these are preliminary findings, Apple could still implement changes, but if past behaviour is any indication, any possible changes will just be ever more malicious compliance.

Arm, Qualcomm legal battle seen disrupting AI-powered PC wave

The new Windows on ARM Copilot+ PC thing, running on Qualcomm’s Snapdragon X Elite and Pro chips, isn’t even out the door yet, and we’re already dealing with legal proceedings.

But the main conversation among conference attendees was over how a contract dispute between Arm Holdings and Qualcomm, which work together to make the chips powering these new laptops, could abruptly halt the shipment of new PCs that industry leaders expect will make Microsoft and its partners billions of dollars.
↫ Max A. Cherney at Reuters

The basic gist of the story is as follows. Qualcomm acquired a company named Nuvia, founded by former Apple processor engineers, in order to gain new technology to build its Snapdragon X Elite and Pro chips. Nuvia was planning on developing ARM chips for servers, but after the acquisition, Qualcomm changed their plans and repurposed their technology for use in laptops – the new X chips. ARM claims that Nuvia was only granted a license for server use, and not laptop use. Qualcomm, meanwhile, argued that it has a broad license to use ARM for pretty much anything, and as such, that any possible restrictions Nuvia had are irrelevant.

While this all sounds like very rich corporations having a silly legal slapfight, it could have real consequences. If the legal case goes very, very wrong for Qualcomm, it could halt the sale of devices powered by the Snapdragon X chips well before they’re even shipping. I doubt it’ll get that far – it rarely does, and there’s some big names and big reputations at play here – but it does highlight the absurdity of how the ARM ecosystem works.

Speaking of the ARM ecosystem, Qualcomm isn’t the only ARM chip maker dying to break into the PC market. Qualcomm currently has a weird exclusivity agreement with Microsoft where it’s the only ARM chip supplier for PCs, but that agreement is running out soon. Another player that’s ready to storm this market once that happens is MediaTek, which is also developing a chip geared towards Microsoft’s Copilot+ specifications, with a release target of 2025. Let’s hope MediaTek will be as forthcoming with Linux support as Qualcomm surprisingly has been, but I have my sincerest doubts.

Adobe terms clarified: will never own your work, or use it for AI training

Adobe Creative Cloud users opened their apps yesterday to find that they were forced to agree to new terms, which included some frightening-sounding language. It seemed to suggest Adobe was claiming rights over their work. Worse, there was no way to continue using the apps, to request support to clarify the terms, or even uninstall the apps, without agreeing to the terms.
↫ Ben Lovejoy at 9To5Mac

Of course users were going to revolt. Even without the scary-sounding language, locking people out of their applications unless they agree to new terms is a terrible dark pattern, and something a lot of enterprise customers certainly aren’t going to be particularly happy about. I’ve never worked an office job, so how does stuff like this normally go? I’m assuming employees aren’t allowed to just accept new licensing terms from Adobe or whatever on their office computers?

In response to the backlash, Adobe came out and said in a statement that it does not intend to claim ownership over anyone’s work, and that it’s not going to train its ML models on customers’ work either. The company states that to train its Firefly ML model, it only uses content it has properly licensed for it, as well as public domain content. Assuming Adobe is telling the truth, it seems the company at least understands the concept of consent, which is good news, and a breath of fresh air compared to crooks like OpenAI or GitHub. Content used for training ML models should be properly licensed for it, and consent should be properly obtained from rightsholders, and taking Adobe at their word, it seems that’s exactly what they’re doing.

Regardless, the backlash illustrates once again just how – rightfully – wary people are of machine learning, and how their works might be illegally appropriated to train such models.

US agencies to probe AI dominance of Nvidia, Microsoft, and OpenAI

The US Justice Department and Federal Trade Commission reportedly plan investigations into whether Nvidia, Microsoft, and OpenAI are snuffing out competition in artificial intelligence technology. The agencies struck a deal on how to divide up the investigations, The New York Times reported yesterday. Under this deal, the Justice Department will take the lead role in investigating Nvidia’s behavior while the FTC will take the lead in investigating Microsoft and OpenAI.
↫ Jon Brodkin at Ars Technica

Even if there are no findings of wrongdoing, these kinds of investigations are incredibly important, if only to let the megacorporations know we’ve got our eyes on them. Artificial intelligence is a whole new world of potential monopolistic and other forms of abuse, and I’d like the various competition authorities to be on top of it right from the beginning for once, so we don’t end up with a fait accompli like we have in so many other parts of the technology sector.

EU data protection board says ChatGPT still not meeting data accuracy standards

OpenAI’s efforts to produce less factually false output from its ChatGPT chatbot are not enough to ensure full compliance with European Union data rules, a task force at the EU’s privacy watchdog said. “Although the measures taken in order to comply with the transparency principle are beneficial to avoid misinterpretation of the output of ChatGPT, they are not sufficient to comply with the data accuracy principle,” the task force said in a report released on its website on Friday.
↫ Tassilo Hummel at Reuters

I’m glad at least some authorities are taking the wildly inaccurate nonsense outputs of many “AI” tools seriously. I’m not entirely sure when a tool like ChatGPT can be considered “accurate”, but whatever it is now, it’s not that.

Nintendo issues DMCA takedown notice against over 8,500 Yuzu emulator repositories

The notice was filed on developer platform GitHub, which Nintendo claimed housed repositories that “offer and provide access to the Yuzu emulator or code based on” which “illegally circumvents Nintendo’s technological protection measures and runs illegal copies of Switch games.” GitHub said it contacted the owners of the repositories to provide an “opportunity to make changes” before taking down the repositories, in addition to providing legal resources and information on how to file counter notices.
↫ Sophie McEvoy at GamesIndustry.biz

The legal troubles around Yuzu are a little nebulous to deal with, as there’s a lot of chatter online that Yuzu contains, or at least used, code from leaked Switch SDKs. If that is indeed true – I haven’t seen any definitive proof yet – then it makes Nintendo’s aggressiveness a lot more understandable, even for someone like me who believes emulation should be 100% legal and accessible.

US Senate passes TikTok ban bill

A bill that would force China-based company ByteDance to sell TikTok — or else face a US ban of the platform — is all but certain to become law after the Senate passed a foreign aid package including the measure. It now heads to President Joe Biden, who already committed to signing the TikTok legislation should it make it through both chambers of Congress. The House passed the foreign aid package that includes the TikTok bill on Saturday.
↫ Lauren Feiner at The Verge

I hope the EU follows.