label: stringclasses (2 values)
text: stringlengths (31 to 724k)
__index_level_0__: float64 (5 to 14.2k)
GOOD
AdGuard Home: Network-wide ads and trackers blocking DNS server (github.com/adguardteam)

Free and open source, powerful network-wide ads & trackers blocking DNS server. AdGuard.com | Wiki | Reddit | Twitter | Telegram

AdGuard Home is network-wide software for blocking ads and tracking. After you set it up, it'll cover ALL your home devices, and you don't need any client-side software for that. It operates as a DNS server that re-routes tracking domains to a "black hole", thus preventing your devices from connecting to those servers. It's based on software we use for our public AdGuard DNS servers, and both share a lot of code.

To install with curl, run the following command:
To install with wget, run the following command:
To install with fetch, run the following command:

The script also accepts some options. Note that the -r and -u options are mutually exclusive.

Please read the Getting Started article on our Wiki to learn how to install AdGuard Home manually and how to configure your devices to use it. You can use our official Docker image on Docker Hub. If you're running Linux, there's a secure and easy way to install AdGuard Home: get it from the Snap Store. See our Wiki.

If you want to integrate with AdGuard Home, you can use our REST API. Alternatively, you can use this Python client, which is used to build the AdGuard Home Hass.io add-on.

Running your own AdGuard Home server allows you to do much more than using a public DNS server; it's a completely different level. See for yourself: choose what exactly the server blocks and permits; monitor your network activity; add your own custom filtering rules. Most importantly, it's your own server, and you are the only one who's in control.

At this point, AdGuard Home has a lot in common with Pi-hole. Both block ads and trackers using the so-called "DNS sinkholing" method, and both allow customizing what's blocked. AdGuard Home provides a lot of features out of the box with no need to install and configure additional software. We want it to be simple to the point where even casual users can set it up with minimal effort. Disclaimer: some of the listed features can be added to Pi-hole by installing additional software, or by manually using an SSH terminal and reconfiguring one of the utilities Pi-hole consists of. However, in our opinion, this cannot legitimately be counted as a Pi-hole feature.

Is DNS-level blocking as effective as a traditional ad blocker? It depends. DNS sinkholing is capable of blocking a big percentage of ads, but it lacks the flexibility and the power of traditional ad blockers. You can get a good impression of the difference between these methods by reading this article, which compares AdGuard for Android (a traditional ad blocker) to hosts-level ad blockers (which are almost identical to DNS-based blockers in their capabilities). This level of protection is enough for some users. Additionally, using a DNS-based blocker can help block ads, tracking, and analytics requests on other types of devices, such as smart TVs, smart speakers, and other kinds of IoT devices (on which you can't install traditional ad blockers).
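The "black hole" re-routing described above is simple enough to sketch. Here is a minimal, illustrative Python sketch of the DNS-sinkholing idea (a blocklist lookup answering with an unroutable address), not AdGuard Home's actual resolver, which is written in Go; the domains are hypothetical:

```python
# Illustrative sketch of DNS sinkholing, not AdGuard Home's real implementation.
BLOCKLIST = {"tracker.example.com", "ads.example.net"}  # hypothetical entries

def resolve(domain: str, upstream_lookup) -> str:
    """Answer blocked domains with a sinkhole address, else ask upstream."""
    if domain.lower().rstrip(".") in BLOCKLIST:
        return "0.0.0.0"  # the "black hole": the device connects to nothing
    return upstream_lookup(domain)  # forward everything else as usual

# Usage with a stubbed upstream resolver:
print(resolve("tracker.example.com", lambda d: "203.0.113.7"))  # -> 0.0.0.0
print(resolve("news.example.org", lambda d: "203.0.113.7"))     # -> 203.0.113.7
```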
Here are some examples of what cannot be blocked by a DNS-level blocker: YouTube and Twitch ads; Facebook, Twitter, and Instagram sponsored posts. Essentially, any advertising that shares a domain with content cannot be blocked by a DNS-level blocker.

Is there a chance to handle this in the future? DNS will never be enough to do this. Our only option is to use a content-blocking proxy, like what we do in the standalone AdGuard applications, and we're going to bring this feature to AdGuard Home in the future. Unfortunately, even then, there will still be cases when this won't be enough or would require quite a complicated configuration.

Run make init to prepare the development environment. You will need this to build AdGuard Home. Open your terminal and execute these commands:

NOTE: The non-standard -j flag is currently not supported, so building with make -j 4, or setting your MAKEFLAGS to include, for example, -j 4, is likely to break the build. If you do have your MAKEFLAGS set to that and you don't want to change it, you can override it by running make -j 1. Check the Makefile to learn about other commands.

You can build AdGuard Home for any OS/ARCH that Go supports. To do this, specify the GOOS and GOARCH environment variables as macros when running make. For example:

You'll need snapcraft to prepare a release build. Once installed, run the following command:
See the build-release target documentation.

Run make build-docker to build the Docker image locally (the one that we publish to Docker Hub). Please note that we're using Docker Buildx to build our official image. You may need to prepare before using these builds: (Linux only) install QEMU, then prepare the builder. See the build-docker target documentation.

When you need to debug the frontend without recompiling the production version every time (for example, to check how your labels would look on a form), you can run the frontend build in a development environment. In a separate terminal, run:
Then run your AdGuardHome binary with the --local-frontend flag, which instructs AdGuard Home to ignore the built-in frontend files and use those from the ./build/ directory. Now any changes you make in the ./client/ directory should be recompiled and become available in the web UI. Be sure to disable the browser cache so that you actually get the recompiled version.

You are welcome to fork this repository, make your changes, and submit a pull request. Please make sure you follow our code guidelines, though. Please note that we don't expect people to contribute to both the UI and backend parts of the program simultaneously. Ideally, the backend part is implemented first, i.e. the configuration, the API, and the functionality itself; the UI part can be implemented later, in a different pull request by a different person.

There are two update channels that you can use: beta (beta versions of AdGuard Home; more or less stable versions, usually released every two weeks or more often) and edge (the newest version of AdGuard Home from the development branch; new updates are pushed to this channel daily).

There are three options for installing an unstable version: the Snap Store (look for the beta and edge channels), Docker Hub (look for the beta and edge tags), or standalone builds (use the automated installation script, or look for the available builds on the Wiki).
Script to install a beta version:
Script to install an edge version:

If you run into any problem or have a suggestion, head to this page and click on the "New issue" button.
If you want to help with AdGuard Home translations, please learn more about translating AdGuard products in our Knowledge Base. You can contribute to the AdGuardHome project on CrowdIn. Another way you can contribute is by looking for issues marked as "help wanted", asking if the issue is up for grabs, and sending a PR fixing the bug or implementing the feature.

Projects built around AdGuard Home:
AdGuard Home Remote: iOS app by Joost.
Python library by @frenck.
Home Assistant add-on by @frenck.
OpenWrt LuCI app by @kongfl888 (originally by @rufengsuixing).
Prometheus exporter for AdGuard Home by @ebrianne.
AdGuard Home on GL.iNet routers, by GL.iNet.
Cloudron app by @gramakri.
Asuswrt-Merlin-AdGuardHome-Installer by @jumpsmm7, aka @SomeWhereOverTheRainBow.
Node.js library by @Andrea055.

This software wouldn't have been possible without:
(You might have seen that CoreDNS was mentioned here before, but we've stopped using it in AdGuard Home.) For the full list of all Node.js packages in use, please take a look at the client/package.json file.

Our main idea is that you are the one who should be in control of your data. So it is only natural that AdGuard Home does not collect any usage statistics, and does not use any web services unless you configure it to do so. The full privacy policy, with every bit of information that could in theory be sent by AdGuard Home, is also available.
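As a concrete illustration of the REST API integration mentioned earlier, here is a hypothetical Python sketch; the address, port, credentials, and endpoint path are all assumptions, so check the API documentation in the repository for the real routes:

```python
import requests  # third-party HTTP client: pip install requests

# Hypothetical sketch of querying a local AdGuard Home instance's REST API.
# The host, port, credentials, and endpoint below are assumed, not taken
# from the official API documentation.
BASE = "http://192.168.1.2:3000"

resp = requests.get(BASE + "/control/status", auth=("admin", "hunter2"))
resp.raise_for_status()
print(resp.json())  # e.g. version info and whether protection is enabled
```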
650
BAD
Adding capacity to the electricity grid is not a simple task (economist.com)

Chill Sun Solar, a facility in Nevada with a generating capacity of 2.25 GW, should produce enough electricity to meet 1.7% of California's annual demand. The 4.1 GW Berwick Bank wind farm, off the east coast of Scotland, will provide more energy over a year than could Britain's two largest gas-fired power stations running full tilt. But none of this will matter until these renewable heavyweights get hooked up to their respective grids.

The grids used by developed countries are not accustomed to rapid change. At the turn of the century, a couple of power plants a year might be connected to meet new demand driven by demographic change, to replace plants at the end of their lives or, as in shifts between coal and gas, to compete on price. But the overall rate was typically slow, with net capacity changing little and new plants often using the same connections as old ones. Unsurprisingly, the business of supplying the highly specialised components big grids require was paced to match.

Take the transformers needed to step electricity up to, and down from, the highest voltages on the grid: massive machines made with bespoke techniques. Gleaming fettuccine strands of copper, bundled into sapling-thick, paper-sheathed cable, are wound tightly around bus-sized wooden frames by patient hands. Such windings, perfect to the millimetre and precisely tailored to the buyer's requirements, are then brought together in huge structures of wood and glue; any metal would disrupt the electromagnetic fields around the copper. The whole assembly is then submerged in a tank filled with mineral oil, which rises all the way to its brim. To see one of these titans being made at the Siemens Energy factory in Nuremberg, Germany, is to witness a remarkable mixture of scale, strangeness and craft skills (Siemens' attempts to do the winding by robot have so far failed). The finished products typically weigh 300 tonnes, cost €30m ($32m), and are ready for delivery about three years after being ordered.

Expanding the world's capacity to make such exotica is vital if grids are to expand quickly. But it is not easily done. There are few providers, and they are cautious. Siemens is working on making its existing plants more productive, but it is reluctant to build lots of new capacity for what it sees as a one-off spike in demand. So are its competitors.

The expectation that lots of new equipment will never be needed very quickly is one of the legacies of steady-state grids. Another, linked, legacy is a low appetite for investment. For most of the past half century, the governments which owned many national grids, and the bodies which regulated grids in the private sector, focused on two main goals. The first was that grids be stable and secure: that the lights be kept on and industry humming. The second was that grids should add as little as possible to the price of the electricity they supplied.

For the most part, the grids met these goals. Consumers in most of the rich world can rely on the fact that, except under the most adverse of circumstances, the flick of a switch will illuminate their homes and the food in their freezers will not thaw. This certainty has been provided quite cheaply. Ben Wilson of National Grid, which owns and runs Britain's transmission infrastructure, says that in 2019 the cost of the grid accounted for just £20 ($25) of the average consumer's electricity bill of £1,300: a mere 1.5%.
But conservative regulation offered no incentive or scope for strategic investment. Grids could spend only what was needed to provide connections for projects which were guaranteed to get built, or to alleviate chronic congestion. Connection requests were typically dealt with on a first-come-first-served basis, to make things fair, rather than by some other mechanism such as auction or a strategic plan.

That is not good enough for a world which wants to replace the vast majority of the plants currently generating its electricity, and which will often need to build the replacements in different places. An inability to provide enough grid connections means that, all around the world, new generating projects face lengthening queues before they can get onto the grid. Those queues do not just slow the rate at which new capacity becomes available. They also add to uncertainty and thus increase the cost of investment. Zoisa North-Bond, chief executive of Octopus Renewables, a British energy firm, has called grid access the biggest limiting factor to increasing the market share of renewables.

In 2022 about 54% of Britain's electricity was generated without emitting carbon; its grid is one of the most deeply decarbonised in the world. But that does not mean it has solved the queue problem. On average, the projects connected to the grid in 2022 were doing so about four years after the date they had requested. The company developing Berwick Bank, SSE, should have all the wind farm's capacity installed by 2030, but at least one part of the farm is due to connect to the grid only at the end of the following year, according to published data. A supplier asking for a grid connection today can expect to be offered one for some time between 2030 and 2038.

There is a real limit to how fast connections can be provided. Expanding the grid changes its properties and thus risks its stability. But today's delays are much longer than necessary. Mr Wilson of National Grid says the queue has grown to its inordinate length because, until very recently, Britain's grid regulator, Ofgem, allowed no investments in upgrades before a connection request had been signed on the dotted line. It might be obvious that a country betting big on offshore wind as a new source of power would need strong grid connections to the North Sea and the Atlantic. But they could be invested in only once National Grid had specific and detailed requests from individual developers. As well as slowing investment, this sort of bottleneck also increases permitting time. Only after the generator's planning and permitting is completed can the equivalent process for the grid connection start. In engineering terms, what could be done in parallel is being done in series.

But perhaps the biggest cause of queue length is speculation. In 2008, in an effort to incentivise the connection of small-scale renewables, National Grid made applying for a connection much easier. That made it possible for speculators to file paper projects in the hope of flipping their spot in the queue to a project behind them. Such projects may make up as much as 80% of the queue; many have no designs prepared, nor engineers on staff who might carry them out. Weeding them out would seem an obvious step. But the regulatory regime does not give National Grid the tools with which to discriminate between builders and flippers.

Much the same is true in many other countries.
Thomas Egebo, the boss of Energinet, the state-owned firm which owns and operates the Danish grid, says the state of affairs is "exactly like in the UK. There needs to be a project and we need to sign things, and then we can apply to build [new grid capacity]."

The situation in America, which has many grids and even more lawyers, is dire. A study published last year by Lawrence Berkeley National Laboratory in California found that the average grid connection in America in 2021 had taken three years to be completed, and that, as elsewhere, the lag was getting longer. By some estimates there is at least 1,400 GW of new renewable capacity in America's queues for connection.

Chill Sun Solar, which shares the queue to connect to the Californian grid with about 220 GW of other solar plants and energy-storage facilities, may be in luck. A new transmission line called Greenlink Nevada, designed to serve solar generation in that state, is currently being planned by the Bureau of Land Management and is due to be in service by 2026. But like the rest of America's planned energy infrastructure, it will be at the mercy of a permitting process which is vulnerable to court cases and delays.

As the Greenlink plan shows, there is some progress, and regulators are beginning to understand that there needs to be more. In December 2022 Ofgem published new rules, drawn up over eight frenetic months of consultation, allowing National Grid to go ahead and expand in advance of grid-connection requests. It is also reforming its procedures so that projects in the queue can be required to hit milestones or lose their place, though it is leery of making the rule retroactive (which would be needed to shorten the queue quickly).

In America the Federal Energy Regulatory Commission has expressed concern that grid operators may not be planning transmission on a sufficiently long-term, forward-looking basis to meet transmission needs driven by changes in the resource mix and demand, and last year it gathered feedback on proposals designed to push grid operators to invest according to longer-term plans, as a way to end today's piecemeal and inefficient development. Its new proposals are expected this summer.

There are also plans afoot to reduce the delays that planning permission and building permits can cause. In November the European Commission, the executive of the European Union, agreed to a new set of temporary regulations giving renewable-energy projects a presumption of "overriding public interest". The commission also agreed that environmental-impact assessment for grid reinforcements should be limited in order to facilitate the integration of renewable energy.

Such moves are vital when it comes to building new long-range connections, which may be required to relieve congestion as the grid expands, or to connect generators and loads across large distances. The construction of Ultranet, a series of high-voltage cables that will link wind power from the North Sea to industrial users in southern Germany, requires its developers to obtain some 13,500 building permits, says Tim Holt, who sits on Siemens Energy's board. The tech is ready, but the permits are not. In February, massive transformers destined for Ultranet were sitting, ready for shipping, at the company's factory in Nuremberg.

In the developing world, where grids are typically too small to meet today's demand, let alone the future's, some of these problems look less fraught. Expanding a growing grid further will often be easier than booting a mature grid out of its slumber.
India, for instance, has built out its grid at a furious pace in the past decade. Its Ministry of Power reports the construction of 161,000 km (100,000 miles) of transmission lines since 2014, when Narendra Modi became prime minister. The percentage of the population which has access to a grid connection has shot up from 76% in 2010 to 99% in 2020, according to World Bank data.

The existence of connections does not in itself mean a stable supply. Keeping supply and demand balanced often requires load to be shed, leading to blackouts across India's cities. But according to the Ministry of Power, the amount of kerosene used to keep generators running in rural parts of India fell from 9bn litres to 2bn litres between 2015 and 2021. That suggests that people were able to rely on the grid to provide much more of the energy they needed.

As the demands of grid decarbonisation have become apparent, India has simply kept building. In December 2022 the ministry announced that it would spend 2,440bn rupees ($30bn) to build another 50,000 km of electricity transmission lines by 2030, to connect renewable generation, often in fairly remote places, to growing demand. It is planning to connect 500 GW of renewable capacity of all kinds by the same date.

India's huge grid expansion might seem to run counter to a popular narrative about developing-country electrification. It has often been suggested that building big, centralised grid infrastructure was too slow, expensive and wasteful in these markets. One alternative might be a more rapid growth in distributed generation, specifically solar panels that could be connected into local microgrids. Each microgrid would be capable of running as an independent island but also able to exchange energy with its neighbours. There would be no need for any big grid backbone, because solar panels would be plentiful and distributed widely enough that there was always enough electricity being generated nearby.

In truth, though, this is something of a false opposition. Distributed solar panels are capable of pumping out lots of electricity. But as Brent Wanner, the grids guru at the International Energy Agency, an intergovernmental think-tank, points out, making such resources your mainstay is a costly undertaking. Economies of scale make small-scale or rooftop solar several times more expensive than the same capacity would be if installed at a grid-connected solar farm. Distributed solar panels can reduce the demands on an existing grid by supplying some energy locally, but the economic case for a wholesale move to decentralised power has yet to be made. Developments of both sorts are needed.

China inclines towards big grids. State Grid, which runs the synchronous grid that spans most of the country, is the world's largest employer and runs its largest machine. Like India, China is benefiting from the momentum of recent grid expansion. The country only reached 100% electrification in 2012, at a point when the immense demands of grid decarbonisation were already becoming apparent and investment in infrastructure was being used to drive the economy. As a result, it has simply continued to invest. That said, it too is seeing increasingly long queues for connection, according to Chongqing Kang, who studies electricity systems at Tsinghua University in Beijing. He thinks the lack of real-time pricing is part of the problem. Those trying to transform the grid are left asking the government to update its policies.
Even with all the planning reform and queue-management wizardry in the world, the waits will not all go away. Construction of grid infrastructure cannot be done on a purely laissez-faire basis. Each grid is a single cohesive machine, and new connections must be made carefully so as not to upset its delicate balance. The same is true of the new transmission capacity needed. But there, things are being made easier by a 19th-century technology that is only now coming of age.

This article appeared in the Technology Quarterly section of the print edition under the headline "Hurry up and wait".
641
BAD
Admiral Rickover's 'Paper Reactor' Memo (1953) https://whatisnuclear.com/rickover.html jameshart

By Admiral Hyman Rickover* 1953-06-05 Reading time: 4 minutes

Important decisions relative to the future development of atomic power must frequently be made by people who do not necessarily have an intimate knowledge of the technical aspects of reactors. These people are nonetheless interested in what a reactor plant will do, how much it will cost, how long it will take to build, and how long and how well it will operate. When they attempt to learn these things, they become aware of confusion existing in the reactor business. There appears to be unresolved conflict on almost every issue that arises.

I believe that this confusion stems from failure to distinguish between the academic and the practical. These apparent conflicts can usually be explained only when the various aspects of the issue are resolved into their academic and practical components. To aid in this resolution, it is possible to define in a general way those characteristics which distinguish the one from the other.

An academic reactor or reactor plant almost always has the following basic characteristics: (1) It is simple. (2) It is small. (3) It is cheap. (4) It is light. (5) It can be built very quickly. (6) It is very flexible in purpose. (7) Very little development will be required; it will use off-the-shelf components. (8) The reactor is in the study phase; it is not being built now.

On the other hand, a practical reactor plant can be distinguished by the following characteristics: (1) It is being built now. (2) It is behind schedule. (3) It requires an immense amount of development on apparently trivial items. (4) It is very expensive. (5) It takes a long time to build because of its engineering development problems. (6) It is large. (7) It is heavy. (8) It is complicated.

A common example can be given to indicate the application of the above generalities. A fairly conventional academic power reactor might use natural or slightly enriched uranium rods in which the burn-up is a minimum of 10,000 MWD/ton. The fission products are confined to the fuel element by a simple cladding technique. The elements operate in high-pressure water at 600°F.[1]

In the practical reactor, difficulties are encountered. No element of the above type has been carried to more than 4,000 MWD/ton. Eight years of work at Hanford and related laboratories have failed to produce a cladding technique which gives really satisfactory performance in water at 200°F. At 600°F, uranium reacts violently when exposed to water. The Chalk River experience shows the difficulty of maintaining a plant in which some fission products have escaped.

The tools of the academic-reactor designer are a piece of paper and a pencil with an eraser. If a mistake is made, it can always be erased and changed. If the practical-reactor designer errs, he wears the mistake around his neck; it cannot be erased. Everyone can see it.

The academic-reactor designer is a dilettante. He has not had to assume any real responsibility in connection with his projects. He is free to luxuriate in elegant ideas, the practical shortcomings of which can be relegated to the category of "mere technical details." The practical-reactor designer must live with these same technical details. Although recalcitrant and awkward, they must be solved and cannot be put off until tomorrow. Their solutions require manpower, time, and money.

Unfortunately for those who must make far-reaching decisions without the benefit of an intimate knowledge of reactor technology, and unfortunately for the interested public, it is much easier to get the academic side of an issue than the practical side. For a large part, those involved with the academic reactors have more inclination and time to present their ideas in reports and orally to those who will listen. Since they are innocently unaware of the real but hidden difficulties of their plans, they speak with great facility and confidence. Those involved with practical reactors, humbled by their experiences, speak less and worry more.
Yet it is incumbent on those in high places to make wise decisions, and it is reasonable and important that the public be correctly informed. It is consequently incumbent on all of us to state the facts as forthrightly as possible. Although it is probably impossible to have reactor ideas labeled as "practical" or "academic" by the authors, it is worthwhile for both the authors and the audience to bear in mind this distinction and to be guided thereby.

H. G. Rickover, Captain, USN. June 5, 1953

*This is a reproduction of Rickover's famous "paper reactor" memo, published on the web by N. Touran on 2022-08-23. The original was published in The Journal of Reactor Science and Technology, Volume 3, No. 3 (which is classified), and has been reproduced many times, e.g. in Congressional testimony.

[1] Interestingly, in the decades after Rickover published this memo, we now have reactors that get beyond 60,000 MWD/ton burnup using low-enriched uranium in high-temperature, high-pressure water.
null
BAD
Adventures in Debian's Qt Land https://perezmeyer.com.ar/blog/2023/06/08/adventures_in_debian_s_qt_land/ jandeboevrie

Debian (I might as well say "we", that is the beauty of it) is about to release Debian 12, aka Bookworm. Let's take a quick look at what is new in Debian Qt land.

Bookworm has Qt 5.15.8, which is nothing but great news. KDE will be switching to Qt 6 sooner rather than later, and Qt 5 has been a fun ride, but Dmitry Shachnev and I needed a break, or at the very least to not be handling two Qt versions. But in the end I need to be fair: you REALLY need to thank Dmitry for Qt 5. He has been the manpower behind it in 99.5% of the cases.

This will be the first Debian release to have official Qt 6 packages. NOTHING would have happened if it weren't for Patrick "Delta-One" Franz standing up to maintain it. BIG kudos to him!

Well, there is a little lie in the paragraph above. Thanks to The Qt Company and ICS, the current Qt 6 version, 6.4.2, is also available in Bullseye's backports. The Qt Company also really helped us here by providing almost-to-be-released tarballs of Qt 6.4.2, so we were able to push them to unstable and do a transition in time for the freeze. Thanks a lot for that!

So what is the Qt 6 state? On the binary side, all but OpenGL ES support should be there. Sadly this was discovered too late in the release process, and we still might need help maintaining it (read the link to know why!).

We are still not building the documentation. Properly building the whole documentation as with Qt 5 would require all the Qt submodules' source code in one place, which we can't (easily?) do in Debian. So building the doc means hacking the build system and getting semi-linked documentation, much like with Qt 5. Now if you think you have an idea to solve this... we are happy to hear from you!

Another great thing to know about Qt 6 is that, thanks to Helmut Grohne, pure Qt 6 applications should be able to cross compile. Applications using multi-arch enabled libraries ought to work too. Even more, many Qt submodules themselves should also cross compile! Not all of them, as we missed some patches in time, but hey, if you need to cross compile Qt, you surely can apply them yourselves!

And finally, tests, unit tests. In Qt 5 we had some of those, but none yet in Qt 6. This is one of the areas I would love to be able to put time into... but time is scarce.

From my point of view, the Debian 13 "Trixie" development cycle will see Qt 5's usage diminishing and Qt 6 becoming the major Qt version used, but from the Qt 4 experience I do not expect Qt 5 to be dropped during this release cycle... let's see what the future brings us.

While I mentioned Dmitry and Patrick, many more people helped us reach this place. I personally want to thank the people behind the KDE software, both upstream and, of course, the Debian maintainers. You should be thankful to them too: many hours of effort go into this. And thanks to you, our dear users. We are normally overflowed with what we have in our hands and might not be up to the task sometimes, but hey, you are part of the reason we are doing this!
null
BAD
Adventures in Electronic Music (2011) (charlespetzold.com)

Sometimes it doesn't take much to nudge one's life in a little different direction, and for this direction to then blossom into a career and decades of delights working in the computer industry.

Probably sometime around 1969, when I was 16 years old, I bought an album entitled Switched-On Bach by Walter (later Wendy) Carlos and Benjamin Folkman, which had been released the previous year. I wasn't unusual in this respect. Many people bought this album. It was one of the best-selling classical albums of all time, and it received three Grammy Awards.

Switched-On Bach consists of several compositions by Johann Sebastian Bach (1685–1750) played on a Moog electronic music synthesizer, including a complete Brandenburg Concerto No. 3 with a spacey middle movement composed by Carlos and Folkman for the recording. I must have read some articles about the album, because it was very clear to me how it was made: although the Moog synthesizer had a keyboard, it was essentially monophonic, so each melodic line had to be recorded separately on a multitrack tape recorder.

Switched-On Bach was an important event in my life. I loved the music, and it initiated an interest in classical music that led to a collection of about 1,200 classical LPs by the early 1980s, eventually to be replaced with a current collection of well over a thousand classical CDs. But I also knew that I wanted to make electronic music myself. I didn't grow up in a household with musical instruments (apart from my mother's old accordion in the closet), and although I could read music (apparently I picked it up in some music appreciation course and never lost it), I couldn't play anything. Nevertheless, tunes of my own invention would often rattle around in my head. I wanted to hear them for real, and realizing them using electronic music seemed ideal.

This is the story of how I eventually built my own electronic music instruments while teaching myself digital electronics, computer hardware, and assembly language programming. Roughly between the mid-1970s and the mid-1980s, this hobby (and it was never really more than a hobby) led directly to my being able to write assembly language programs for the IBM PC in the early 1980s, which led to my career at PC Magazine beginning in 1984, which led to the rest of my career. This hobby also gave me the background knowledge and understanding to write the book Code: The Hidden Language of Computer Hardware and Software (1999).

Several years ago, when Programming in the Key of C#: A Primer for Aspiring Programmers was being published, I wanted to dedicate the book to Wendy Carlos and Bach, but the page count was such that there was no room for a dedication page. Instead I included this paragraph in the Prelude:

That my talents do not lie in music composition will be confirmed by the many audio files I've recently created from my old tapes, and which are playable from this page.

In the summer of 1972, between my freshman and sophomore years attending Stevens Institute of Technology, I got my first job: I was a busboy at the Cafe Contempo in Stern's Department Store in the Woodbridge Center shopping mall near my home in New Jersey. This job was specifically for the purpose of earning enough money to buy a $500 Univox electronic piano. I say "electronic piano" rather than "electric piano" because electric pianos (like the Fender Rhodes) contain actual hammers striking wires or metal objects.
My Univox was an electronic piano, which produced sound entirely through electronic components, much like an electronic organ. It had five octaves (61 keys) and a foot pedal for sustain. Inside was a long circuit board containing 61 separate oscillators that produced a tone somewhere between a piano and a harpsichord, but not as pleasant as either. Each key was basically a switch that moved a little flexible spring in contact with a U-shaped piece of thick wire.

I took the piano back to college in my sophomore year. I had a private dorm room and headphones (as well as a big guitar amp), so I could play it without disturbing anyone, and I taught myself a little piano. My early repertoire consisted of the first movement of Beethoven's Moonlight Sonata, Scott Joplin's Magnetic Rag, some easy Bach from the Notebook of Anna Magdalena, some stabs at the Two-Part Inventions, and bits and pieces of some Debussy Preludes.

When I turned 21 in February 1974, I came into some money from a trust fund. Most of the money had already gone to fund my college education, and when I allocated enough for my final three semesters, I had about $3,000 left. This was just enough money to turn a dormitory room into a beginner's electronic-music studio. In early September 1974 I went to one of the famous New York City music stores on 48th Street in Manhattan (either Sam Ash or Manny's) and bought two essential items: an ElectroComp 101 synthesizer and a Teac 3340S tape deck.

Here's an ad that I snagged from the SynthMuseum web site:

The 3-octave keyboard was typical of synthesizers in this price range. I was actually intending to buy a Minimoog (which was about the same price), but after playing around with the ElectroComp I realized that the plugboard along the top could be used for some interesting techniques. The Minimoog was a nice little instrument, but I never regretted my choice.

I believe the Teac 3340S cost about $1,200. This was a very popular tape deck for home studios, and it had several important features. Most consumer reel-to-reel tape decks (and in the 1960s, and even into the 1970s, their presence meant that you were in the home of an audiophile) ran at 3¾ and 7½ inches per second (ips). The Teac 3340S supported 7½ and 15 ips. Consumer tape decks used 7-inch reels. The Teac 3340S was roomy enough to use 10-inch reels. Consumer tape decks recorded and played back in stereo: two tracks going one way on the ¼-inch tape and two tracks going the other way (just like cassette tapes). The Teac 3340S supported a four-track mode to use the full width of the tape in one direction.

By 1974, professional music studios had moved far beyond four-track recording, so those of us building home (or in my case, dormitory) studios around four-track tape decks would often need to remind ourselves that Sgt. Pepper was recorded on four-track machines. According to the Wikipedia entry for Tomita's 1974 album Snowflakes are Dancing, he used a Teac 3340S, but only in conjunction with other equipment, including a 16-track Ampex deck.

The 1960s and early 1970s were the era of analog synthesizers, built pretty much entirely from resistors, capacitors, transistors, wire, and solder. The circuitry that actually made the sounds was called a voltage-controlled oscillator (VCO), which meant that the frequency of the waveform generated by the oscillator was proportional to an input voltage, or rather, exponentially proportional to the voltage. Most commonly, a 1-volt increase raised the frequency by an octave.
For example, if a 5-volt input caused the VCO to generate a waveform at Middle C, or 261.63 cycles per second, a 6-volt input would generate the C one octave higher, or 523.251 Hertz. This exponential relationship between voltage and frequency meant that you could actually control it from an organ-type keyboard. The first person to attach a keyboard to a voltage-controlled oscillator was Robert Moog (1934–2005), in the early 1960s. Electronic music synthesis had existed prior to that time, but the keyboard really made it user-friendly to composers and musicians.

The synthesizer keyboard is basically a voltage divider: a 5-octave keyboard, for example, contains a string of 60 resistors of the same value wired in series. Put a voltage at one end and a ground at the other. Each key taps a spot between two resistors, and the voltage at that point is output from the keyboard.

This design meant that the keyboards were monophonic. They only output one voltage. If you tried playing a chord on the keyboard, the output voltage was based on the highest key, and the others would be ignored. But even if the keyboard were able to output multiple voltages, there wasn't much you could do with all that information. Small synthesizers contained just a few VCOs; the ElectroComp 101 had four. But even larger synthesizers wouldn't try to have a bunch of VCOs playing at once. These were analog components, they were affected by heat, and one of the big challenges was just keeping them in tune. There were some keyboards (like the ElectroComp 101's) that output two voltages, one from the highest key and one from the lowest. This allowed you to control two VCOs playing separate notes, but I never found that this worked very well, because pressing only one key in this mode would cause both VCOs to be based on that key, which sounded different from a single VCO.

VCOs were capable of generating sine waves, but sine waves by themselves are rather boring and lifeless. You really need a lot of sine waves to start sounding like real sounds, and you couldn't do that because you couldn't keep that many VCOs in tune. It was much more common for VCOs to generate simple waveforms rich in harmonics, like square waves and triangle waves and sawtooth waves. Those were the biggies. Square waves sounded kind of woody, like a clarinet or oboe, while sawtooth waves sounded more like string instruments, like the violin.

As you press and release a key on the keyboard, you don't want to just turn on the VCO and then turn it off. That would sound too mechanical. Instead, when you press a key on the keyboard, you want the sound to rise in volume (called the attack), then decline a bit (the decay), and then keep at a level while you hold the key (the sustain). When you release the key, you want the volume to decrease down to zero (called the release). This combination of attack, decay, sustain, and release is known as an envelope, so to produce decent sounds, the synthesizer contains envelope generators triggered by the pressing and releasing of the keys. These envelope generators output voltages to voltage-controlled amplifiers, or VCAs, that modify the volume from the VCOs.

But even this isn't enough. You really want the quality, or timbre, of each note to change over its duration, so instead of a note sounding like BHHHHH, it sounds more like BHWOUP. Even if the note is only a tenth of a second in duration, this makes a big difference.
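Both control signals just described are easy to sketch numerically. Here is a minimal Python sketch, assuming illustrative calibration (Middle C at 5 volts, the 1-volt-per-octave law from the text) and round ADSR times rather than anything measured from a real instrument:

```python
MIDDLE_C_HZ = 261.63
REF_VOLTS = 5.0   # assumed control voltage that yields Middle C

def vco_frequency(cv: float) -> float:
    """Exponential volt-per-octave law: each added volt doubles the pitch."""
    return MIDDLE_C_HZ * 2.0 ** (cv - REF_VOLTS)

def envelope(t: float, key_up_at: float,
             attack=0.05, decay=0.10, sustain=0.7, release=0.2) -> float:
    """Envelope level (0..1) at time t, for a key released at key_up_at.
    Assumes the key is held at least through the attack and decay phases."""
    if t < attack:                                    # rise in volume
        return t / attack
    if t < attack + decay:                            # decline a bit
        return 1.0 - (1.0 - sustain) * (t - attack) / decay
    if t < key_up_at:                                 # hold while key is down
        return sustain
    return max(0.0, sustain * (1.0 - (t - key_up_at) / release))  # fade out

print(vco_frequency(5.0), vco_frequency(6.0))  # 261.63 and 523.26: one octave
print(vco_frequency(5.0 + 1.0 / 12.0))         # one semitone up, about 277.19
```

The 1/12-volt step in the last line is exactly what the keyboard's resistor chain provides: with 60 equal resistors spanning 5 octaves, each key along the divider taps off another twelfth of a volt.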
To make the sound change in quality over its duration, you run the output from the voltage-controlled oscillator into a voltage-controlled filter (or VCF), and then run the output from that into the voltage-controlled amplifier. This VCF is also controlled from an envelope generator and attenuates some of the harmonics of the harmonically rich waveform output from the VCO. (You can also run an envelope generator into the VCO, so the frequency of the note changes over its duration.) This attenuation of harmonics from square waves or sawtooth waves is why analog synthesis of this type is sometimes called subtractive synthesis. You begin with a waveform with lots of harmonics, and then you remove some. Opposite to this is additive synthesis, in which a complex waveform is built from sine waves, but this is a more complex technique.

When you're using one of these synthesizers, you set it up with one or more VCOs, VCFs, and VCAs controlled by envelope generators to produce a particular quality of sound. This combination of settings is known as a patch, even if everything is done with knobs rather than actual patch cords. Then you record a track on the tape recorder, and when you're done with that patch, you set up the synthesizer with another patch.

When creating music in this way on a four-track tape deck like the Teac 3340S, you're recording a single track at a time, and you need to listen to what you've already recorded on the previous tracks. The Teac 3340S allows this: each of the four tracks can be independently selected for recording, and independently selected for listening to what's on the tape or the signal that's coming into the tape deck.

If you're creating a composition that has any notion of a beat, one of the first things you do is lay down a click track. This is one track that sounds like the click-click-click of a metronome. (Bach was a favorite composer to realize in synthesis partially because his music sounds good when played at a constant tempo, and that made the click track easy. Mahler, for example, would be much more problematic, because you'd have to establish all the tempo changes for this click track. Tomita's synthesis of Debussy's music in 1974 was particularly daring.) You can then listen to that click track on Track 1 while recording some music on Track 2. Then you can listen to Track 1 and Track 2 while recording some more music on Track 3. Then you can record on Track 4, and by this time you probably don't need the click track any more, so you can record something else on Track 1. The result is four tracks, all independently recorded.

Like all decent tape decks of this period, the Teac 3340S had separate playback and record heads. Actually, it has three sets of four-track heads. From left to right, the tape passes an erase head (for erasing prior to recording), a record head, and a playback head. And this is a problem, because if you're listening to previous tracks through the playback head and recording through the record head, the tracks will be out of sync. The record and playback heads in the Teac 3340S are separated by about an inch, so at a tape speed of 15 ips the difference would be very noticeable. Obviously the people at Teac were aware of this problem, and that's why the Teac 3340S has a feature they called Simul-Sync. By flipping switches located right on the box that covers the heads, you can switch the record head for any channel temporarily into a playback head. The playback quality suffers, but you're only using it to keep in synchronization with the previous tracks.
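The VCO-to-VCF-to-VCA chain just described is easy to mimic in a few lines of code. Here is a minimal Python sketch, working on floating-point samples instead of voltages; the one-pole lowpass is an assumed stand-in for a real analog filter (actual VCFs have steeper, resonant responses):

```python
import math

RATE = 44100  # samples per second (illustrative)

def sawtooth(freq_hz: float, n: int) -> list[float]:
    """'VCO': a naive sawtooth, rich in harmonics like the analog original."""
    return [2.0 * ((i * freq_hz / RATE) % 1.0) - 1.0 for i in range(n)]

def lowpass(samples: list[float], cutoff_hz: float) -> list[float]:
    """'VCF': one-pole lowpass that attenuates harmonics above the cutoff."""
    alpha = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / RATE)
    out, y = [], 0.0
    for x in samples:
        y += alpha * (x - y)
        out.append(y)
    return out

# Subtractive synthesis in one line: start with lots of harmonics, remove some.
note = lowpass(sawtooth(261.63, RATE), cutoff_hz=1200.0)
```

Sweeping cutoff_hz over the note's duration, the way an envelope generator drives a real VCF, is what turns the static BHHHHH into the BHWOUP described earlier.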
On top of everything else in the Teac 3340S, Simul-Sync is the essential feature for multi-track electronic music composition.

If you need more than four tracks and you only have one four-track tape deck, you can use a technique called ping-ponging. In the example above, once you finish recording on Track 4, you can mix Tracks 2, 3, and 4 and record that back to Track 1. This frees up Tracks 2, 3, and 4, which you can then use for more stuff. This technique can be continued indefinitely, but sound quality suffers if it's repeated too much.

Finally, I don't think anybody really talked about it much, but everybody working with synthesizers and tape decks discovered a very simple but powerful technique: if you record something an octave lower at a tape speed of 7½ ips, you can play it back at 15 ips, and it sounds like you can play a lot faster than you really can.

I no longer own the ElectroComp 101 synthesizer. Sometime in the mid-1990s I sold it to a little store on Mercer Street in Manhattan that sold vintage electronic music and recording equipment. But I recently dragged the Teac 3340S out of storage, cleaned it up a bit, took the back off, sprayed some contact cleaner into the potentiometers and switches, mounted my old tapes, and discovered that it still worked fine after 35 years. All the tapes have now been converted to audio files for your listening (I was about to say "pleasure", but that's definitely the wrong word).

Most of what I had on tape from the period 1974 to early 1977 is either Bach or my own experimentations. My own stuff is obviously of the Ross Geller school of composition. (If you're fortunately not familiar with the sitcom Friends, Ross Geller's artistic passion is the synthesizer compositions that he plays with pretentious flair, embarrassing everyone who hears them.)

The first piece I recorded using the ElectroComp 101 and the Teac 3340S tape deck is named simply "Composition" and is dated September 19, 1974. Obviously I was mostly playing around, exploring what the synthesizer could do, but I also detect a motif or two. Composition 9/19/74 (2:14)

A day later, my first attempt at a Bach piece made me appreciate more than ever the skill behind the creation of Switched-On Bach. Even playing it at half speed, I couldn't get the timing right. I did manage to slow down at the end, however. Bach: Two-Part Invention in B-flat Major (1:28)

On September 21, 1974, an acquaintance named Joe Terlizzi came by my dormitory and we did a synthesizer realization of a little string quartet he had written. If he ever discovers this site and gives me permission to post the result, I will do so.

Five days later I tried another Bach Two-Part Invention. This one starts off much stronger and punchier than my first attempt, but then it collapses with just too many timing problems, and whatever it was that I did in the last bar, it's obviously not what Bach composed. Bach: Two-Part Invention in E Major (1:54)

A day later I tackled another one. As with the others, all I can hear are the mistakes, including spots where the recording cuts off for some reason. Bach: Two-Part Invention in A Minor (1:34)

The next day I taped another composition. (This was September of my senior year in college. Didn't I have any actual schoolwork to do?) Variations on E 9/28/74 (2:55)

I was taping these pieces on a 7-inch reel labeled as providing 60 minutes in each direction, or 2 hours total, but that's for a recording speed of 3¾ ips. I was recording at 15 ips and using all four tracks, which meant I couldn't use the other direction.
The tape really provided only 15 minutes, and it was nearly full. I decided to use up the rest of the tape doing stuff where it wouldn't matter if the tape happened to run out (which it does). Study 10/2/74 (2:10)

I guess I figured that after a month of experimenting, and in celebration of starting another 60-minute tape, I was ready to tackle something big, and by "big" in the Bach world we mean a Brandenburg Concerto. This is the first movement of Bach's Brandenburg Concerto No. 2, which took about three weeks and was finished in early November 1974. Bach: Brandenburg Concerto No. 2, 1st movement (4:55)

I believe this is the only time I used ping-ponging. I'm not sure how much was involved, but the sound density is still pretty impressive, although the overall performance is not. The experience must have had at least one benefit: when I tried another Bach Invention on November 24 (and perhaps significantly, one that I could almost play for real, at full speed, with two hands on the Univox piano), it didn't come out too shabbily. Bach: Two-Part Invention in C Major (1:07)

On December 8, 1974, my friend Steve Rounds came over and we put down a little composition he had written with my synthesizer in mind. It's a pretty little piece with lots of atmosphere and some interesting effects, but I'm not sure I could do it justice even if he gave me permission to post it. I remember when we mixed it down to cassette, he was fiddling with the volume controls like crazy to get the balances right.

It was winter vacation, and I dragged my equipment to my mother's house. In the two days after Christmas, I tried some more Bach. Bach: Little Fugue in G Minor (2:37)

On January 12, 1975, I tackled one of the Bach three-part Inventions: Bach: Three-Part Invention in B Minor (1:32)

On the same day, I finished the second tape with another composition that I labeled on the tape as "improvised", as if the previous ones were not! Composition 1/12/75 (1:00)

A month later, on Tape No. 3, I did another. One of the big differences with this one (aside from the length) is that I also used the Univox electronic piano. It's the fuzzy, harpsichordy, polyphonic-sounding thing. I'm not sure whether the stuff at the end was accidental or deliberate. Composition 2/12/75 (5:55)

And then I put something down on tape to which I assigned an actual title. An echo effect is easy with a tape recorder: as you're taping, you run the output from the playback head back into the record head. I taped it at 7½ ips for playback at that speed; at 15 ips the echo is too fast. Fantasy in Echo 3/26/75 (6:04)

Almost a year passed, during which I graduated college, got a job at New York Life Insurance Company as an actuarial student, and moved to a studio apartment in New York City. When I came back to the piano, synthesizer, and tape deck, I recorded another composition with a real title of sorts. I got the effect you hear at the beginning by setting up the synthesizer to trigger an envelope repeatedly, and then ran the piano through the synthesizer and tried to play in synchronization with it. Thing with Piano February 1976 (7:08)

Nine months passed. In November 1976 I began a composition that I never finished and should have erased, I tried a couple of pop songs (unsuccessfully), and then I demonstrated complete disrespect by attempting a synthesizer rendition of part of one of Bach's greatest works. The glissando effects near the end are deliberate, but I didn't get the rhythm quite right. (The glissando is supposed to end on the beat.)
Bach: Goldberg Variations: Variation No. 1 (1:05)

On Bach's birthday in 1977, I concluded the third tape (this one 90 minutes rather than 60 minutes) by recording a piece whose full title is "On the Difficulty of Taping the Music of Bach from the Short Wave Radio During a Thunderstorm." (You know: the static, the whistling noises, the tape running out, etc.) On the Difficulty... 3/21/77 (3:17)

Clearly I was at a compositional dead end, and I needed a good spark of inspiration to get me going again. Around September 1976, I began to become familiar with the work of the minimalist composers Philip Glass, Steve Reich, and Terry Riley. Glass had co-founded a record label, Chatham Square Records, and recorded Music with Changing Parts and Music in Similar Motion; I know I had the Deutsche Grammophon recording of Steve Reich's Drumming and Six Pianos; and of course Terry Riley's In C was well known. In New York City at this time, there was plenty of opportunity to see Philip Glass's and Steve Reich's ensembles perform, both in downtown establishments (I know I saw an early performance, not the first, of Steve Reich's wonderful Music for 18 Musicians at The Kitchen and was totally enthralled) and uptown: I saw the November 1976 performance of Einstein on the Beach at the Metropolitan Opera House and loved it. (Still do.)

The minimalists reminded me a lot of Bach. Intricate counterpoint was often involved, and Glass even used organs. In the summer of 1977, I wrote two short four-voice pieces and recorded them on the four tracks of the Teac tape deck with the ElectroComp synthesizer. Etude No. 1 6/23/77 (2:42) Etude No. 2 7/2/77 (3:47)

I do not have the score for the first one, but I do for the second. These (and other scores and scraps) can be perused in Appendix I. Both these pieces were obviously inspired by Steve Reich's "process music", such as Piano Phase and Clapping Music, where simple motifs playing in unison are then offset by a beat, progressively moving further apart, and then meeting again at the end. I found the process of recording these pieces extremely difficult. But I have a certain fondness for them. Even though I played them by hand (at half speed, an octave lower, of course), I was aiming for a very mechanical sound, and I think I got that sound.

As more minimalist compositions played themselves in my head, I began to wonder: is there a way that such music might be automated? I didn't immediately think "computer." This was, after all, 1977. But I did think "sequencer": sequencers were hardware devices of the era that could store a sequence of notes and play it back through an analog synthesizer. But they were rather expensive.

I then started thinking about the old Univox electronic piano I still had. It had become very cranky. Each time a key was pressed, a little spring made contact with a thick wire, and often this contact would generate a tiny spark that tended to corrode the metal and interfere with the electrical connection. But it was only the switches that were faulty, not the electronics. Might there be a way to salvage all the electronics in the piano and put some kind of gizmo on it that would play the piano electronics in an automated fashion? I imagined constructing a large array of simple slide switches, 61 switches wide (for the 61 keys of the piano) and 16 switches high for a sequence of 16 notes: a total of 976 switches. But how would it work?
Although I graduated with a Master's degree from an engineering and science college, and I did have some knowledge of analog electronics going back to my adolescence, I knew nothing about digital electronics. Instead I imagined some kind of electro-mechanical wiper that would spin around and make sequential contact with metal plates. It was time for some education.

Two weeks after realizing Etude No. 2 on tape, I bought a copy of Solid-State Electronic Projects from my local Radio Shack. Six weeks later I bought a book that I probably spent more time with than any other book in my life, Don Lancaster's CMOS Cookbook. And I slowly taught myself digital electronics. Instead of using an electro-mechanical wiper, I discovered that CMOS switches were ideal for switching external electronics on and off. I also found a source for the type of slide switches I needed at 10¢ a pop, so constructing an array of 976 slide switches suddenly sounded quite conceivable. I bought 10 switches to experiment with, and then I bought 200 more: enough for the top octave of the piano, 13 switches across and 16 down.

By December 1977 I had learned enough CMOS to get my first home-built electronic music instrument working. This was something I referred to simply as the Piano Sequencer. It never got larger than just the top octave of the piano, and I have not been able to find a photograph. I have only one recording of it. The piano is on one track (heard right at the beginning), and the other three tracks have improvised synthesizer. Sequential Piano + Synthesizer December 1977 (9:09)

I agree that it's not very interesting and goes on much too long. My interest in the Piano Sequencer faded as I was drawn to a more ambitious project. Again using CMOS for much of the logic, and a little music chip I discovered in Don Lancaster's book, I combined some memory chips so I could store sequences in memory and then play them back. The input devices were DIP switches, and not only could I store notes in memory, but I could also store numbers indicating ranges of notes that were to be repeated.

At a modern dance concert (Laura Dean Dancers and Musicians doing Dance and Spiral, probably), I had met a woman studying for her MFA at NYU, and I persuaded her to come back to my apartment to hear a work in progress on this strange machine. She said it sounded like music for a spy movie. The relationship didn't work out, but the title became "Overture for a Spy Movie": Overture for a Spy Movie 6/18/78 (15:20)

This was the first piece I recorded that's played entirely by machine. The lower voice is a naked triangle wave; the upper voice ran through the ElectroComp synthesizer, and I was able to define the repetitions so the melodic lines get longer and longer. Getting the composition on tape was a relief, because then I didn't have to worry about losing power and having the entire composition instantaneously drain entirely from RAM.

By November 12, 1978, the Four-Channel Sequencer was as complete as it was ever going to be, and it sat on my dinner table in the form of 18 interconnected circuit boards. I haven't been able to find any circuit diagrams for this machine, but here's a photo:

Here's a rather longer composition I was able to encode in the completed machine, imaginatively titled "Concerto 1978." There must have been a score for this, but it hasn't turned up. Some notes I made in 1979 indicate that the first movement was composed in October and was supposed to have synthesizer accompaniment.
Concerto 1978: 1st Movement (25:40) The second movement, which no longer exists in any form, was composed in August for solo piano. (By this time I had a real piano in my apartment, a Baldwin upright, but apparently I didn't have any microphones.) The third movement was also supposed to have synthesizer accompaniment. There was some damage to the tape on which this was recorded. Concerto 1978: 3rd Movement (15:14) Eight days later I recorded Midnight Circus, which I had composed and programmed into the sequencer in a week. Midnight Circus (12:36) Considering that everything had to be programmed into this machine by manipulating tiny switches, I'm surprised I did so much on it. I have two more examples of the Four-Channel Sequencer, but both are unfinished. This one was recorded on March 23, 1979: Unfinished Composition 3/23/79 (6:07) The second was recorded June 3, 1979, and also suffered some tape damage: Unfinished Composition 6/3/79 (15:12) Sometime while working on enhancements to this device, I thought it might be useful to encode numbers that were neither notes nor references to notes but codes that would tell the hardware what to do. I had the idea of storing these codes in memory and then having circuitry that would respond to particular codes, and I remember very distinctly the moment one evening when I suddenly realized that I was reinventing the computer. This realization gave me permission to start considering using microprocessors for future music synthesizers, and I began disassembling the sequencer. It had much too large a footprint, and I knew I would need the parts for something much more ambitious, which I was already designing. Sometime in 1977 (I suspect), at a local store called Computer Mart, I bought a copy of Computer Music Journal, Volume I, Number 2 (April 1977). This was a fascinating issue. The lead story was Design of a Digital Oscillator which will Generate up to 256 Low-Distortion Sine Waves in Real Time by John Snell. There were a couple of related articles and also a reprint of John M. Chowning's classic article The Synthesis of Complex Audio Spectra by Means of Frequency Modulation. Up to this time I was controlling notes using digital hardware but not getting involved in the generation of the actual tones. The concept of the digital synthesizer described by John Snell was revelatory to me. The idea is to generate sine curves digitally. At the time, computers, or at least the type of computers that people could afford, weren't fast enough to do this in software. So you move the job to hardware. At the center of the digital synthesizer is a Read-Only Memory (ROM) storing values of the sine function. This ROM is accessed at a constant sampling rate (at least twice the highest frequency you need to synthesize), incrementing the address by an amount proportional to the desired frequency. Yet another article in that second issue of CMJ (Table Lookup Noise for Sinusoidal Digital Oscillators by F. Richard Moore) described the math involved to get the digitization noise down to a tolerable level. After generating the sine curve digitally, it must be multiplied by an amplitude. (The multiplication turns out to be the most time-consuming part of the process.) The result is sent to a Digital-to-Analog Converter (DAC), which converts it into a voltage. Of course one sine curve would be boring, so you need to construct the synthesizer so that multiple sine curves are generated simultaneously.
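(To make the scheme concrete, here is a minimal software rendering of that table-lookup oscillator, written today rather than in 1977. A list stands in for the sine ROM, and a phase accumulator advances by an increment proportional to the desired frequency; the table size and sampling rate are illustrative choices, not figures from Snell's design.)

```python
import math

SAMPLE_RATE = 48_000   # illustrative sampling rate
TABLE_SIZE = 1024      # the "ROM": 1024 precomputed sine samples
SINE_ROM = [math.sin(2 * math.pi * i / TABLE_SIZE) for i in range(TABLE_SIZE)]

def oscillator(freq_hz, amplitude, n_samples):
    """Yield samples of a sine at freq_hz via ROM lookup and a phase accumulator."""
    phase = 0.0
    increment = freq_hz * TABLE_SIZE / SAMPLE_RATE   # proportional to frequency
    for _ in range(n_samples):
        yield amplitude * SINE_ROM[int(phase) % TABLE_SIZE]  # this value goes to the DAC
        phase += increment

# First few samples of a 440 Hz tone:
print([round(s, 3) for s in oscillator(440, 1.0, 8)])
```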
Because generating each sine curve is a series of steps (incrementing the phase angle, accessing the sine curve ROM, multiplying by an amplitude, accumulating with other sine curves), the hardware is constructed as a pipeline, so that while one sine curve is undergoing multiplication by the amplitude, the next sine curve is being accessed from the ROM. As the title of John Snell's article indicates, in theory 256 sine curves could be generated simultaneously. You could then separate them into groups of 16 (or so) so that the sine curves in each group represented harmonics of a particular tone, and you could generate a complex waveform in a process called additive synthesis. If you were generating 256 sine curves, you could then generate 16 synthesizer voices simultaneously. Or you could modulate the frequency of one sine curve by another sine curve, the frequency-modulation technique described in Chowning's article.
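(Again as a present-day sketch: the two uses just mentioned, summing harmonics into one voice and modulating one sine by another, look like this in software. The 1/k harmonic amplitudes and the modulation index are arbitrary illustration values.)

```python
import math

SAMPLE_RATE = 48_000

def additive_voice(f0, n_samples, n_harmonics=16):
    """One voice built by summing 16 sine curves at multiples of f0 (1/k amplitudes)."""
    return [sum(math.sin(2 * math.pi * f0 * k * n / SAMPLE_RATE) / k
                for k in range(1, n_harmonics + 1))
            for n in range(n_samples)]

def fm_voice(carrier_hz, modulator_hz, index, n_samples):
    """Modulate one sine's phase with another sine (simple two-operator FM)."""
    return [math.sin(2 * math.pi * carrier_hz * n / SAMPLE_RATE
                     + index * math.sin(2 * math.pi * modulator_hz * n / SAMPLE_RATE))
            for n in range(n_samples)]

print([round(s, 2) for s in additive_voice(220, 4)])
print([round(s, 2) for s in fm_voice(220, 110, 2.0, 4)])
```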
666
BAD
After a decade South Dakota's Amish are moving on (mitchellrepublic.com) TRIPP, S.D. About two miles west of Tripp, past a yellow warning sign with a horse and buggy, and down a dirt road muddied from snow melt, sits a set of red barns and white homes, all with green roofs. The structures dotting the rolling landscape house South Dakota's lone Amish community, a nine-family, 60-person settlement that started in 2010, widely believed to be the religious group's first venture into South Dakota. But come this summer they'll be gone; some of their homes are listed on Zillow, and an auction is scheduled for April 28. "We wanted there to be an Amish community here, but seems like everybody Amish is more from Ohio or Pennsylvania, where there are more trees," Rudy Borntreger, the community's bishop, or elder, explained. "I think it's so open nobody wants to join us. Now more people decided to move back to Iowa and Minnesota, so kind of for unity's sake." Though their time in the state will be cut short, and an aversion to technology, deep focus on family and generally reclusive nature limited their socializing potential, they left a lasting impression on the Tripp area and beyond, community members say. "We love 'em here," said Marion Ymker, the owner and manager of Ymker Greenhouse and Landscaping in Armour, where some of the Amish have worked for about a decade. "We're disappointed they're moving." That feeling is mutual. "Good country. Good area. Good friends," Borntreger said, speaking in a tone of finality on his time in South Dakota, where he's spent around half of his adult life. "Lot of things change in 13 years. Most businesses in Tripp all changed hands. Old friends passed on." The Amish are part of the Anabaptist Christian movement, closely related to the Mennonites and more distantly connected to the Hutterites. They first arrived in the United States in the 1720s, initially landing in Lancaster, Pennsylvania, which remains the largest single community of Amish in the nation, numbering around 30,000. Most of the Tripp Amish come from Tomah, Wisconsin, a settlement formed in 1969 that numbers more than 8,000. Borntreger said his family and some others plan to go back to a different region in Wisconsin. Faith sits at the center of their lives: Borntreger reads the Bible daily and attends church every other week. The children attend an Amish parochial school on the farm. Likely the most well-known characteristic of the Amish is an eschewing of modern conveniences. However, there is a somewhat wide range of technology usage among Amish communities, and most of those decisions come down to the discretion of the leadership of individual church districts. Borntreger described himself as a more conservative bishop. His family's large white home has no electricity, though they do sometimes use propane lamps. The community also shares a pay phone. For shorter-distance communication, a large bell sits in front of the Borntreger home; as the reporter arrived on the property for an interview for this story, his wife, donning a white bonnet and blue dress, shook the instrument to hail him from a distant barn. In the chilly March air, Borntreger wore a black hat low over his forehead. Opposite the round brim, jointly framing his square jaw and forehead, is a dense, curly black beard. His black coat and dark blue pants are handmade by his wife from spools of thick denim. Completing his stringy 5-foot-10 frame is the only purchased portion of his outfit: grime-stained brown boots nearly up to the knee.
Next to him is his youngest child, who carries a bright yellow, orange and green turtle toy, a pop of color in the otherwise drab landscape. The humble lifestyle, from clothes to horse-and-buggy transportation, is about keeping a focus on God and family, explained Erik Wesner, who publishes Amish America, an Amish news website. "They adopt certain technologies, but the way they approach technology is really trying to be thoughtful about how it's used," said Wesner, who became acquainted with the Amish while selling educational materials. "What are the effects of that technology, whether they're intended or unintended? What are the potential negative effects of that technology? Does the benefit that this tech brings us outweigh the negative side?" Wesner used the example of a car to illustrate the point. While ownership of a personal vehicle does offer ease of transportation, and the Tripp Amish community has a slate of drivers who often bring them back to Wisconsin for familial engagements or around the state, it also has the potential to fragment and disperse the family. Borntreger shared some of these views, tying the root cause of many social ills to a breakdown in family structure. "It's important to have parents that are willing to work together to raise their children," Borntreger, a father of 14, said. "If we look at overall situations, I think some are neglected; they have questions and their parents don't have answers, so the children may look elsewhere." While discussing family values, he mentioned a fondness for Gov. Kristi Noem, whose speeches he sometimes reads in local weeklies. However, the Amish do not vote. "We leave that to the rest," he said. The Tripp Amish uprooted from their home in Wisconsin partially for "elbow room," which also served as the headline for a 2010 article in the Yankton Daily Press and Dakotan announcing their arrival. South Dakota as a landing spot was a budgetary decision. "There's a corn belt between here and there that's more higher-priced ground," Borntreger explained. "But it's good ground; we like it." It wasn't always easy: he recalled extreme drought in 2012 and the 2022 derecho, which took down some of their buildings. But that didn't factor into the choice to relocate; instead, the problem was an inability to attract and retain population. A set of six families, referred to as the "Founding Six" by Jim Mize, who sometimes serves as a driver on trips to Wisconsin, rolled in during the first two years. Of that group, only one, Rudy Borntreger's family, remains. A total of around two dozen families lived in the community throughout the years, though the settlement never numbered more than 90 people. While Borntreger chalked up the churn to familial ties being elsewhere, Mize surmised that the inner workings of the group were not always the best. "They won't tell you specifically why, but you can read between the lines; they made a couple of comments that Rudy was hard to get along with," he said. "In Amish practice, the bishop controls where they can work, how much they can work, the type of technology." Leaving that aside, the impression the Tripp Amish left on local businesses was overwhelmingly positive. At Ymker Greenhouse in Armour, where mainly younger Amish work a few days per week, they showed exceptional skills in repairing buildings or working in the greenhouse. "When it comes to craftsmanship, you won't find better people to have," said Marion Ymker, who owns the shop. "You don't have to worry about foul language. You don't need to worry about back talk or anything like that."
Matt Mehlhaf, the owner of the sale barn in Menno frequented by the Tripp Amish, had similar comments. "They're good people as far as I'm concerned. And they're good customers too," he said. "They're willing to work and work hard. And that's what it takes to raise livestock." In the end, those takeaways are essentially all Borntreger would like to leave behind. "When we first moved here, people probably figured we were a little different. And I guess we are different, but we're just trying to be friendly people, make an honest living, raise our families," Borntreger said. "That's what our mission is, I guess. Serve God and don't forget to pray." Jason Harward is a Report for America corps reporter who writes about state politics in South Dakota. Contact him at 605-301-0496 or jharward@forumcomm.com.
682
BAD
After years of setbacks NASA's SLS moon rocket is ready to fly (washingtonpost.com) CAPE CANAVERAL, Fla. The rocket was late again. The initial launch date, the end of 2016, was long gone. And in the spring of 2019, Jim Bridenstine, the NASA administrator at the time, was told it'd be another year or more before NASA's Space Launch System would be ready. He was furious and threatened to replace the rocket with one built by the fast-growing private space sector, such as SpaceX. But Bridenstine's attempt to bench NASA's rocket was quickly rebuffed by the powerful interests, including Sen. Richard C. Shelby (R-Ala.), the chairman of the appropriations committee. Those interests had shepherded the SLS through thickets of controversy since its inception more than a decade ago. Now, after years of cost overruns and delays, damning reports by government watchdogs, and criticisms from space enthusiasts and even parts of NASA's own leadership, the SLS endures as only a rocket built by Congress could. Today it stands on Launch Pad 39B at the Kennedy Space Center, towering at 322 feet, taller than the Statue of Liberty. NASA is scheduled to make its first attempt to launch Monday at 8:33 a.m., a test flight that is meant to propel the Orion crew capsule, without any astronauts on board, into orbit around the moon. A successful launch will mark a major milestone in NASA's quest to return astronauts to the lunar surface under its Artemis program. NASA officials have stressed that this is a test, a mission known as Artemis I, designed to see how the vehicle performs before they load astronauts onboard. That could happen as soon as 2024, when astronauts would orbit but not land on the moon. A landing could come in 2025 or 2026. Some 100,000 people are expected to jam the Florida Space Coast for the launch, excited to watch NASA write a new chapter in the history of human space exploration. But even if the flight goes off as scheduled, NASA officials warned that there could be surprises that force them to go off-script. "In all of our excitement, I want to remind people this is a test flight," NASA Administrator Bill Nelson said in an interview. "We're going to stress this thing in a way that we would never do with humans on board. And so I just want to bring everybody back to reality." Not only has the rocket never flown, but NASA has struggled with all sorts of challenges in the tests designed to prepare the rocket for flight. Sensors detected a problem with the hydraulic systems that help steer the rocket, cutting short an engine test last year. Earlier this year, a fueling test and a simulated countdown were marred by a hydrogen leak and a faulty valve, among other issues. Then again, there is nothing simple about the SLS, a huge, complicated beast that holds 700,000 gallons of supercooled liquid hydrogen and liquid oxygen. It has four first-stage engines and two solid-fuel side boosters. The avionics computers in the rocket have miles upon miles of cabling and more than 500 sensors. At liftoff it weighs 5.75 million pounds. The Artemis I mission is scheduled to last 42 days, 3 hours, 20 minutes, sending the Orion spacecraft on a round-trip mission that would reach 40,000 miles beyond the moon and travel a total of 1.3 million miles. But in a way, the odyssey to get to this point has been even more arduous, an at-times painful path that shows how Washington works and, ironically, why NASA has been unable to return to the moon since the last of the Apollo missions 50 years ago.
Standing atop its launchpad, the SLS is a glorious sight but also a contradiction. More powerful than the Saturn V that launched the Apollo astronauts to the moon, the SLS is a symbol of engineering prowess and American might that evokes 1960s-era exploration nostalgia. But costing more than $23 billion, it also is a monument to parochial congressional interests, stultifying bureaucracy and contractor mismanagement. And as the commercial sector continues to develop new rockets, the future of the SLS is unclear. As Casey Dreier, chief advocate and senior space policy adviser at the Planetary Society, recently asked in an essay: "Given its cost, the existing launch capabilities provided by private companies such as SpaceX, Blue Origin and RocketLab, and the real progress of super heavy-lift private rockets, why does the SLS still exist?"
[Graphic: Artemis 1: New booster with a big mission. NASA's new rocket will use "legacy hardware," meaning largely left over from the space shuttle program, to launch the uncrewed Orion spacecraft around the moon and back. Along the way, Orion will also launch 10 shoebox-sized cubesats (miniature satellites) to study the moon and the environment around it. The 322-foot SLS Block 1 comprises the Orion spacecraft upper stage (escape tower, protective launch cover, crew module, service module with four solar panels, spacecraft adapter), the Orion stage adapter carrying the 10 cubesats, the interim cryogenic propulsion stage with one RL10 rocket engine, two solid rocket boosters, and the core stage (liquid oxygen and hydrogen fuel) with four RS-25 engines repurposed from the space shuttle. Orion will enter a long, looping, distant retrograde orbit around the moon that will take it farther from the Earth than any crew-capable space vehicle has ever gone. There's no crew, but sensor-packed mannequins will send back details on how the flight would go if there were one. The total mission is scheduled to take about 42 days. Note: Orbit diagram is not to scale. Source: NASA. WILLIAM NEFF/THE WASHINGTON POST]
The SLS was born in 2010, after the Obama administration canceled the Constellation program, which would have built Orion and a pair of rockets, one to fly astronauts to the International Space Station, another to return to the moon. With the space shuttle set to be retired in 2011, after a second fatal accident back in 2003, Constellation was to be NASA's next big human-spaceflight program. And since it maintained much of the workforce, particularly in Texas and Florida, that supported the space shuttle, it had key congressional support. But after years of cost overruns and delays, the Obama White House determined that Constellation was on an unsustainable trajectory, perpetuating the perilous practices of pursuing goals that do not match allocated resources. So it moved to kill the program. Congress, however, had other ideas. It passed the NASA Authorization Act of 2010, which reinstated the Orion crew capsule and kept the heavy launch vehicle, renaming it the Space Launch System. The law directed NASA to build the rocket using space shuttle and Constellation contracts, so that even the engines used on the 1970s-designed space shuttle would power NASA's new SLS. "It is not too much of a simplification to say that the SLS is the modern implementation of the space shuttle workforce," Dreier wrote. For all its power, the SLS is also notable for what it does not do. As companies like SpaceX, Blue Origin and RocketLab are building rockets that are reusable, returning to Earth so they can be flown over and over, each SLS booster will fly only once, falling into the ocean after liftoff, never to be used again. The RS-25 engines on the rocket were repurposed from the space shuttle. Combined, they flew in more than 20 shuttle missions, including one from 1998. They were designed to be reused again and again. But on the SLS they'll be discarded for good. Asked about the long-term viability of the SLS, Nelson said that SpaceX's Starship and other heavy-lift rockets, such as Blue Origin's New Glenn, are still in development and not yet ready to fly, though Starship appears to be getting close. "What we know is that SLS is the only human-rated rocket that can go into space now," Nelson said. "And it will go farther, deeper into space than anywhere we have ventured with humans before."
Despite working with legacy hardware, NASA and Boeing, the prime contractor on the rocket's main stage, suffered through all sorts of setbacks and delays that have been chronicled in a series of critical reports issued by the Government Accountability Office and the NASA inspector general. In 2019, a report by the GAO, for example, found that NASA continued to pay tens of millions of dollars in award fees to Boeing for scoring high on performance evaluations even as the cost of the rocket was climbing and delays mounted. After issuing one award fee to Boeing, a NASA official even noted that the significant schedule delays on the contract had caused NASA to restructure the flight manifest for SLS. Earlier this year, Paul Martin, NASA's inspector general, told Congress that his office had calculated the cost for the first three flights of the SLS to be $4.1 billion each, a price tag he said was unsustainable. NASA and Boeing pushed back on that analysis, saying it included all sorts of unrelated costs, and Dreier calculated that the per-launch cost would be between $876 million and $2 billion, depending on how one accounts for its related overhead costs. All of those estimates are huge numbers that come as the cost of launch is going down, not up, as SpaceX and other companies compete for launch contracts in a robust commercial marketplace. As Martin said, relying on such an expensive single-use rocket system "will, in our judgment, inhibit if not derail NASA's ability to sustain its long-term human exploration goals to the moon and Mars." All of which has led to a chorus of protest from the rocket's critics, who deride it as the "Senate Launch System," saying it does more to create jobs in key congressional districts than open new frontiers of exploration. That's in large part because construction of the rocket and the Orion spacecraft is spread out so that every state has jobs connected to the program. In all, the SLS supports about 25,000 jobs nationwide, with a total economic impact of $4.7 billion, according to NASA. In addition to primary contractor Boeing, key contractors are some of the most powerful and influential in Washington: Aerojet Rocketdyne, Northrop Grumman and the United Launch Alliance, a joint venture of Lockheed Martin and Boeing. In his analysis, Dreier noted that every single year since the program began in 2012, the SLS has received additional funding from Congress above what NASA had requested; in all, the SLS has received an additional $335 million, or 22 percent above NASA's requests, on average per year. "Despite cost overruns and ongoing delays, there has never been a serious political threat to the SLS from Congress or the White House," Dreier noted. "This is independent of whether Democrats or Republicans are in control." It also has helped that the SLS program is stationed at the Marshall Space Flight Center in Huntsville, Ala., the home state of Shelby, the longtime chairman of the appropriations committee. The program has created about 13,000 jobs and pumped $2.4 billion into the state's economy, and he has vigorously defended the program, especially in 2019, when Bridenstine floated the idea of sidelining the rocket. At the time, Shelby released a statement saying: "While I agree that the delay in the SLS launch schedule is unacceptable, I firmly believe that SLS should launch the Orion." Privately, his aides angrily chastised NASA officials.
The next day, Bridenstine reiterated his support for the SLS program in a blog post, saying the agency is committed to building and flying SLS, while NASA officials realized the technical challenges of switching rockets midstream. Since then, however, Shelby announced he would retire at the end of this year, leaving the SLS without one of its most ardent supporters. Under the Trump administration, the Artemis program was given high priority, especially by Vice President Mike Pence, a space enthusiast who pushed NASA to move with a sense of urgency. President Biden's administration also has embraced Artemis, meaning the program is the first human deep-space effort to survive subsequent administrations since Apollo. Despite the setbacks that plagued the SLS, the program finally gained momentum, recently completing a series of tests leading to launch, including a successful hot fire test, firing its core stage engines for more than eight minutes last year. But nothing has come easy. This spring, a few wet dress rehearsal tests, to fully fuel the rocket and run a simulated countdown, were cut short because of various problems, including a faulty valve in the rocket's second stage; problems with temperature readings of the propellant; and the malfunctioning of fans used to pressurize the mobile launch tower. NASA had to roll the rocket back to its assembly building at the Kennedy Space Center to make repairs before rolling it back out for another countdown attempt, which also was cut short, this time because of a hydrogen leak. NASA officials said they got enough data to proceed with a launch attempt, and in recent days they have said the rocket is ready. NASA officials recently held a flight readiness review, a days-long meeting where they discuss every aspect of the vehicle and the mission. It went well, officials said, with no major issues or dissent emerging. "We did talk to the launch team," said Jim Free, NASA's associate administrator for the exploration systems development mission directorate. "We talked to the flight team. We talked to the recovery team, and then the management team, and everybody said they're ready to go." Still, officials said that the first flight of a rocket is fraught with challenges and unknowns, and that things could go wrong. The flight will push the envelope, since the Orion spacecraft was designed to fly humans for a total of 21 days. On this uncrewed flight, it is scheduled to fly for 42 days, which will stress some of its systems, NASA officials said. "This is a test flight, all right. And it's not without risk," said Bob Cabana, NASA's associate administrator. "We have analyzed the risk as best we can, and we've mitigated also as best we can. But we are stressing Orion beyond what it was actually designed for, in preparation for sending it to the moon with a crew, and we want to make sure that it works absolutely perfectly when we do that." He said that some challenges could emerge "that can cause us to come home early, and that's okay. We have contingencies in place." NASA rolled the rocket to the pad earlier this month, preparing for launch. It can be seen for miles, a stunning sight on the Space Coast skyline that has generated renewed enthusiasm for America's space program here. "We are going" has become the NASA motto for the mission, a slogan ready-made for social media and marketing banners.
A few hundred miles away, along the Gulf Coast in Texas, another rocket has been mounted on a launchpad: SpaceX's Starship booster, which is preparing to go through a series of tests before its own launch attempt, which could come within the next year. SpaceX, it turns out, is going too.
688
BAD
Al Jaffee, king of the Mad Magazine fold-in, has died (nytimes.com)
752
BAD
Alain de Botton on existential maturity and what emotional intelligence means (themarginalian.org) "Maturity is the ability to live fully and equally in multiple contexts," poet and philosopher David Whyte wrote in one of his most beautiful meditations. A generation before him, Anaïs Nin took up the subject in her diary, which is itself a work of philosophy: "If you intensify and complete your subjective emotions, visions, you see their relation to others' emotions. It is not a question of choosing between them, one at the cost of another, but a matter of completion, of inclusion, an encompassing, unifying and integrating which makes maturity." And yet emotional maturity is not something that happens unto us as a passive function of time. It is, as Toni Morrison well knew, a difficult beauty, an intensely hard won glory, the product of intentional character-sculpting, the slow and systematic chiseling away of our childish impulses for tantrums, for sulking, for instant self-gratification without regard for others, for weaponizing our feelings of shame, frustration and loneliness. Like happiness, another life-skill we have miscategorized as a passive abstraction, it requires early education, consistent relearning and unrelenting practice. That is what Alain de Botton, one of our era's most uncommonly perceptive, lyrical and lucid existential contemplatives, offers in The School of Life: An Emotional Education (public library), the book companion to his wonderful global academy for self-refinement, a decade in the making. De Botton considers the type of learning with which the road to emotional maturity is paved: The knack of our species lies in our capacity to transmit our accumulated knowledge down the generations. The slowest among us can, in a few hours, pick up ideas that it took a few rare geniuses a lifetime to acquire. Yet what is distinctive is just how selective we are about the topics we deem it possible to educate ourselves in.
Our energies are overwhelmingly directed toward material, scientific and technical subjects and away from psychological and emotional ones. Much anxiety surrounds the question of how good the next generation will be at math; very little around their abilities at marriage or kindness. We devote inordinate hours to learning about tectonic plates and cloud formations, and relatively few fathoming shame and rage. The assumption is that emotional insight might be either unnecessary or in essence unteachable, lying beyond reason or method, an unreproducible phenomenon best abandoned to individual instinct and intuition. We are left to find our own path around our unfeasibly complicated minds, a move as striking (and as wise) as suggesting that each generation should rediscover the laws of physics by themselves. This irrational orientation to our emotional lives, De Botton argues, is our inheritance from the Romantics, who crowned the untrained intuition the supreme governing body of human conduct. (And yet the Romantics contained multitudes: for all their belief in the unalterable givenness of emotional reality and the fidelity of feeling, they had a glimmering recognition that reason must be consciously applied to reining in the wildness of the emotions. Mary Shelley, offspring of the greatest power couple of political philosophy, placed at the heart of Frankenstein, one of the most prescient and psychologically insightful works of literature ever composed, triply so for being the work of an eighteen-year-old girl, an admonition against the unbridled reign of the ego's emotional cravings, unchecked by reason and forethought of consequence.) Exception aside, De Botton's broader point is excellent: The results of a Romantic philosophy are everywhere to see: exponential progress in the material and technological fields combined with perplexing stasis in the psychological one. We are as clever with our machines and technologies as we are simple-minded in the management of our emotions. We are, in terms of wisdom, little more advanced than the ancient Sumerians or the Picts. We have the technology of an advanced civilization balancing precariously on an emotional base that has not developed much since we dwelt in caves. We have the appetites and destructive furies of primitive primates who have come into possession of thermonuclear warheads. In 1983, the psychologist Howard Gardner devised his seminal theory of multiple intelligences, expanding our narrow cultural definition of intelligence as verbal and mathematical skill to include seven other modes of intellectual ability. A decade later, Daniel Goleman added a tenth form of intelligence, emotional intelligence, which quickly permeated the fabric of popular culture as hordes of humans felt suddenly recognized in an endowment long neglected as a valuable, or even extant, faculty of consciousness. Building on that legacy, De Botton brings his own sensitive perspicacity to a richer, more dimensional definition: The emotionally intelligent person knows that love is a skill, not a feeling, and will require trust, vulnerability, generosity, humor, sexual understanding and selective resignation. The emotionally intelligent person awards themselves the time to determine what gives their working life meaning and has the confidence and tenacity to try to find an accommodation between their inner priorities and the demands of the world. The emotionally intelligent person knows how to hope and be grateful, while remaining steadfast before the essentially tragic structure of existence.
The emotionally intelligent person knows that they will only ever be mentally healthy in a few areas and at certain moments, but is committed to fathoming their inadequacies and warning others of them in good time, with apology and charm. There are few catastrophes, in our own lives or in those of nations, that do not ultimately have their origins in emotional ignorance. De Botton is careful to acknowledge that this line of inquiry might trigger the modern intellectual allergy to the genre of learning dismissively labeled "self-help." And yet he reminds us that the quest for self-refinement has always accompanied the human experience and animated each civilization's most respected intellects: it is there at the heart of the Stoics, in the essays of Montaigne, at the center of Zen Buddhism, and in the literary artistry of Proust (whom De Botton has especially embraced as a fount of existential consolation). He aims a spear of simple logic at the irrational and rather hubristic disdain for self-help: To dismiss the idea that underpins self-help, that one might at points stand in urgent need of solace and emotional education, seems an austerely perverse prejudice. Our cultural failure at making emotional intelligence an educable thing, De Botton argues, stems from two flawed baseline assumptions of our education system itself: its focus on what people are taught over how they are taught, and its tendency to mistake information for wisdom. (Adrienne Rich shone a sidewise gleam on these flaws and their remedy in her superb 1977 convocation address about why an education is something you claim, not something you get.) De Botton envisions the emotionally enlightened alternative: An emotional education may require us to adopt two different starting points. For a start, how we are taught may matter inordinately, because we have ingrained tendencies to shut our ears to all the major truths about our deeper selves. Our settled impulse is to blame anyone who lays our blind spots and insufficiencies bare, unless our defenses have first been adroitly and seductively appeased. In the face of critically important insights, we get distracted, proud or fidgety. We may prefer to do almost anything other than take in information that could save us. Moreover, we forget almost everything. Our memories are sieves, not robust buckets. What seemed a convincing call to action at 8 a.m. will be nothing more than a dim recollection by midday and an indecipherable contrail in our cloudy minds by evening. Our enthusiasms and resolutions can be counted upon to fade like the stars at dawn. Nothing much sticks. It was the philosophers of ancient Greece who first identified these problems and described the structural deficiencies of our minds with a special term. They proposed that we suffer from akrasia, commonly translated as "weakness of will," a habit of not listening to what we accept should be heard and a failure to act upon what we know is right. It is because of akrasia that crucial information is frequently lodged in our minds without being active in them, and it is because of akrasia that we often both understand what we should do and resolutely omit to do it. How to overcome akrasia and live with life-enlarging emotional intelligence, by absorbing the beauty and wisdom encoded in literature and art, by harnessing the power of ritual, by undertaking the difficult, immensely rewarding and redemptive work of self-knowledge, is what De Botton offers in the remainder of the thoroughly helpful The School of Life: An Emotional Education.
Complement this small prefatory excerpt with philosopher Martha Nussbaum on the intelligence of emotions, then revisit De Botton on what makes a good communicator, the psychological paradox of sulking, and his lovely letter to children about why we read. Published November 25, 2019. https://www.themarginalian.org/2019/11/25/the-school-of-life-book/
753
BAD
Alexander the Great in Firdawsi's Book of Kings (blogs.bl.uk) 24 October 2022 The legendary life of Alexander the Great is the subject of the British Library's new exhibition Alexander the Great: The Making of a Myth, which opened on Friday 21 October. A visual feast of stories spanning more than 2,000 years, it centres on the Alexander Romance, originally composed in Greek around the third century AD, and shares narratives from East and West side by side in more than twenty languages. One of the most richly illustrated sources is the Persian Shahnamah (Book of Kings), completed by the poet Firdawsi in 1010 AD. There are no fewer than fourteen copies of this national epic in the exhibition, ranging from the beginning of the fourteenth to the mid-nineteenth centuries. A selection of these is shown below. While the details sometimes differ from the Alexander Romance, the Shahnamah tells the same stories in a common context. The history of Alexander in the Shahnamah begins with a peace treaty between King Darab of Persia and Filqus (Philip) of Greece, in which Filqus's daughter Nahid is married to the Persian king. Though she outshone all others in her beauty, she proved to suffer from bad breath, and the marriage broke down irretrievably. Nahid was sent home to her father, rejected but unknowingly pregnant with Alexander (Iskandar or Sikandar in Persian), who was subsequently brought up as Philip's son and heir. Meanwhile, King Darab took another wife, who gave birth to Dara (Darius), who would succeed his father before being ultimately defeated in battle by his half-brother Alexander. The death of Dara, one of the most frequently illustrated subjects in the Shahnamah, in a hand-coloured lithograph edition published in Bombay in 1849. British Library 14807.h.4 Three decisive battles took place, in the last of which, at Gaugamela in northern Iraq in 331 BC, the Persian army was irrevocably defeated. Dara escaped but was mortally wounded by two of his own men. Iskandar, who had wanted Dara alive, was dismayed when he found him. Cradling Dara's head on his knees, he promised to fulfil Dara's dying wishes: to look after his family, to marry his daughter Roshanak (Roxana), and to safeguard the Zoroastrian religion. Indeed, Iskandar married Roshanak with much pomp and ceremony and then moved on to India, where he conquered King Kayd of Hind by peaceful means. As part of their agreement, he received four gifts: King Kayd's daughter in marriage, his all-knowing seer to advise him, his physician who could cure any disease, and his never-emptying goblet. Iskandar marries the daughter of King Kayd of Hind (India). Sultanate India, 1438. British Library Or.1403, f. 318r Moving on, Iskandar challenged another Indian ruler, King Fur (Porus). On this occasion a fierce battle took place. Iskandar had been forewarned about Fur's invincible army of elephants and, to counter them, recruited more than 1,200 blacksmiths, who forged 1,000 iron horses and riders on wheels. These were filled with oil and set alight at the head of the advancing army. The whole army was put to rout, leaving Iskandar to kill Fur in single combat. The battle between Iskandar and Fur. Artist: Kamal. Mughal India, about 1616. British Library Add MS 5600, f. 361v From India, Iskandar travelled in search of riches and new experiences. He went to Mecca, Egypt and Andalus, in this context most likely a city in western Asia representing the West, where he encountered Queen Qaydafah (Candace in the Greek Alexander Romance).
Iskandar approached her court disguised as a messenger, but she already had a portrait of him and so immediately recognised him. His deception exposed, Iskandar feared for his life, but instead he was admonished and sent safely on his way. A similarly peaceful encounter took place with the Amazons, the virgin warriors of Harum, located in the Caucasus. Iskandar's peaceful visit to the woman-only city of Harum. Iran, 1536. British Library Add MS 15531, f. 345r Travelling further, Iskandar encountered the philosophical Brahmans, people with heads on their chests and without bones, dragons, and all manner of mythical creatures. He fought battles in China and against the Russians, and constructed a wall to contain the barbarous peoples of Gog and Magog. Iskandar kills a dragon by feeding it cow-hides stuffed with poison and oil. Isfahan, 1614. British Library Add MS 16761, f. 190v Increasingly, however, Iskandar became preoccupied with his own mortality. Would he ever see his native land again? When would he die? Seeking but never finding the waters of everlasting life, he met the angel of death, Israfil, who told him his time would come. Then, at the edge of the world, he came to the talking tree, which had two trunks, one male and one female. At midday the male trunk spoke, foretelling the end of his fourteen-year rule, and at nightfall its female counterpart announced: "Death will come soon." Iskandar and the talking tree. Shiraz, c. 1420-25. The Bodleian Libraries, University of Oxford, MS Ouseley Add. 176, f. 311v Now at the end of his journey, Iskandar returned to Babylon, where he was greeted with more omens of death: a stillborn child with a lion's head, a human chest and shoulders, and a cow's tail and hooves. Iskandar sees an omen of his imminent death in Babylon. Iran, c. 1300. Chester Beatty, Dublin, Per 104.49. His fate sealed, Iskandar fell ill that very day. He ordered that he should be carried outside, and in full view of his soldiers he advised them to live humbly and follow his example. As depicted below, a physician takes his pulse while another is making notes. In the background, courtiers and soldiers wipe away their tears. The moment of Iskandar's death. Qazvin, 1585-6. British Library Add. MS 27302, f. 414r Alexander the Great: The Making of a Myth runs until 19 February 2023. It is accompanied by a book of the same title. Edited by Richard Stoneman, it includes nine essays by leading scholars together with images and descriptions of the exhibition items. Tickets are on sale and may be booked on our Events page, and more information can be found on our dedicated exhibition website. We are indebted to the Kusuma Trust, the Patricia G. and Jonathan S. England British Library Innovation Fund, and Ubisoft for their support towards the exhibition, as well as other trusts and private donors. Ursula Sims-Williams, Lead Curator Persian, British Library Further Reading Firdawsi, Shahnamah, trans. Dick Davis, Shahnameh: The Persian Book of Kings, rev. edn. New York: Penguin, 2016. The Shahnama of Firdausi, trans. Arthur George and Edmond Warner, vol. 6. London: Routledge, 1912. Manteghi, Haila, Alexander the Great in the Persian Tradition: History, Myth and Legend in Medieval Iran. London and New York: I. B. Tauris, 2018. Posted by Ursula Sims-Williams at 8:49 AM All text is © British Library Board and is available under a CC-BY Licence except where otherwise stated
764
BAD
Alien was released 44 years ago today https://twitter.com/ATRightMovies/status/1661664750273527810
null
BAD
All companies are fucked up (jonpauluritis.com) Belts in karate are ordered from white to black. This is, of course, a new invention. Originally, new colored belts were not given upon reaching arbitrary levels of attainment; a white belt was given at the beginning of one's training, and you just never washed it. Over the years the belt would gradually turn from white to yellow to orange to red. And if you trained hard enough, eventually you'd go black. All companies are fucked up. There, I said it. You're welcome. One restaurant I worked for had a tremendously talented chef who was a straight-up alcoholic. Not the kind of alcoholic where the guy just gets plastered every weekend and is constantly nursing a hangover... the kind where he needed 2 shots and a beer before the dinner rush or he would cut off his fingers because of the shakes. I'm sure you're asking: how did he manage to avoid getting fired if he was plastered every shift? Turns out the restaurant was a great place for him: we were basically allowed to drink on the job for free, as much as we wanted. Hell, it was debatably our competitive advantage, because everyone in the restaurant liked to drink... and we had this tremendously talented staff that treated the place like a party where you got paid... the owner made a lot more money for it too. Fucked up, right? ...well, all companies are fucked up. I worked an office gig one time where my boss just stopped telling us what to do. We were just supposed to know. For some people this probably sounds amazing, right? No corporate politics. No bullshit meetings. Just do whatever you feel like! It was great for half a heartbeat, but then it turned into Lord of the Flies... the inmates-running-the-asylum sort of thing. Of course, some of the people at work LOVED it. They showed up every day just ecstatic that they were getting paid to do nothing. Fucked up, right? ...well, all companies are fucked up! But this was surely less fucked up than the restaurant above? Nope, not for me. I'm a high performer. I like competing and winning. Doing nothing feels like getting my fingernails ripped off one by one. It's masturbating with sandpaper or getting impaled by a pineapple. But surely if it's YOUR company and you are calling the shots, it won't be fucked up, right? lol. All companies are fucked up... the trick is finding a company that's fucked up in a way that works for you. It's kind of like finding a spouse or where to live: don't live in California if earthquakes freak you out, and if you're a misanthrope, Manhattan is probably a bad place to live. If you're trying to figure out "how do I find the company that's fucked up in the way that works for me?" here's my advice: Do whatever you are into as fast and with as much focus as you can. Learn as much as you can about the stuff that you care about, and put yourself around people that are doing the same thing... or, put another way: Turn your belt black. Eventually you'll end up with people who are fucked up in the same ways that you are fucked up... and by the transitive property, you'll find yourself at a company that's fucked up in its own special way, just the way that you like. If you liked the article, you should check out my monthly newsletter, Cocktail Napkin Math. © 2020. All Rights Reserved.
790
BAD
Almost everything on Amazon is becoming an ad (washingtonpost.com) A previous version of this article incorrectly reported that the "Highly rated" label could be purchased. In fact, Amazon selects who is featured under that label. This article has been corrected. Amazon is the first app many of us think about to buy things online. But is it actually a good place to go shopping? When you search for a product on Amazon, you may not realize that most of what you see at first is advertising. Amazon is betraying your trust in its results to make an extra buck. Let me show you. We'll search Amazon for "cat beds." Here are the results. Next we'll put an orange highlight on all of the ads. Believe it or not, that's pretty much everything you see. There's an ad for a brand called Bodiseint at the top. Underneath are three results that paid their way to the top of the "cat beds" listing. They're not even very relevant: On the left is a product featuring a photo of a dog, yes, a dog, for one of Amazon's own brands. On the right is a luxury cat condo that costs $389. Scrolling to the second screen, we finally start to see non-ads. These are the first products that were actually chosen because they've got the best combination of price and quality. But the real results don't last long. Scroll to the next screen and it's all ads again. Here's a set of listings labeled "Highly rated," but don't be fooled: These aren't the highest-rated cat beds on Amazon. These are also just ads. Scroll again, and this screen has even more ads. These three, under the heading "Top rated from our brands," are all for Amazon's own products. (Wait, why is there another dog photo?) Keep on scrolling and the ads keep coming, even if they're repeats. On these first five screens, more than 50 percent of the space was dedicated to ads and Amazon touting its own products. This isn't just a cat problem: The first page of Amazon results includes an average of about nine sponsored listings, according to a study of 70 search terms conducted in 2020 and 2021 by data firm Profitero. That was twice as many ads as Walmart displayed and four times as many as Target. Amazon might feel unbeatable for service, fast shipping and easy returns. But as a place to find products, it's becoming a tacky strip mall filled with neon signs pointing you in all the wrong directions. Amazon founder Jeff Bezos owns The Washington Post, but I review all tech with the same critical eye. One of the great promises of the internet is that we can get access to more information and make better choices. Amazon has pioneered a kind of online advertising business that feeds us sponsored information that can cloud our choices. We the users want honest online shopping experiences and some common-sense limits on ads that are designed to deceive. I call it the "shill results" business. Even when they contain a tiny disclaimer label, as do Amazon's, these kinds of ads can be misleading because they fill up spaces people have every reason to expect to contain trustworthy, independent information. What's worse, many other apps and online marketplaces are following Amazon's lead. Shill results now crowd Apple and Google's smartphone app stores: search for an app used for couples therapy and you'll get an ad for a dating app. (Seriously.) Food-delivery apps shill eggnog and whipped cream when you're just looking for milk. As I've written before, Google search has a big shill results problem too, though it's as much about ads as pointing you back to Google's own products. Amazon has turned shill results into its next big thing.
After selling $31 billion in ads last year, Amazon became the third-largest online ad company in the United States, trailing only Google and Facebook. Some brands and sellers love Amazon ads because they show up right at the moment you're making a purchase, though others tell me ads have become an extra Amazon tax they have to pass on to customers. Amazon insists they're actually a good thing for us. "We are dedicated to providing customers with a world-class shopping experience, including working hard every day to ensure the ads they see are useful, informative and help make shopping a little bit easier," said spokesman Patrick Graham. But in my experience, Amazon's ads are often not useful, not informative, and can make shopping a little bit harder. If you are searching for a cat bed, you have an expectation that Amazon will show you the cat beds that are most useful for you. Not $389 cat beds. Not the pet bed Amazon makes the most money from. Not a weird knockoff. Let me be clear: Advertising isn't necessarily bad. When it's done well, ads can inform us about new products and help new businesses get a foot in the door. It pays the bills for much of the internet, including this news website. "Right now, consumers are tolerating ads pretty well overall on Amazon, despite the number of them," said Andrew Lipsman, a principal analyst at market research firm Insider Intelligence. But he warns there could be a tipping point: "There is a very clear tension between advertising and customer experience." Amazon told me an internal study found 89 percent of U.S. customers are pleased with results pages. I would like to invite them to run the survey again after showing customers their results with all my orange highlights on the ads. The Amazon we experience today is pretty much the opposite of how Amazon used to work. Even as recently as 2015, Amazon's results pages were filled with actual results, ranked by relevancy to your search. I found an archive of pages and marked one up compared with the same search today. What happened? Back in the 2000s, when we started learning to buy all kinds of things online, Amazon was investing heavily in a new kind of shopping science: personalization and recommendations. Amazon's mission was to marry up everything it knew about its products with everything it knew about you to help you make the best choices. "The store radically changes based on customer interests, showing programming titles to a software engineer and baby toys to a new mother," Amazon researchers wrote in an academic paper published in 2003. This is probably how most of us imagine Amazon still works. But today, advertisers are driving the experience. "Amazon's focus has shifted from trying to find ways to delight consumers with great recommendations, personalization and discovery to building better advertising technology," says industry analyst Juozas Kaziukenas of research firm Marketplace Pulse, who has written about how everything on Amazon is becoming an ad. Amazon also now uses search results to push its own in-house products. An investigation from The Markup exposed how Amazon results list its own brand and exclusive products ahead of others with higher ratings. Sure, Google and Facebook are chock full of ads, too. But on Amazon, we're supposed to be the customers, not the eyeballs for sale. We're paying Amazon to buy a product, not to mention probably also paying for a membership in its Prime two-day-shipping product.
The reason they get away with it is that busy shoppers can't easily detect how they're now promoting the products that are best for Amazon's bottom line. "Consumers can either ignore ads or assume that the advertised product is good enough," said Kaziukenas. "Part of the reason why those ads on Amazon and Google work so well is because it's near impossible for them to perfectly determine the best search results."

Amazon says it still requires ads to be relevant. "We know that advertiser and customer interests are inherently aligned," said Graham, the spokesman. "Advertising only works if we make it useful for Amazon customers, and when we create great customer experiences, we deliver better outcomes for brands."

But are Amazon's ads really always relevant? In some cases, I would argue they're actually deceptive. First, Amazon lets advertisers do what's called brand conquesting: Off-brands can pay to advertise under a major brand's name. When I search for a KitchenAid mixer, my first screen of results is brands called Kuppet and Kuccu. It does this even though we, its supposed customers, take time to type out the name of a brand. Amazon's Graham said: "This practice is good for customers; it drives discovery and presents them with more choices."

Second, Amazon search pages can contain blocks of information that aren't nearly as independent as they might sound. With headings such as "Highly rated," they sound like helpful call-outs of the best products, but they don't actually contain the highest-rated products Amazon sells. While Amazon makes the distinction that advertisers can't specifically pay for the "Highly rated" label, the section is often stacked with sponsored listings that don't have terrible customer reviews.

So how can we the users take back control? First, we can change our own behavior. I'm under no illusion that we're all going to stop shopping on Amazon; with its monopoly power, it's getting hard to go elsewhere. But now that I'm aware Amazon is playing games, I start my shopping on Google and trusted reviews sites, and then head over to Amazon only once I've identified what I want.

We can also learn the subtle ways Amazon hides what's really an ad, so you can make your own imaginary orange highlights. I annotated this rogues' gallery with some of its most common formats.

LEFT: Search results are no longer ordered by relevance or what Amazon thinks the best product is. When the placement is paid, it should have a "Sponsored" label like this. (Amazon/Washington Post illustration) RIGHT: Amazon places its own-brand products in search results and labels them not as "Sponsored" but rather "From our brands." (Amazon/Washington Post illustration)

LEFT: The "Highly rated" category in Amazon listings does not mean highest rated. It can be filled with ads, so look for the "Sponsored" label in smaller type underneath. (Amazon/Washington Post illustration) RIGHT: Ads for specific brands often sit on top of Amazon results, and they could be for a competitor to the brand you searched for. Look for the "Sponsored" label in the lower right. (Amazon/Washington Post illustration)

Look for the "Sponsored" label, but not always in the same place. Sometimes it's hidden in the lower-right corner; other times it's in tiny type above the product name. When the ad is for one of Amazon's own products, the listing might say just "Featured from our brands." (Amazon doesn't consider this an ad.) Also be on the lookout for those boxes of ads designed to look like independent information.
They come under many headings, including "Brands related to your search," "Highly rated," "Trending now" and "Customers frequently viewed." The company told me it is continually testing new groupings and iterating on titles. (What about the "Amazon's Choice" label? That's determined by an Amazon algorithm, for a product that has good reviews, is well-priced and is available to ship. The label can't be purchased, but it can appear on a sponsored listing if the product meets Amazon's criteria. Amazon says the "Best seller," "Climate pledge friendly" and "Parent pick" labels also cannot be purchased.)

But this can't be all on us. Amazon, and all the other sites and apps following its lead, need some common-sense limits. Here's a modest proposal: No more than half of any screen we see at any given time, be it on the desktop web or a smartphone, should contain ads. Perhaps 50 percent sounds like a lot to you? But even that rule would force Amazon to show us at least some of the most-relevant results on the first screen of our device. Amazon wouldn't comment on this suggestion.

Another idea: Shill results should be much more clearly marked. A label disclosing that a shill listing is "Sponsored" should have the same font size and contrast as the most prominent text in the ad. Even better: It should have to go on the top-left part of the ad, where our eyes go first. No more burying it in the far-right corner. Amazon's Graham said: "Ads in Amazon's store always include a clear and prominent 'sponsored' label, implemented in accordance with FTC [Federal Trade Commission] guidelines."

But not everyone agrees. Last year, the FTC received a formal petition from the Strategic Organizing Center, a coalition of labor unions, complaining that Amazon misleads consumers because of how it labels sponsored results. What's more, the FTC's guidelines on all of this haven't been updated since rapper Macklemore was at the top of the charts. In 2013, the FTC sent a letter to Google and others about what counts as acceptable ad labeling for search engines, and then it posted an enforcement policy in 2015. Back then, it had no way to anticipate all the ways Amazon would try to stretch the reality of what is and isn't an ad when we're shopping online.

Editing by Laura Stevens, Karly Domb Sadof and Julie Vitkovskaya. Copy editing by Carey Biron. Design editing by Junne Alcantara. Photo editing by Monique Woo. Design and development by Emma Kumer.
806
BAD
Almost three quarters of the golden age of Hollywood has been lost (historytoday.com)

Almost three quarters of the golden age of Hollywood has been lost. Preservation only began when film came to be seen as art.

Fred Ott's Sneeze, an early kinetoscopic film produced by the Edison Manufacturing Company, 1894. Alamy.

On 3 November 1927, Hollywood stars were out in force for the premiere of the silent film The Devil Dancer. Directed by Fred Niblo, who had recently wowed audiences with Ben-Hur: A Tale of the Christ (1925), it was a spectacular tale of romance and derring-do set high in the mountains of Tibet. It was an instant hit. The critics loved it. The New York Times gushed over its rich scenery and wonderfully convincing atmosphere. Everyone agreed it was in a class of its own. At the first Academy Awards the following year, it was nominated for Best Cinematography and was only pipped at the post by F.W. Murnau's ground-breaking Sunrise. Yet just as its place in film history seemed assured, it suddenly disappeared. No one could say when, or even how, it was lost. All we know is that not a single frame of it is left.

When The Devil Dancer was released, American silent films were at the peak of their popularity. In 1917 alone, almost 1,000 films were released: an average of nearly three per day. Their success was phenomenal. By the mid-1920s, an average of 46 million cinema admissions were recorded per week, out of a total population of 116 million. Film stars enjoyed an unheard-of celebrity. In 1926, Colleen Moore, whose feisty manner helped define the 'flapper' style, was receiving 10,000 fan letters every week and was earning upwards of $40,000 per month, the equivalent of well over $650,000 today. As the psychologist William Moulton Marston remarked, '[n]ot even the church had such a hold over the popular imagination'.

Yet sadly, The Devil Dancer's fate was not unusual. For all their popularity, silent films were alarmingly vulnerable. During the golden age of the silent movie (1912-29), 10,919 silent feature films of American origin are known to have been released in the US. Of these, only 2,749 (25.2 per cent) survive as complete films, either in their original 35mm domestic release version or in some other format. A further 562 (5.1 per cent) are incomplete, lacking one or more reels. The remaining 8,114 (74.3 per cent) have been lost: a staggeringly high proportion.

There was no pattern to the losses. It didn't matter whether a film was a success or a failure. Blockbusters were just as likely to disappear as flops. In fact, many of the period's greatest hits are missing. Annette Kellerman's million-dollar movie A Daughter of the Gods (1916), which was filmed on location in Jamaica and which featured the first nude scene by a major actress, has been lost without a trace. Nor was genre a factor. Popular Westerns like The Phantom Riders (1918) vanished just as often as comedies and experimental pieces. Even star power was no guarantee of survival. Popularly known as 'The Vamp', Theda Bara was one of cinema's earliest sex symbols, earning $4,000 per week in her prime; yet only two of her 39 films have come down to us.

So why were so many films lost? One reason is that there were never more than a handful of copies to start with. Today, Hollywood films tend to be released in a large number of cinemas simultaneously, meaning that a lot of prints are in circulation at any given time. Quite the opposite was true of silent movies.
Early production companies based their business model on building excitement through scarcity. Rather than being released all at once, films were released in stages. First they would be shown in downtown cinemas, then in the suburbs and finally in more rural areas. This way, tickets for earlier screenings could be sold at a higher price and a steady interest maintained across the country. Sometimes it would take as long as two years for a film to reach the whole country. The effect was to reduce the number of copies that were needed, even for the most successful titles. And the fewer prints there were, the more likely films were to be lost, mislaid or destroyed.

It didn't help that early films were also very fragile. Until as late as 1951, the film base of most movies was made from cellulose nitrate. This had the advantage of being colourless, transparent and flexible, but it was also highly unstable. Over time, the base reacts with the air to produce nitric acid, a highly corrosive substance which gradually eats away at the film. First the image fades, then a sticky gloop begins to form and finally the whole reel disintegrates. With the proper care, this process can often be slowed; but once it has started, it can't be stopped, with the result that many silent films simply turned to dust, sometimes before completing a single screening run.

To make matters worse, cellulose nitrate is highly flammable, too. During screenings, it was not unknown for a film to get too hot passing through the projector and burst into flames. But films could also ignite even when in storage. On 9 July 1937, during an especially hot summer, a fire broke out in a 20th Century Fox vault in New Jersey. It took 150 firefighters more than three hours to extinguish the blaze. Among the thousands of films destroyed were nearly all those directed by Gordon Edwards and most of the Westerns starring Tom Mix. Almost as devastating was the fire which broke out at the MGM vault in Culver City, California, on 10 August 1965. Caused by an electrical short, this claimed the only known copies of several hundred titles, including The Divine Woman (1928), starring Greta Garbo, and Tod Browning's horror mystery London After Midnight (1927).

The main reason so many silent films were lost, however, is that almost no one thought they were worth saving. For most film-goers, silent feature films were fun, even exciting, but never anything more than ephemeral. As soon as you'd seen one, you'd forget it and move on to the next. They certainly weren't art. As the Los Angeles Times critic Edwin Schallert put it: 'Making pictures is not like writing literature or composing music or painting masterpieces. The screen story is essentially a thing of today and once it has had its run, that day is finished. So far there has never been a classic film, in the sense that there is a classic novel or poem or canvas or sonata. Last year's picture, however strong its appeal at the time, is a book that has gone out of circulation.'

Since novelty was clearly the key to success, studios saw no point in holding onto their old titles. It was expensive to store films properly, reruns were rare and, after the talkies came along in 1929, there wasn't even any scope to reuse old footage in new productions. It made no commercial sense to cling onto such useless reels. As the movie mogul Samuel Goldwyn explained: 'I cannot rest on the laurels of the past.' Whenever a film became too worn or started to decompose, studios simply wrote it off.
Either they chucked it straight in the bin, or they tried to recover what money they could by reclaiming the silver content of the base. Several companies specialised in this and, if done on a large enough scale, it could yield a decent sum. As the film historian David Pierce has noted, in one case 'United Artists sent 130 well-worn prints of Suds (1920), Little Lord Fauntleroy (1921), Rosita (1923) and other older [Mary] Pickford titles to the Kodak recovery centre in Rochester. The resulting income was a modest but undoubtedly welcome $302.74.' The cost to film history, however, was incalculable.

It took a long time for anyone to start talking seriously about preservation, due largely to the unwillingness to see film as an art form. When the idea was first mooted, back in 1893, the goal had merely been to secure copyright protection. It was suggested that copies should be deposited in the Library of Congress, not so that they could be seen again, much less appreciated, but for the sake of guarding against imitation. And even this was slow to take off. Not until a little later did anyone raise the possibility of preserving films for their own sake. In 1895, W.K.L. Dickson proposed that a national film collection should be established. He pictured this as a gigantic visual library, where people might view films as easily as they read books. But it was clear that what he had in mind were documentary films (newsreels, footage of eminent people, events, etc.) rather than fictional narratives. Trusting naively to the camera's fidelity, he wanted to create a repository capable of preserving history free of the historian's cant and with greater precision than written texts (!), not a collection of aesthetically or dramatically valuable films.

The release of D.W. Griffith's The Birth of a Nation (1915) marked a turning point of sorts. Though now reviled for its unrepentant racism, this epic Civil War drama caused an immediate sensation and became the most popular film of its day, earning a record $20 million in receipts. Its success rested to a large degree on its technical virtuosity. It pioneered the use of close-ups and fade-outs, featured hundreds of extras and was the first film to be issued with its own score. For many critics, it illustrated the artistic potential of cinema and led some to query film's status. Granted, it didn't cause studios, or even the government, to think any differently about preservation. Though politicians like the former postmaster general Will Hays regularly called for the creation of a national film archive in the 1920s, they too were interested only in films showing 'historical or otherwise noteworthy event[s]'. But such was the visual impact of The Birth of a Nation that it nevertheless stirred private enthusiasts into action instead. Later that year, Columbia University opened its first film programme and began assembling its own movie collection. Meanwhile, film societies began springing up, especially in Europe. Small groups would gather regularly to watch and discuss films, and many created their own archives. Naturally, these tended to be fairly eclectic. Yet they helped to foster an appreciation of cinema's artistic elements among intellectuals and journalists, at a time when most studios and filmgoers were oblivious to all but its most commercial aspects. This proved decisive. Thanks to critics such as Iris Barry, the notion that silent films deserved not only to be recognised as artworks in their own right, but also preserved for posterity, began to gather momentum.
When the Museum of Modern Art was founded in New York in 1929, it was recognised that film deserved a place in its collection. Accordingly, in 1935, The Film Library was established under Barry's leadership to 'trace, catalog, assemble, exhibit and circulate films in exactly the same manner as paintings, sculpture, architectural photographs and reproductions'. This transformed attitudes, so much so that, in 1960, MGM began an ambitious programme to preserve any silent films still in its vaults; and in 1965, the American Film Institute was established, specifically to safeguard the legacy of the American film industry. It is largely thanks to these institutions, and the European film societies, that any silent films from America's golden age have survived at all. It is just a pity they did not start earlier. Looking back at what has perished, it is hard not to think of Sunset Boulevard (1950) and weep with Norma Desmond, if not for the demise of the silent movie star, then certainly for the loss of so many silent movies.

Alexander Lee is a fellow in the Centre for the Study of the Renaissance at the University of Warwick. His latest book, Machiavelli: His Life and Times, is now available in paperback.
808
GOOD
Alpaca-LoRA with Docker (github.com/chris-alexiuk) Instruct-tune LLaMA on consumer hardware

This repository contains code for reproducing the Stanford Alpaca results using low-rank adaptation (LoRA). We provide an Instruct model of similar quality to text-davinci-003 that can run on a Raspberry Pi (for research), and the code is easily extended to the 13b, 30b and 65b models. In addition to the training code, which runs within hours on a single RTX 4090, we publish a script for downloading and inference on the foundation model and LoRA, as well as the resulting LoRA weights themselves. To fine-tune cheaply and efficiently, we use Hugging Face's PEFT as well as Tim Dettmers' bitsandbytes. Without hyperparameter tuning, the LoRA model produces outputs comparable to the Stanford Alpaca model. (Please see the outputs included below.) Further tuning might be able to achieve better performance; I invite interested users to give it a try and report their results.

Install dependencies. If bitsandbytes doesn't work, install it from source. Windows users can follow these instructions.

The finetune.py file contains a straightforward application of PEFT to the LLaMA model, as well as some code related to prompt construction and tokenization. PRs adapting this code to support larger models are always welcome. The example invocations, including the hyperparameter tweaks, were not preserved in this copy; a hedged sketch of the inference side appears after the example outputs below.

The generate.py file reads the foundation model from the Hugging Face model hub and the LoRA weights from tloen/alpaca-lora-7b, and runs a Gradio interface for inference on a specified input. Users should treat this as example code for the use of the model, and modify it as needed. The most recent official Alpaca-LoRA adapter, available at tloen/alpaca-lora-7b, was trained on March 26 (the exact command was not preserved in this copy).

The export scripts merge the LoRA weights back into the base model, for export to Hugging Face format and to PyTorch state_dicts. They should help users who want to run inference in projects like llama.cpp or alpaca.cpp.

For Docker: (optional) change the desired model and weights under environment in the docker-compose.yml, build and run the container, open http://localhost:7860 in the browser, and check the logs if needed.

Instruction: Tell me about alpacas.

Alpaca-LoRA: Alpacas are members of the camelid family and are native to the Andes Mountains of South America. They are known for their soft, luxurious fleece, which is used to make clothing, blankets and other items. Alpacas are herbivores and graze on grasses and other plants. They are social animals and live in herds of up to 20 individuals.

Stanford Alpaca: Alpacas are small, fluffy animals related to camels and llamas. They are native to Peru and Bolivia, and were first domesticated around 5,000 years ago. They are kept mainly for their fine, soft fleece, which is used to make knitwear and other garments. Alpacas are herd animals and live in small family groups, led by an older male. They are highly social animals and can form strong bonds with their owners.

text-davinci-003: Alpacas are a domesticated species of South American camelid, similar to llamas.
They are native to the Andes Mountains in South America and are kept as livestock for their soft, luxurious wool. Alpacas are social animals and live in herds of up to 20 individuals. They are typically kept in small herds of two to five animals, and are relatively easy to care for. Alpacas are herbivores and feed on grass, hay and other vegetation. They are also known for their gentle and friendly nature, making them popular as pets.

Instruction: Tell me about the president of Mexico in 2019.

Alpaca-LoRA: The president of Mexico in 2019 was Andrés Manuel López Obrador, who took office on December 1, 2018. He is a member of the National Regeneration Movement (MORENA) political party and is the first left-wing president of Mexico since 1946. He is known for his anti-corruption and anti-neoliberal policies, as well as his commitment to improving the living conditions of the Mexican people.

Stanford Alpaca: In 2019, Andres Manuel Lopez Obrador was sworn in as President of Mexico. He is the country's 58th president and the first from the left-wing Morena party. He was elected in 2018 on a promise to combat corruption and reduce poverty and has faced criticism for his handling of the economy, crime and the country's relationship with the United States. He is also known for his ambitious social welfare programs and has been praised for raising the minimum wage and providing aid to low-income families. He is seen as a champion of democracy and human rights in Mexico.

text-davinci-003: The president of Mexico in 2019 is Andrés Manuel López Obrador, commonly known as AMLO. He is a Mexican politician who has been the President of Mexico since December 1, 2018. He is a member of the National Regeneration Movement (MORENA) party and previously served as the Head of Government of Mexico City from 2000 to 2005. He is a left-wing populist and has promised to reduce corruption and inequality in Mexico. He has also promised to increase public spending on social programs and infrastructure projects.

Instruction: Tell me about the king of France in 2019.

Alpaca-LoRA: The King of France in 2019 is Emmanuel Macron, who was elected in May 2017. He is the youngest president in the history of the Fifth Republic and the first president to be born after World War II. He is also the first president to have never held elected office before.

Stanford Alpaca: In 2019, the King of France was Emmanuel Macron. He was sworn in as President of France on May 14, 2017, and has been leading the country ever since. He is a member of the centrist La République En Marche! party. He is known for his reformist agenda and has faced a number of protests from both the right and left of the political spectrum. He is the youngest French president in history.

text-davinci-003: The King of France is a largely symbolic role, as France is a republic. The current President of France is Emmanuel Macron, who was elected in 2017.

Instruction: List all Canadian provinces in alphabetical order.

Alpaca-LoRA: Alberta, British Columbia, Manitoba, New Brunswick, Newfoundland and Labrador, Nova Scotia, Ontario, Prince Edward Island, Quebec, Saskatchewan

Stanford Alpaca: Alberta, British Columbia, Manitoba, New Brunswick, Newfoundland and Labrador, Nova Scotia, Ontario, Prince Edward Island, Quebec, Saskatchewan, Northwest Territories, Nunavut, Yukon.

text-davinci-003: [output not preserved in this copy]

Instruction: Write a Python program that prints the first 10 Fibonacci numbers.

Alpaca-LoRA: [code not preserved] Stanford Alpaca: [code not preserved] text-davinci-003: [code not preserved]

Instruction: Write a program that prints the numbers from 1 to 100.
But for multiples of three print 'Fizz' instead of the number, and for the multiples of five print 'Buzz'. For numbers which are multiples of both three and five print 'FizzBuzz'.

Alpaca-LoRA: [code not preserved] Stanford Alpaca: [code not preserved] text-davinci-003: [code not preserved]

Instruction: Tell me five words that rhyme with 'shock'.

Alpaca-LoRA: Flock, lock, rock, stock and sock.

Stanford Alpaca: Five words that rhyme with shock are: rock, pop, shock, cook and snock.

text-davinci-003: [output not preserved in this copy]

Instruction: Translate the sentence 'I have no mouth but I must scream' into Spanish.

Alpaca-LoRA: No tengo boca pero tengo que gritar.

Stanford Alpaca: No tengo boca pero debo gritar.

text-davinci-003: No tengo boca pero debo gritar.
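Since the repository's example commands did not survive extraction, here is a minimal sketch, in Python, of the generate.py idea: load the base model in 8-bit, layer the tloen/alpaca-lora-7b adapter on top with PEFT, and prompt it Alpaca-style. It assumes the transformers/peft/bitsandbytes stack of early 2023; the base-model checkpoint ID and generation settings are illustrative, and the repository's actual scripts may differ.

import torch
from peft import PeftModel
from transformers import LlamaForCausalLM, LlamaTokenizer

# Load the foundation model in 8-bit so it fits on consumer hardware,
# then layer the published LoRA adapter weights on top of it.
base_model = "decapoda-research/llama-7b-hf"  # illustrative; any LLaMA-7B checkpoint works
tokenizer = LlamaTokenizer.from_pretrained(base_model)
model = LlamaForCausalLM.from_pretrained(
    base_model,
    load_in_8bit=True,       # bitsandbytes 8-bit quantization
    torch_dtype=torch.float16,
    device_map="auto",       # let accelerate place the layers
)
model = PeftModel.from_pretrained(model, "tloen/alpaca-lora-7b")
model.eval()

# Alpaca-style prompt: an instruction followed by a response slot.
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nTell me about alpacas.\n\n### Response:\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))

The training side (finetune.py) wraps this same base model in a LoRA configuration and trains only the small adapter matrices rather than the full 7B parameters, which is what makes a single RTX 4090 sufficient.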
810
GOOD
Alpaca: A strong open-source instruction-following model (stanford.edu)

Instruction-following models such as GPT-3.5 (text-davinci-003), ChatGPT, Claude and Bing Chat have become increasingly powerful. Many users now interact with these models regularly and even use them for work. However, despite their widespread deployment, instruction-following models still have many deficiencies: they can generate false information, propagate social stereotypes and produce toxic language. To make maximum progress on addressing these pressing problems, it is important for the academic community to engage. Unfortunately, doing research on instruction-following models in academia has been difficult, as there is no easily accessible model that comes close in capabilities to closed-source models such as OpenAI's text-davinci-003.

We are releasing our findings about an instruction-following language model, dubbed Alpaca, which is fine-tuned from Meta's LLaMA 7B model. We train the Alpaca model on 52K instruction-following demonstrations generated in the style of self-instruct using text-davinci-003. On the self-instruct evaluation set, Alpaca shows many behaviors similar to OpenAI's text-davinci-003, but is also surprisingly small and easy/cheap to reproduce. We are releasing our training recipe and data, and intend to release the model weights in the future. We are also hosting an interactive demo to enable the research community to better understand the behavior of Alpaca. Interaction can expose unexpected capabilities and failures, which will guide us in the future evaluation of these models. We also encourage users to report any concerning behaviors in our web demo so that we can better understand and mitigate these behaviors. As any release carries risks, we discuss our thought process for this open release later in this blog post.

We emphasize that Alpaca is intended only for academic research and any commercial use is prohibited. There are three factors in this decision: First, Alpaca is based on LLaMA, which has a non-commercial license, so we necessarily inherit this decision. Second, the instruction data is based on OpenAI's text-davinci-003, whose terms of use prohibit developing models that compete with OpenAI. Finally, we have not designed adequate safety measures, so Alpaca is not ready to be deployed for general use.

There are two important challenges to training a high-quality instruction-following model under an academic budget: a strong pretrained language model and high-quality instruction-following data. The first challenge is addressed with the recent release of Meta's new LLaMA models. For the second challenge, the self-instruct paper suggests using an existing strong language model to automatically generate instruction data. In particular, Alpaca is a language model fine-tuned using supervised learning from a LLaMA 7B model on 52K instruction-following demonstrations generated from OpenAI's text-davinci-003. The figure below illustrates how we obtained the Alpaca model. For the data, we generated instruction-following demonstrations by building upon the self-instruct method. We started with the 175 human-written instruction-output pairs from the self-instruct seed set. We then prompted text-davinci-003 to generate more instructions, using the seed set as in-context examples. We improved over the self-instruct method by simplifying the generation pipeline (see details in GitHub) and significantly reduced the cost.
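To make the pipeline concrete, here is a minimal sketch of what one self-instruct-style generation step might look like. This is not the team's released code (that lives in the GitHub repo); the prompt wording, the simplified seed-file format, and the sampling parameters are assumptions for illustration, and it uses the legacy (pre-1.0) openai Python SDK of early 2023.

import json
import random

import openai  # legacy (pre-1.0) OpenAI SDK

# Seed pool: human-written instruction-output pairs
# (field names simplified for this sketch).
seed_tasks = [json.loads(line) for line in open("seed_tasks.jsonl")]

def build_prompt(num_demos: int = 3) -> str:
    # Pack a few seed pairs into the prompt as in-context examples,
    # then ask the model to continue the numbered list with new tasks.
    demos = random.sample(seed_tasks, num_demos)
    lines = ["Come up with a diverse list of new tasks in the same format:"]
    for i, task in enumerate(demos, start=1):
        lines.append(f"{i}. Instruction: {task['instruction']}\n"
                     f"   Output: {task['output']}")
    lines.append(f"{num_demos + 1}. Instruction:")
    return "\n".join(lines)

response = openai.Completion.create(
    model="text-davinci-003",
    prompt=build_prompt(),
    max_tokens=1024,
    temperature=1.0,  # high temperature encourages diverse new instructions
)
# The completion continues the list; downstream code would parse it into
# (instruction, output) pairs, filter near-duplicates, and grow the pool.
print(response["choices"][0]["text"])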
Our data generation process results in 52K unique instructions and the corresponding outputs, which cost less than $500 using the OpenAI API. Equipped with this instruction-following dataset, we then fine-tuned the LLaMA models using Hugging Face's training framework, taking advantage of techniques like Fully Sharded Data Parallel and mixed precision training. For our initial run, fine-tuning a 7B LLaMA model took 3 hours on 8 80GB A100s, which costs less than $100 on most cloud compute providers. We note that training efficiency can be improved to further reduce the cost. (A hedged sketch of this fine-tuning setup appears at the end of this post.)

To evaluate Alpaca, we conduct human evaluation (by the 5 student authors) on the inputs from the self-instruct evaluation set. This evaluation set was collected by the self-instruct authors and covers a diverse list of user-oriented instructions, including email writing, social media and productivity tools. We performed a blind pairwise comparison between text-davinci-003 and Alpaca 7B, and we found that these two models have very similar performance: Alpaca wins 90 versus 89 comparisons against text-davinci-003. We were quite surprised by this result given the small model size and the modest amount of instruction-following data. Besides leveraging this static evaluation set, we have also been testing the Alpaca model interactively, and found that Alpaca often behaves similarly to text-davinci-003 on a diverse set of inputs. We acknowledge that our evaluation may be limited in scale and diversity. So we are releasing an interactive demo of Alpaca, and encourage readers to evaluate Alpaca themselves and give us feedback.

In the rest of this section, we include several interaction examples to showcase the capabilities and limitations of Alpaca. The above examples show that the outputs of Alpaca are generally well-written. We note that Alpaca reflects the general style of the instruction-following dataset. As a result, Alpaca's answers are typically shorter than ChatGPT's, reflecting text-davinci-003's shorter outputs.

Alpaca also exhibits several common deficiencies of language models, including hallucination, toxicity and stereotypes. Hallucination in particular seems to be a common failure mode for Alpaca, even compared to text-davinci-003. For example, in the following figure, Alpaca wrongly says that the capital of Tanzania is Dar es Salaam, which is the largest city in Tanzania. (It was the capital until 1974, when it was replaced by Dodoma.) Furthermore, Alpaca can be used to generate well-written outputs that spread misinformation, as seen in the following example. Alpaca likely contains many other limitations associated with both the underlying language model and the instruction tuning data. However, we believe that the artifact will still be useful to the community, as it provides a relatively lightweight model that serves as a basis to study important deficiencies. We encourage users to help us identify new kinds of failures by flagging them in the web demo. Overall, we hope that the release of Alpaca can facilitate further research into instruction-following models and their alignment with human values.

We are releasing the following assets today: [list not preserved in this copy]. We intend to release the following assets in the near future: [list not preserved]. We believe that releasing the above assets will enable the academic community to perform controlled scientific studies on instruction-following language models, resulting in better science and ultimately new techniques to address the existing deficiencies with these models. At the same time, any release carries some risk.
First, we recognize that releasing our training recipe reveals the feasibility of certain capabilities. On one hand, this enables more people (including bad actors) to create models that could cause harm (either intentionally or not). On the other hand, this awareness might incentivize swift defensive action, especially from the academic community, now empowered by the means to perform deeper safety research on such models. Overall, we believe that the benefits for the research community outweigh the risks of this particular release. Given that we are releasing the training recipe, we believe that releasing the data, model weights and training code incurs minimal further risk, given the simplicity of the recipe. At the same time, releasing these assets has enormous benefits for reproducible science, so that the academic community can use standard datasets, models and code to perform controlled comparisons and to explore extensions.

Deploying an interactive demo for Alpaca also poses potential risks, such as more widely disseminating harmful content and lowering the barrier for spam, fraud or disinformation. We have put into place two risk mitigation strategies. First, we have implemented a content filter using OpenAI's content moderation API, which filters out harmful content as defined by OpenAI's usage policies. Second, we watermark all the model outputs using the method described in Kirchenbauer et al. 2023, so that others can detect (with some probability) whether an output comes from Alpaca 7B. Finally, we have strict terms and conditions for using the demo; it is restricted to non-commercial uses and to uses that follow LLaMA's license agreement. We understand that these mitigation measures can be circumvented once we release the model weights, or if users train their own instruction-following models. However, by installing these mitigations, we hope to advance the best practices and ultimately develop community norms for the responsible deployment of foundation models.

We are excited by the research opportunities that Alpaca unlocks, and there are many exciting future directions. This work was done at the Center for Research on Foundation Models (CRFM) with support from the Stanford Institute for Human-Centered AI (HAI) and the Stanford Natural Language Processing (NLP) group. We also especially thank Yifan Mai for helpful engineering support for demo deployment. Alpaca depends directly and critically on existing works. We would like to thank Meta AI Research for training and releasing the LLaMA models, the self-instruct team for giving us a basis for the data generation pipeline, Hugging Face for the training code, and OpenAI for paving the path and showing what can be achieved. We would also like to highlight that there are many other open efforts for instruction-following LLMs and chat models, including OpenChatKit, Open Assistant and Carper AI. Sign up to get email updates on the Center for Research on Foundation Models (CRFM) or email us at contact-crfm@stanford.edu.
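As promised earlier, here is a minimal sketch of the supervised fine-tuning stage using Hugging Face's Trainer with FSDP and mixed precision switched on. It is not the released train.py; the dataset handling, hyperparameters, checkpoint paths and FSDP wrap policy below are assumptions for illustration (prompt-loss masking and the optional 'input' field of the released alpaca_data.json are omitted for brevity).

import json

from torch.utils.data import Dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          Trainer, TrainingArguments)

class AlpacaDataset(Dataset):
    """Alpaca-style (instruction -> response) pairs, padded to a fixed length."""
    def __init__(self, path, tokenizer, max_len=512):
        self.examples = []
        for rec in json.load(open(path)):
            text = (f"### Instruction:\n{rec['instruction']}\n\n"
                    f"### Response:\n{rec['output']}")
            ids = tokenizer(text, truncation=True, max_length=max_len,
                            padding="max_length", return_tensors="pt").input_ids[0]
            # Labels mirror the inputs for causal-LM fine-tuning.
            self.examples.append({"input_ids": ids, "labels": ids.clone()})
    def __len__(self):
        return len(self.examples)
    def __getitem__(self, i):
        return self.examples[i]

model = AutoModelForCausalLM.from_pretrained("path/to/llama-7b")  # placeholder path
tokenizer = AutoTokenizer.from_pretrained("path/to/llama-7b")
tokenizer.pad_token = tokenizer.eos_token  # LLaMA ships without a pad token

args = TrainingArguments(
    output_dir="alpaca-7b",
    num_train_epochs=3,                    # illustrative hyperparameters
    per_device_train_batch_size=4,
    gradient_accumulation_steps=8,
    learning_rate=2e-5,
    bf16=True,                             # mixed precision
    fsdp="full_shard auto_wrap",           # Fully Sharded Data Parallel
    fsdp_transformer_layer_cls_to_wrap="LlamaDecoderLayer",
)

trainer = Trainer(model=model, args=args,
                  train_dataset=AlpacaDataset("alpaca_data.json", tokenizer))
trainer.train()
# Launched across 8 GPUs with, e.g.: torchrun --nproc_per_node=8 train_sketch.py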
809
GOOD
Altstore: Home for apps that push the boundaries of iOS (altstore.io)
825
BAD
Alzheimer's amyloid hypothesis "cabal" thwarted progress toward a cure (2019) (statnews.com)

By Sharon Begley, June 25, 2019

In the 30 years that biomedical researchers have worked determinedly to find a cure for Alzheimer's disease, their counterparts have developed drugs that helped cut deaths from cardiovascular disease by more than half, and cancer drugs able to eliminate tumors that had been incurable. But for Alzheimer's, not only is there no cure, there is not even a disease-slowing treatment.

The brain, Alzheimer's researchers patiently explain, is hard: harder than the heart, harder even than cancer. While that may be true, it is increasingly apparent that there is another, more disturbing reason for the tragic lack of progress: The most influential researchers have long believed so dogmatically in one theory of Alzheimer's that they systematically thwarted alternative approaches. Several scientists described those who controlled the Alzheimer's agenda as "a cabal."

In more than two dozen interviews, scientists whose ideas fell outside the dogma recounted how, for decades, believers in the dominant hypothesis suppressed research on alternative ideas: They influenced what studies got published in top journals, which scientists got funded, who got tenure and who got speaking slots at reputation-buffing scientific conferences. This stifling of competing ideas, say a growing number of scholars, is a big reason why there is no treatment for Alzheimer's. (The four approved drugs have no effect on the disease, providing only a temporary memory boost.)

The scientists described the frustrating, even career-ending, obstacles that they confronted in pursuing their research. A top journal told one that it would not publish her paper because others hadn't. Another got whispered advice to at least pretend that the research for which she was seeking funding was related to the leading idea: that a protein fragment called beta-amyloid accumulates in the brain, creating neuron-killing clumps that are both the cause of Alzheimer's and the key to treating it. Others could not get speaking slots at important meetings, a key showcase for research results. Several who tried to start companies to develop Alzheimer's cures were told again and again by venture capital firms and major biopharma companies that they would back only an amyloid approach. "The amyloid hypothesis has been one of the most tragic stories [in] disease research," said neurobiologist Rachael Neve of Massachusetts General Hospital.

Despite being described as a cabal, the amyloid camp was neither organized nor nefarious. Those who championed the amyloid hypothesis truly believed it, and thought that focusing money and attention on it rather than on competing ideas was the surest way to an effective drug. It has not worked out that way. Research focused on amyloid, and the development and testing of experimental drugs targeting it, have sucked up billions of dollars in government, foundation and pharma funding with nothing to show for it. While targeting amyloid may or may not be necessary to treat Alzheimer's, it is not sufficient, and the additional steps almost certainly include those that were ignored, even censored. Probably the most shattering turn came in March, when Biogen halted the study of what proponents called the most promising Alzheimer's drug in years, an amyloid-targeting antibody.
For all her regrets about the amyloid hegemony, Neve is an unlikely critic: She co-led the 1987 discovery of mutations in a gene called APP that increase amyloid levels and cause Alzheimer's in middle age, supporting the then-emerging orthodoxy. Yet she believes that one reason Alzheimer's remains incurable and untreatable is that the amyloid camp "dominated the field," she said. Its followers were influential to the extent that "they persuaded the National Institute of Neurological Disorders and Stroke [part of the National Institutes of Health] that it was a waste of money to fund any Alzheimer's-related grants that didn't center around amyloid."

To be sure, NIH did fund some Alzheimer's research that did not focus on amyloid. In a sea of amyloid-focused grants, there are tiny islands of research on oxidative stress, neuroinflammation and, especially, a protein called tau. But Neve's NINDS program officer, she said, "told me that I should at least collaborate with the amyloid people or I wouldn't get any more NINDS grants." (She hoped to study how neurons die.) A decade after her APP discovery, a disillusioned Neve left Alzheimer's research, building a distinguished career in gene editing. Today, she said, she is sick about the millions of people who have needlessly died from the disease.

Dr. Daniel Alkon, a longtime NIH neuroscientist who started a company to develop an Alzheimer's treatment, is even more emphatic: If it weren't for the near-total dominance of the idea that amyloid is the only appropriate drug target, he said, "we would be 10 or 15 years ahead of where we are now."

Making it worse is that the empirical support for the amyloid hypothesis has always been shaky. There were numerous red flags over the decades that targeting amyloid alone might not slow or reverse Alzheimer's. "Even at the time the amyloid hypothesis emerged 30 years ago, there was concern about putting all our eggs into one basket, especially the idea that ridding the brain of amyloid would lead to a successful treatment," said neurobiologist Susan Fitzpatrick, president of the James S. McDonnell Foundation. But research pointing out shortcomings of the hypothesis was relegated to second-tier journals at best, a signal to other scientists and drug companies that the criticisms needn't be taken too seriously.

Zaven Khachaturian spent years at NIH overseeing its early Alzheimer's funding. Amyloid partisans, he said, came to permeate drug companies, journals and NIH study sections, the groups of mostly outside academics who decide what research NIH should fund. "Things shifted from a scientific inquiry into an almost religious belief system, where people stopped being skeptical or even questioning."

That would be tragic enough in any area of biomedical research, but it's especially so in Alzheimer's. Today, 5.8 million people in the U.S. have the disease, including 1 in 10 of those 65 and over, estimates the Alzheimer's Association. It is the fifth leading cause of death in that age group. For many patients and their families, that's a small mercy: Robbed of their memories, unable to recognize those they loved, often suffering from psychosis, they lose their mind and their identity long before their life.

Scientists closely associated with the amyloid model argue that if alternative ideas received little funding support, it was because NIH's Alzheimer's budget was woefully insufficient ($425 million in 2012, $2.4 billion in 2019).
"It's our responsibility to choose studies that are the most promising, and I think we have been doing that," said Dr. Paul Aisen of the University of Southern California, a leading amyloid proponent. "I would reject the idea that we would have been further along if there had been more openness to other ideas."

Dr. Dennis Selkoe of Harvard Medical School, also a prominent amyloid researcher, isn't so sure. He, too, says low NIH funding for Alzheimer's from the 1980s through the 2000s is to blame for alternative ideas languishing. But "society has the right to ask, why haven't we made more progress?" he said. "I have no doubt that if we had done broader research, we would be more advanced now. I don't think there was a purposeful attempt to scuttle other approaches," Selkoe added. Or, as Aisen put it last week on the sidelines of the Aspen Ideas Festival, "I don't think I'm part of a cabal."

Ruth Itzhaki often felt like she was in a house of mirrors. A molecular neurobiologist at England's University of Manchester, in 1991 she discovered pathogens, herpes simplex virus type 1, in the brains of elderly people who had died with Alzheimer's and carried the most common gene for the disease. It was the first indication that infectious agents might play a role in Alzheimer's, raising the possibility that eliminating them (and the resulting immune response, including inflammation) might stop or even reverse it. Nearly half a dozen journals rejected Itzhaki's paper before it was accepted by the Journal of Medical Virology, not a bad journal but not a leading one. A frequent reason top journals declined to publish her papers, as they did those of other amyloid skeptics, was previous rejections. As one peer reviewer wrote about a funding proposal Itzhaki submitted in 2010, "very few [of your] papers have appeared in the most highly regarded journals." And here I thought research should be judged on its own merits, Itzhaki said.

Like other doubters, Itzhaki wasn't dismissing the idea that amyloid has a role in Alzheimer's; she was questioning whether it was the cause, and therefore a good drug target. She saw it as a consequence of the true cause, making amyloid the gravestones of brain neurons killed by something else, and not their assassins. In that case, targeting amyloid would no more revive dead neurons than removing headstones would resurrect bodies in a cemetery.

Funders did not beat a path to her laboratory door. When Itzhaki was an advisor on a proposed clinical trial of an antiviral drug for Alzheimer's, one scientist who assessed it for a private foundation wrote, "The novelty of this approach appears to be quite lacking," according to documents she shared with STAT. To which Itzhaki wondered: the thousands of clinical trials based on eliminating amyloid, which keep getting funded, are novel?

The Alzheimer's Association awards its Zenith Fellowships to scientists "on the cutting edge of research," acknowledging that their studies "may not conform to current conventional scientific wisdom or may challenge the prevailing orthodoxy." Itzhaki thought that described her work to a T, so in 2004 she applied for funding for a study on the role of herpes simplex virus in Alzheimer's. The experience was that of an impala asking a pride of lions for support. One of the four reviewers gave her scores of "poor" (3 on a 10-point scale) on key criteria, arguing that because "there is no conclusive evidence for a major role of this pathogen in Alzheimer's disease," the research "will not have an impact on advancing the field" of dementia research.
A second reviewer called the role of pathogens in Alzheimer's a "fringe topic." Although one gave Itzhaki scores of 10 ("outstanding"), the two dismissive reviews sank her chances. It was another demonstration of the power of the amyloid camp. Even when some reviewers were open to alternative ideas, funding and publishing are so competitive that a single slam meant failure.

Itzhaki fared little better getting speaking slots at the field's most important annual gathering. In 2004, a colleague, neuroscientist George Perry of the University of Texas at San Antonio, talked the organizers of the Alzheimer's Association International Conference into giving Itzhaki 10 minutes, something he recalled took "a lot of persuading." That was the only time she was allowed to give a talk, Itzhaki said, "though I applied every year. They never included viruses in the list of meeting topics," and she was allowed only a poster. "No one influential ever even came to see it."

For all the obstacles thrown in her way, in 2009 Itzhaki showed that herpes simplex virus type 1 is a strong risk factor for Alzheimer's, and in 2007 that beta-amyloid accumulates in mouse brains that are infected with it. Studying mouse brains and patients' brains, she found evidence that this virus is a major cause of amyloid plaques and hence "probably a significant [causative] factor in Alzheimer's disease." Last year, two studies by teams at Mount Sinai and Harvard tied infectious agents to Alzheimer's more strongly than any previous research had, supporting the idea that targeting pathogens, and not the response to pathogens (amyloid plaques), might prevent or slow Alzheimer's. In fact, a little-noticed study in Taiwan, also published last year, found that people diagnosed with herpes infections were 2.6 times as likely to develop dementia as herpes-free individuals, but that antiviral drugs cut the risk 90 percent. In 2017, the first clinical trial investigating antiviral drugs in people with mild to moderate Alzheimer's got underway at Columbia University. "I just wonder if we'd be farther along if those of us studying the role of pathogens in Alzheimer's had gotten the imprimatur of journals and meetings that the amyloid people did," Itzhaki said.

In fairness, the true believers had evidence implicating amyloid in Alzheimer's disease. In 1906, when German neuropathologist Dr. Alois Alzheimer examined the brain of a 51-year-old woman who had died from what he called "presenile dementia," he identified sticky plaques between neurons and tangles of filamentous proteins within them. Plaques in particular, and tangles secondarily, have been the disease's defining characteristics ever since. Flash forward to 1984: Scientists determined the weight, length and precise molecular sequence of the protein fragment, or peptide, that makes up the brain plaques. It has come to be known as beta-amyloid.

Hard on the heels of the 1987 discovery by Neve and her colleagues of one amyloid-related gene came several more, all linking mutations in genes called PS1, PS2 and APP to early-onset Alzheimer's, which accounts for about 5% of cases. The 1991 discovery of one APP mutation (there are many) was the most-cited paper in all of biomedicine that year. The mutations all raise amyloid levels, and people with them develop Alzheimer's before age 65. In an era enamored of genes "for" this or that disease, it was an article of faith that genetics would produce cures. "When the genetics discoveries came out, it all pointed to amyloid having a critical role," said UT's Perry, a longtime critic of the amyloid model.
By the mid-1990s, a now-defunct San Francisco biotechnology company, Athena Neurosciences, created the first genetically engineered mice with a mutated amyloid-producing human gene. The animals' brains filled with amyloid plaques and their memories were destroyed. The mice were hailed as a model for testing therapeutic [Alzheimer's] drugs and "rodents that shook the world": Eliminating the amyloid in their brains at least partly reversed their memory loss and other cognitive deficits. "If you stopped the amyloid, the mice got better," said Alkon, the former NIH neuroscientist. "Everyone said this must be the way to treat Alzheimer's."

And so they tried. The biopharma giants Pfizer and Eli Lilly and Merck and Biogen, the now-defunct Elan (which acquired Athena in 1996) and hundreds of academic researchers each crafted a vaccine or antibody or small molecule to prevent the formation of amyloid plaques, to remove soluble (pre-plaque) amyloid or to destroy the plaques. "You had a whole industry going after amyloid, hundreds of clinical trials targeting it in different ways," Alkon said. Despite success in millions of mice, none of it worked in patients.

Scientists who raised doubts about the amyloid model suspected why. Amyloid deposits, they thought, are a response to the true cause of Alzheimer's and therefore a marker of the disease: again, the gravestones of neurons and synapses, not the killers. The evidence? For one thing, although the brains of elderly Alzheimer's patients had amyloid plaques, so did the brains of people the same age who died with no signs of dementia, a pathologist discovered in 1991. Why didn't amyloid rob them of their memories? For another, mice engineered with human genes for early Alzheimer's developed both amyloid plaques and dementia, but there was no proof that the much more common late-onset form of Alzheimer's worked the same way. And yes, amyloid plaques destroy synapses (the basis of memory and every other brain function) in mouse brains, but there is no correlation between the degree of cognitive impairment in humans and the amyloid burden in the memory-forming hippocampus or the higher-thought frontal cortex.

"There were so many clues," said neuroscientist Nikolaos Robakis of the Icahn School of Medicine at Mount Sinai, who also discovered a mutation for early-onset Alzheimer's. "Somehow the field believed all the studies supporting it, but not those raising doubts, which were very strong. The many weaknesses in the theory were ignored."

Reluctance to admit "that maybe we have it wrong," as the McDonnell Foundation's Fitzpatrick put it, made it difficult for researchers who, mindful of those red flags, wanted to explore ways to understand, prevent and treat Alzheimer's that didn't revolve around amyloid. Alternative ideas ranged from infectious microbes and inflammation as causes to blood flow and synapse restoration as treatments. Today, therapies based on all of those ideas and more are being tested in people, evidence of their promise. But many of the scientists with alternative ideas became "roadkill on the highway to nowhere," Perry said, their careers stalled or worse. Perry fared better, rising to become dean of the College of Sciences at UT San Antonio, but saw many of his NIH grant proposals rejected by NIH study sections. "A friend on one told me my NIH grant was sacked due to my anti-amyloid stance," Perry said. "The sacker came up to me at a meeting a few weeks later and asked why I wrote articles questioning amyloid."

Not every young scientist has the stomach for the Alzheimer's wars.
Neurophysiologist Malú Tansey of Emory University has focused on neuroinflammation for nearly two decades, convinced that this hyped-up immune response kills synapses and neurons and explains many other discoveries about Alzheimer's brains. If that's right, it suggests a role for non-amyloid drug targets, she said. But this was no subject for a young scientist trying to gain a foothold in the field. In her first university talk as a junior faculty member, in 2002, Tansey described her research. "A senior faculty member stood up and said neuroinflammation has nothing to do with Alzheimer's," she recalled. "It was intimidating." She has since focused on Parkinson's disease, but still has two grants to study neuroinflammation in Alzheimer's.

Similarly, at Mount Sinai, one of Robakis' postdoctoral fellows witnessed Robakis' NIH grant proposals to study things like neuronal survival (without a role for amyloid) getting such low scores that the study section didn't even discuss them. He left Alzheimer's to study something safer, the blood-brain barrier, Robakis said. "That happened all the time."

For young academics, biotech executive Dr. Raymond Tesi said, "it's difficult to break into a field with so many strong voices supporting a single target. Alzheimer's has egos and superstars and big personas unlike anything I've seen elsewhere." Tesi was persuaded enough by the neuroinflammation explanation of Alzheimer's that he vowed to found a company to develop a treatment based on it. Starting nearly a decade ago, he said, "I talked to everyone: Lilly and Novartis, venture capitalists," arguing that neuroinflammation was the core pathology of Alzheimer's. "We couldn't get past the front door. If you weren't doing amyloid, you could barely get a meeting."

He called it an example of the groupthink that occurs in biopharma. Every company goes to "key opinion leaders" in academia and asks, what should we do? Since amyloid had a lock on such leaders from Harvard to the University of Southern California and in between, the answer was always the same: eliminate amyloid.

In 2015, Tesi and others eventually scraped together enough support, including a $1 million grant from the Alzheimer's Association, to found INmune Bio and develop a compound to quell neuroinflammation. That year, he estimated, 90% of NIH, industry and private foundation spending on Alzheimer's research and drug development was premised on the idea that eliminating or reducing amyloid was the route to success. INmune launched a Phase 1 clinical trial this month of its anti-inflammatory agent XPro1595. It may well fail, but INmune is targeting people with mild to moderate Alzheimer's, whom virtually every big biopharma has given up on. If an anti-inflammatory helps such patients, that could have been discovered millions of ravaged minds ago. "We probably lost five years," said Tesi, the company's chief executive.

Alkon said he lost at least that many. During three decades at NIH, he did groundbreaking work on the cellular and molecular basis of memory, rose to the position of lab director, published hundreds of scientific papers, and never cured anyone of anything. But in 1999, he felt he just might. He therefore resigned from NIH to head a cure-focused neurological institute founded by the Rockefeller family, studying an odd compound called bryostatin-1 (made by sea mosses, of all things) and its remarkable ability to increase synapse-boosting molecules.
Hoping to turn bryostatin into an Alzheimer's drug, Alkon co-founded Neurotrope BioScience in 2012 and soon tried to interest a leading pharmaceutical company in collaborating to develop bryostatin faster than a little startup could alone. In a meeting at the drug maker's headquarters, Alkon ran through the data. In human neurons growing in lab dishes, bryostatin provided protection against amyloid and preserved synapses. In mice, it improved learning and memory even when amyloid levels remained high. It not only preserved synapses, it also sopped up amyloid molecules, the protein fragments whose clumping into sticky plaques between brain neurons is considered the hallmark of the disease. And in earlier studies, when tested against cancer, bryostatin was extremely safe. Cool, great, the executives said. There was just one thing. "The only way they would consider bryostatin was if we could show it was superb at stopping amyloid," Alkon recalled. They were just interested in that, not its effect on synapses, whose loss causes the memory and other cognitive decline of Alzheimer's patients and which Alkon believes is the key to treating the disease. The company (which he declined to name so as not to jeopardize future collaborations) passed. It had better ways to eliminate amyloid, it decided, and didn't much care about restoring synapses. The next companies Alkon approached said much the same thing: it was all amyloid, all the time. "There weren't many I didn't talk to," he said. No one claims that if deep-pocketed pharma had gotten behind bryostatin seven years ago there would be a treatment for Alzheimer's today. The experimental drug produced so-so results in a small clinical trial, according to data published in January in the Journal of Alzheimer's Disease, though it improved cognitive function enough in some patients that Neurotrope is running a larger study, convinced the improvement is real. The failure of every amyloid-based experimental compound has finally triggered soul searching about how it all went so wrong: in 2019 there is nothing for people who develop Alzheimer's, and likely nothing for many more years. What happened? "People who said wait, it might not be so simple as eliminating amyloid, they were not able to go against the wave," said Mount Sinai's Robakis. "Critical thinking gave way to dogma. What you believe can be influenced by what is in and what is out." How else do you explain the widespread acceptance of a theory despite its weaknesses? Robakis has been continually funded by NIH (including for studies unrelated to Alzheimer's), so his criticism isn't sour grapes. Yet he began to feel that the amyloid camp saw him as a traitor the more he pointed out flaws in the theory. "I definitely lost grants," he said. If amyloid wasn't in the grant proposal, it was an uphill battle. "There were very big egos involved, and they couldn't stand to be wrong. It wasn't science anymore." He paused. "We should have known better," he continued. You can't say what would have happened if things had been different, but "maybe if there had been more support for alternative ideas, we would be better off [in terms of Alzheimer's treatments] than we are." It isn't hard to understand why hundreds of academics lined up behind the amyloid model over the years, Fitzpatrick said. "Once a field commits to a particular hypothesis, the research resources, funding, experimental models and training all get in line," she wrote in a 2018 analysis. That brings backers of the dominant idea accolades, awards, lucrative consulting deals and well-paid academic appointments.
Admitting doubt, let alone error, would not only be a blow to the ego but also a threat to livelihood. Academics who took part in clinical trials of amyloid-premised drugs greeted each failure with "some lame excuse," said Jack de la Torre of the University of Texas, Austin, who studies the idea that reduced blood flow within the brain is a key contributor to Alzheimer's. "This way the money from big pharma would mercifully not dry up." Harder to understand is why drug companies embraced the dogma even after the repeated failures of experimental drugs based on it, which has cost them billions of dollars. A longtime pharma scientist who recently joined a biotech startup offered one explanation: if company executives greenlight the development of an amyloid drug and it fails, they don't lose their jobs, because "the smartest guys in the room," meaning academia, said this was the way to go, he said. But if you greenlighted a different kind of Alzheimer's therapy and it failed, good luck with your career. While there is growing recognition that there could have been more progress if non-amyloid ideas had received greater support early on, those alternatives are now being explored in both basic research and clinical trials. The NIH, for instance, is funding the 130-patient study of whether an antiviral can help Alzheimer's patients; Columbia's Dr. Davangere Devanand, who is leading it, expects results in three years. The increasing diversity of approaches might finally bring help to the millions of people already suffering from Alzheimer's and the tens of millions more who will develop it. "The tragedy is," said UT's Perry, "we could have gotten to this point many, many years sooner." Senior Writer, Science and Discovery (1956-2021). Sharon covered science and discovery. Brilliant article, Sharon. Sadly, I know quite a few great people who have this horrible complaint. Also, tragically, if you were to substitute the great man-made global warming fraud for the Alzheimer's research closed shop, they would be indistinguishable. Great article, and not a surprise. This happens in every area of science to which I have had exposure, sometimes producing near violent confrontations and certainly damaging the careers of the non-believers. The dynamic is the same in climate research today, near identical. I have for some time seen amyloid as a normal product of brain function, but it's likely the dysfunctional glymphatic system is the cause. How to improve glymphatic performance during sleep? I would be interested to look at sleep performance, Alzheimer's onset and epinephrine variation. I suspect that sleep disturbance increases BA production as a result of elevated epinephrine resulting from periodic low blood oxygenation during sleep. Now, if this was the case, there might be a negative relationship between beta blocker use and Alzheimer's. Great article, thanks. As a taxpayer and a daughter of a mom with memory loss, I find this egotistical stance of the academia very ineffective, to say it mildly. We need to create a regulation so that alternative scientific ideas are not discriminated against by big-wig bigoted scientists. Thank you, Sharon Begley, for illuminating the massive failure of the amyloid hypothesis. US-based pathologist Alan MacDonald first found evidence of Borrelia bacteria in Alzheimer brain tissue as long ago as the 1980s.
A few years ago he found it again, this time using an extremely accurate DNA hybridisation technique (molecular beacon DNA probes specific for the Lyme bacteria Borrelia burgdorferi and Borrelia miyamotoi). Additionally, he found evidence that the Borrelia bacteria shelter inside parasitic worms in the brain, where the immune system (and antibiotics) are hard-pressed to reach them. There are incredible micrographs of his work at the site of the Dr Paul H Duray Research Fellowship Foundation, including some showing paired microscope images of the same specimen, stained first with a reagent that detects amyloid, then with the Lyme disease Borrelia DNA probes. The photos look almost IDENTICAL, indicating that wherever the amyloid is to be found, the Borrelia is too. And conversely, where there is no amyloid plaque, the Borrelia is absent. Borrelia uses a number of tricks to modulate and evade our immune system, which theoretically could allow other pathogens into the brain (this could explain the findings of Dr Itzhaki and other researchers). It's tragic that the work of MacDonald and others has been suppressed due to military objectives; however, in July 2019 Congressmen demanded that the Pentagon commit to transparency on its work with tick-borne diseases such as Lyme disease. If we can enforce this, one day we may be able to alleviate, prevent and perhaps even cure Alzheimer's disease. The MacDonald research and micrographs are viewable here: https://www.durayresearch.wordpress.com A thoroughly informative and accessible piece. I would never have thought academic tribalism would enter into a field as noble as dementia research. Thank goodness nothing in this article could possibly be applied to climate science, where there is never any pressure to enforce a consensus, skepticism is championed, skeptics are treated with respect, contrary evidence is scrupulously researched and evaluated, and there is zero pressure to toe the line or lose access to publication, funds or even a career. All too familiar: see Lee Smolin, The Trouble with Physics (2006), on the almost identical research stranglehold of the string theory cabal in theoretical physics. I found this article amazing and frightening. So many vested interests working against a cure shakes all your faith in scientific investigation. The scientific establishment is fallible. The standard practices for research and publication are flawed. But the scientific method itself is actually pretty darn good and shouldn't absorb more than a little bit of blame. Thank you for your article. It shed a much-needed light into the psychology behind research. Sadly, this is true in so many areas of medicine: the battle for Helicobacter being an example, fundamental research into diabetes as to why being another. What hope for a vaccine akin to HPV? Regards, Chris. Along the lines of Buzz's comment, I would like to see STAT be more careful about attacking small biotech companies. There are certain other STAT authors who are somewhat infamous for this habit (not this author). These small companies are often trying new approaches, and they don't need extra criticism for it to scare away investors. The seed for the next Alzheimer's drug may very well originate from a small biotech that is easy to criticize in the early stages.
831
BAD
Amazon Buys One Medical (onemedical.com) Amazon and One Medical have teamed up. Together, Amazon and One Medical will help more people get better care, when and how they need it. For a limited time, One Medical membership is available to new U.S. customers for $144 (a 28% discount off the regular $199) for the first year, the equivalent of only $12 per month. SEATTLE & SAN FRANCISCO (BUSINESS WIRE), February 22, 2023: Today Amazon (NASDAQ: AMZN) and 1Life Healthcare, Inc. (NASDAQ: ONEM, "One Medical") announced that Amazon completed its acquisition of One Medical. One Medical's seamless in-office and 24/7 virtual care services, on-site labs, and programs for preventive care, chronic care management, common illnesses and mental health concerns have been delighting people for the past 15 years. Together, Amazon and One Medical look to deliver exceptional health care to more people, to achieve better health outcomes, better care experiences and more value, within a better care team environment. For a limited time, One Medical is offering annual memberships at the discounted price of $144 for the first year (regularly $199/year), the equivalent of $12 per month, to new customers. Redeem the One Medical membership promotion and learn more about what's included. "We're on a mission to make it dramatically easier for people to find, choose, afford and engage with the services, products and professionals they need to get and stay healthy, and coming together with One Medical is a big step on that journey," said Neil Lindsay, senior vice president of Amazon Health Services. "One Medical has set the bar for what a quality, convenient and affordable primary care experience should be like. We're inspired by their human-centered, technology-forward approach and excited to help them continue to grow and serve more patients." "One Medical has been on a mission to help transform health care through its human-centered and technology-powered model to delight people with better health, better care and better value, within a better team environment," said Amir Dan Rubin, CEO of One Medical. "We now set our sights on delivering even further positive impacts for consumers, employers, care teams and health networks as we join Amazon with its long-term orientation, history of invention and passion for reimagining a better future." "If you fast forward 10 years from now, people are not going to believe how primary care was administered. For decades, you called your doctor, made an appointment three or four weeks out, drove 15-20 minutes to the doctor, parked your car, signed in and waited several minutes in reception, eventually were placed in an exam room where you waited another 10-15 minutes before the doctor came in, saw you for five to ten minutes and prescribed medicine, and then you drove 20 minutes to the pharmacy to pick it up, and that's if you didn't have to then go see a specialist for additional evaluation, where the process repeated and could take even longer for an appointment," said Amazon CEO Andy Jassy. "Customers want and deserve better, and that's what One Medical has been working and innovating on for more than a decade. Together, we believe we can make the health care experience easier, faster, more personal and more convenient for everyone."
One Medical sets a high bar for human-centered primary care experiences, including access to primary care where, when and how people prefer; a comprehensive approach that makes health care easier to navigate; and a more human health care experience. For customers looking for additional information, here are answers to some frequently asked questions: What does the One Medical membership fee cover? The One Medical annual membership fee covers access to One Medical's on-demand and asynchronous virtual care services, as well as high-touch and value-added personal services such as insurance navigation and referral management. The on-demand virtual services are accessible through the One Medical app (at no additional cost), featuring 24/7/365 on-demand video chats, secure provider messaging, "Treat Me Now" assessments for common health concerns, easy vaccine and medical record access, prescription renewals, and proactive reminders for follow-up care and referral needs. What does One Medical joining the Amazon family mean for current One Medical members? One Medical members can continue to count on One Medical for the high-quality care and high level of service they receive. One Medical looks forward to continuing to increase access to care and improve health outcomes with the support of Amazon. How does One Medical differ from a typical primary care or urgent care practice? One Medical aims to make health care easier to access, and even enjoyable, while improving health outcomes for people across every stage of life. One Medical offers same- and next-day in-office or remote visits, seamlessly paired with 24/7 on-demand virtual care services through the One Medical mobile or web app, allowing members to seek care when and where it's most convenient to them. One Medical is designed to better serve the needs of its patients and providers, including through thoughtfully designed and welcoming offices across the U.S., appointments that start on time, more appointment time with providers, and onsite labs. One Medical provides a comprehensive and human-centered primary care experience, including preventive and everyday health visits, chronic care management, pediatric and mental health services in a growing number of locations, and more. Members enjoy access to 24/7/365 virtual care services through the One Medical app, which allows them to continue their care from the comfort of home or on the go, whether it's for acute needs at odd hours or to simply manage follow-up needs and prescription renewals. Do Amazon Prime members receive One Medical memberships? No, One Medical membership is separate from Prime membership. For a limited time, One Medical membership, regardless of Prime membership, is available to new U.S. customers for $144 for the first year (regularly $199/year), the equivalent of $12 per month. Visit here to redeem the One Medical membership promotion and learn more about what's included. How do Amazon and One Medical protect private health information? The Health Insurance Portability and Accountability Act (HIPAA) governs what One Medical, Amazon and others can do with Protected Health Information, and this includes information like medication history, medical conditions and treatment information. Amazon and One Medical have extensive experience protecting data of all kinds appropriately across a variety of businesses, and nothing about this acquisition changes Amazon's or One Medical's commitment to privacy or the strong protections we have for Protected Health Information.
About Amazon: Amazon is guided by four principles: customer obsession rather than competitor focus, passion for invention, commitment to operational excellence, and long-term thinking. Amazon strives to be Earth's Most Customer-Centric Company, Earth's Best Employer and Earth's Safest Place to Work. Customer reviews, 1-Click shopping, personalized recommendations, Prime, Fulfillment by Amazon, AWS, Kindle Direct Publishing, Kindle, Career Choice, Fire tablets, Fire TV, Amazon Echo, Alexa, Just Walk Out technology, Amazon Studios and The Climate Pledge are some of the things pioneered by Amazon. For more information, visit amazon.com/about and follow @AmazonNews. About One Medical: One Medical is a U.S. national, human-centered and technology-powered primary care organization with seamless digital health and inviting in-office care, convenient to where people work, shop, live and click. One Medical's vision is to delight millions of members with better health and better care while reducing costs, within a better team environment. One Medical's mission is to transform health care for all through a human-centered, technology-powered model. Headquartered in San Francisco, 1Life Healthcare, Inc. is the administrative and managerial services company for the affiliated One Medical physician-owned professional corporations that deliver medical services in-office and virtually. 1Life and the One Medical entities do business under the One Medical brand. Cautionary Statement Regarding Forward-Looking Statements: This press release contains forward-looking statements. We use words and phrases such as "anticipate," "believe," "expect," "intend," "look to," "set our sights on," "mission," "future" and similar expressions to identify forward-looking statements. Actual results could differ materially from those contemplated by the forward-looking statements. Factors that could cause actual results to differ materially include the following: Amazon may be unable to achieve the anticipated benefits of the transaction; Amazon may be unsuccessful in integrating One Medical's business into its operations; operating costs, customer loss and business disruption (including, without limitation, difficulties in maintaining relationships with employees, customers, partners and commercial counterparties) may be greater than expected; Amazon may assume unexpected risks and liabilities; initiatives with One Medical may distract Amazon's management from other important matters; regulatory impediments may arise, either with respect to the acquisition itself or Amazon's operation of One Medical's business; and the other factors discussed in "Risk Factors" in Amazon's Annual Report on Form 10-K for the fiscal year ended December 31, 2022 and in Amazon's other filings with the SEC, which are available at https://www.sec.gov. Amazon assumes no obligation to update the information in this press release, except as otherwise required by law. Readers are cautioned not to place undue reliance on these forward-looking statements, which speak only as of the date hereof. Contacts: Amazon.com, Inc., Media Hotline, Amazon-pr@amazon.com, www.amazon.com/pr
838
GOOD
Amazon CodeWhisperer Free for Individual Use Is Now Generally Available (amazon.com) Today, Amazon CodeWhisperer, a real-time AI coding companion, is generally available, and it also includes a CodeWhisperer Individual tier that's free to use for all developers. Originally launched in preview last year, CodeWhisperer keeps developers in the zone and productive, helping them write code quickly and securely, without needing to break their flow by leaving their IDE to research something. Faced with creating code for complex and ever-changing environments, developers can improve their productivity and simplify their work by making use of CodeWhisperer inside their favorite IDEs, including Visual Studio Code, IntelliJ IDEA and others. CodeWhisperer helps with creating code for routine or time-consuming, undifferentiated tasks, working with unfamiliar APIs or SDKs, making correct and effective use of AWS APIs, and other common coding scenarios such as reading and writing files, image processing, writing unit tests, and lots more. Using just an email account, you can sign up and in just a few minutes become more productive writing code, and you don't even need to be an AWS customer. For business users, CodeWhisperer offers a Professional tier that adds administrative features like SSO and IAM Identity Center integration, policy control for referenced code suggestions, and higher limits on security scanning. And in addition to generating code suggestions for Python, Java, JavaScript, TypeScript and C#, the generally available release also now supports Go, Rust, PHP, Ruby, Kotlin, C, C++, shell scripting, SQL and Scala. CodeWhisperer is available to developers working in the Visual Studio Code, IntelliJ IDEA, CLion, GoLand, WebStorm, Rider, PhpStorm, PyCharm, RubyMine and DataGrip IDEs (when the appropriate AWS extensions for those IDEs are installed), or natively in AWS Cloud9 or the AWS Lambda console. Helping to keep developers in their flow is increasingly important: facing increasing time pressure to get their work done, developers are often forced to break that flow to turn to an internet search, sites such as StackOverflow, or their colleagues for help in completing tasks. While this can help them obtain the starter code they need, it's disruptive, as they've had to leave their IDE environment to search or ask questions in a forum or find and ask a colleague, further adding to the disruption. Instead, CodeWhisperer meets developers where they are most productive, providing recommendations in real time as they write code or comments in their IDE. During the preview we ran a productivity challenge, and participants who used CodeWhisperer were 27% more likely to complete tasks successfully and did so an average of 57% faster than those who didn't use CodeWhisperer. [Figure: code generation from a comment] The code developers eventually locate may, however, contain issues such as hidden security vulnerabilities, be biased or unfair, or fail to handle open source responsibly. These issues won't improve the developer's productivity when they later have to resolve them. CodeWhisperer is the best coding companion when it comes to coding securely and using AI responsibly. To help you code responsibly, CodeWhisperer filters out code suggestions that might be considered biased or unfair, and it's the only coding companion that can filter or flag code suggestions that may resemble particular open-source training data.
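The comment-to-code flow described above is easier to picture with a short example. The prompt comment and the completion below are hypothetical, written for illustration rather than captured from CodeWhisperer itself, but they show the shape of the interaction: the developer states an intent in a comment, and the companion proposes a complete, runnable function inline.

```python
# Hypothetical prompt a developer might type in the IDE:
#   "upload a local file to an S3 bucket and return the object URL"
#
# ...followed by the kind of completion an AI coding companion could suggest.
# boto3 is the standard AWS SDK for Python; the bucket and file names in the
# demo call are placeholders, not real resources.
import boto3

def upload_file_to_s3(file_path: str, bucket: str, key: str) -> str:
    """Upload a local file to S3 and return a URL for the stored object."""
    s3 = boto3.client("s3")
    s3.upload_file(file_path, bucket, key)
    return f"https://{bucket}.s3.amazonaws.com/{key}"

if __name__ == "__main__":
    print(upload_file_to_s3("report.csv", "example-bucket", "reports/report.csv"))
```

The point of the productivity claim above is that a suggestion like this arrives without leaving the editor, rather than after a search-and-paste round trip.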
CodeWhisperer provides additional data for such suggestions, for example the repository URL and license, when code similar to training data is generated, helping lower the risk of using the code and enabling developers to reuse it with confidence. [Figure: open-source reference tracking] CodeWhisperer is also the only AI coding companion to have security scanning for finding and suggesting remediations for hard-to-detect vulnerabilities, scanning both generated and developer-written code, looking for vulnerabilities such as those in the top ten listed by the Open Web Application Security Project (OWASP). If it finds a vulnerability, CodeWhisperer provides suggestions to help remediate the issue. [Figure: scanning for vulnerabilities] Code suggestions provided by CodeWhisperer are not specific to working with AWS. However, CodeWhisperer is optimized for the most-used AWS APIs, for example AWS Lambda or Amazon Simple Storage Service (Amazon S3), making it the best coding companion for those building applications on AWS. While CodeWhisperer provides suggestions for general-purpose use cases across a variety of languages, the tuning performed using additional data on AWS APIs means you can be confident it is the highest quality, most accurate code generation you can get for working with AWS. Meet Your New AI Code Companion Today: Amazon CodeWhisperer is generally available today to all developers, not just those with an AWS account or working with AWS, writing code in Python, Java, JavaScript, TypeScript, C#, Go, Rust, PHP, Ruby, Kotlin, C, C++, shell scripting, SQL and Scala. You can sign up with just an email address, and as I mentioned at the top of this post, CodeWhisperer offers an Individual tier that's freely available to all developers. More information on the Individual tier and pricing for the Professional tier can be found at https://aws.amazon.com/codewhisperer/pricing. Steve Roberts is a Senior Developer Advocate focused on .NET and PowerShell development on AWS. Based in Seattle, Washington, Steve worked as a Senior Development Engineer on the AWS SDKs and tools for .NET and PowerShell developers. He was the development lead for the AWS Tools for PowerShell and the AWS Tools for Azure DevOps, and also worked on the AWS Toolkits for Visual Studio and Visual Studio Code, plus the AWS SDK for .NET. Follow him on Twitter @bellevuesteve.
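The security-scanning claim is easier to evaluate with a concrete case. The snippet below is illustrative only, not actual CodeWhisperer output: it shows a classic injection flaw of the OWASP-top-ten variety that such a scan is meant to catch, next to the parameterized remediation a tool would typically suggest.

```python
import sqlite3

def find_user_unsafe(conn: sqlite3.Connection, name: str) -> list:
    # Vulnerable: user input is interpolated straight into the SQL string, so
    # a value like "x' OR '1'='1" returns every row (SQL injection).
    return conn.execute(f"SELECT * FROM users WHERE name = '{name}'").fetchall()

def find_user_safe(conn: sqlite3.Connection, name: str) -> list:
    # Remediation a scanner would suggest: pass the value as a bound parameter
    # so the driver escapes it and it can never change the query's structure.
    return conn.execute("SELECT * FROM users WHERE name = ?", (name,)).fetchall()
```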
841
BAD
Amazon to lay off 9000 more workers after earlier cuts (cnbc.com) Amazon will lay off 9000 more employees in the coming weeks, CEO Andy Jassy said in a memo to staff on Monday. The cuts are on top of the previously announced layoffs that began in November and extended into January. That round totaled more than 18000 employees and primarily affected staffers in its retail, devices, recruiting and human resources groups. Amazon made the decision to lay off more employees as it looks to streamline costs. It took into account the economy, as well as the uncertainty that exists in the near future, Jassy said. The company just wrapped up the second phase of its annual budgeting process, referred to internally as OP2. "The overriding tenet of our annual planning this year was to be leaner, while doing so in a way that enables us to still invest robustly in the key long-term customer experiences that we believe can meaningfully improve customers' lives and Amazon as a whole," Jassy said.
The latest round will primarily impact Amazon's cloud computing, human resources, advertising and Twitch livestreaming businesses, Jassy said in the memo. Twitch CEO Dan Clancy said approximately 400 people would be let go as part of Amazon's latest round of job cuts. Clancy pointed to the economic downturn and said Twitch's user and revenue growth "hasn't kept pace with our expectations." Amazon is undergoing the largest layoffs in company history after it went on a hiring spree during the Covid pandemic. The company's global workforce swelled to more than 1.6 million by the end of 2021, up from 798000 in the fourth quarter of 2019. Jassy is also taking on a broad overview of the company's expenses as it reckons with an economic downturn and slowing growth in its core retail business. Amazon froze hiring in its corporate workforce, axed some experimental projects and slowed warehouse expansion. While the company aims to operate leaner this year, Jassy said he remains optimistic about the company's largest businesses, retail and Amazon Web Services, as well as other new divisions that continue to warrant investment. Amazon shares closed down 1% on Monday. Here's the full memo from Jassy: As we've just concluded the second phase of our operating plan (OP2) this past week, I'm writing to share that we intend to eliminate about 9000 more positions in the next few weeks, mostly in AWS, PXT, Advertising and Twitch. This was a difficult decision, but one that we think is best for the company long term. Let me share some additional context. As part of our annual planning process, leaders across the company work with their teams to decide what investments they want to make for the future, prioritizing what matters most to customers and the long-term health of our businesses. For several years leading up to this one, most of our businesses added a significant amount of headcount. This made sense given what was happening in our businesses and the economy as a whole. However, given the uncertain economy in which we reside, and the uncertainty that exists in the near future, we have chosen to be more streamlined in our costs and headcount. The overriding tenet of our annual planning this year was to be leaner, while doing so in a way that enables us to still invest robustly in the key long-term customer experiences that we believe can meaningfully improve customers' lives and Amazon as a whole. As our internal businesses evaluated what customers most care about, they made re-prioritization decisions that sometimes led to role reductions, sometimes led to moving people from one initiative to another, and sometimes led to new openings where we don't have the right skills match from our existing team members. This initially led us to eliminate 18000 positions (which we shared in January); and, as we completed the second phase of our planning this month, it led us to these additional 9000 role reductions (though you will see limited hiring in some of our businesses in strategic areas where we've prioritized allocating more resources). Some may ask why we didn't announce these role reductions with the ones we announced a couple months ago. The short answer is that not all of the teams were done with their analyses in the late fall; and, rather than rush through these assessments without the appropriate diligence, we chose to share these decisions as we've made them so people had the information as soon as possible.
The same is true for this note, as the impacted teams are not yet finished making final decisions on precisely which roles will be impacted. Once those decisions have been made (our goal is to have this complete by mid to late April), we will communicate with the impacted employees (or, where applicable in Europe, with employee representative bodies). We will, of course, support those we have to let go, and will provide packages that include a separation payment, transitional health insurance benefits and external job placement support. If I go back to our tenet (being leaner while doing so in a way that enables us to still invest robustly in the key long-term customer experiences that we believe can meaningfully improve customers' lives and Amazon as a whole), I believe the result of this year's planning cycle is a plan that accomplishes this objective. I remain very optimistic about the future and the myriad of opportunities we have, both in our largest businesses, Stores and AWS, and our newer customer experiences and businesses in which we're investing. To those ultimately impacted by these reductions, I want to thank you for the work you have done on behalf of customers and the company. It's never easy to say goodbye to our teammates, and you will be missed. To those who will continue with us, I look forward to partnering with you as we make life easier for customers every day and relentlessly inventing to do so. Andy
856
BAD
American society is so focused on race that it is blind to class (economist.com) Affirmative action in American college admissions may be about to end. On October 31st the Supreme Court heard two cases in which lawyers argued that the current practice, which allows universities to favour applicants of some races over others, violates civil-rights laws and the constitution. Judging by the sceptical questioning of the conservative justices, who thanks to Donald Trump now command a majority, the question is not whether such preferences will be restricted but whether they will survive at all. For more than 40 years the court has allowed some positive discrimination. But it has done so with discomfort. Too-obvious tactics like racial quotas or awarding points for skin colour were ruled excessive. The compromise was to consider race as one part of "holistic" admissions, in a way that made its weight hard to discern. In 2003 Justice Sandra Day O'Connor declared the practice ought to be time-limited, expecting it to be unnecessary 25 years from then. If the court rules as expected in June 2023, five years ahead of Ms O'Connor's schedule, there will be some sorrow, but hardly the same backlash as met the overturning of the right to abortion set in Roe v Wade. Surveys show that majorities of African-Americans, Californians, Democrats and Hispanics all oppose the use of race in college admissions (and in other areas). The demise of this unpopular scheme will offer a chance to build something better. A diversity of backgrounds in elite institutions is a desirable goal. In pursuing it, though, how much violence should be done to other liberal principles: fairness, meritocracy, the treatment of people as individuals and not avatars for their group identities? At present the size of racial preferences is large and hard to defend. The child of two college-educated Nigerian immigrants probably has more advantages in life than the child of an Asian taxi driver or a white child born into Appalachian poverty. Such backgrounds all add to diversity. But under the current regime the first is more heavily favoured than the others. Racial preferences are not, however, the most galling thing about the ultra-selective universities that anoint America's elite. The legal case against Harvard, one of the universities defending itself before the Supreme Court, has prised open its admissions records to show the scale of unjustified advantage showered upon the already privileged, disproportionately those who are white and wealthy. A startling 43% of white students admitted to Harvard enjoy some kind of non-academic admissions preference: being an athlete, the child of an alumnus, or a member of the dean's list of special applicants (such as the offspring of powerful people or big donors). A cynic could argue that racial balancing works as a virtue-signalling veneer atop a grotesquely unfair system. A study published in 2017 found that most of Harvard's undergraduates hailed from families in the top 10% of the income distribution. Princeton had more students from the top 1% than the bottom 60%. When this is the case, it seems unfair that it is often minority students, not the trust-funders, who have their credentials questioned. University presidents and administrators who preen about all their diverse classes might look at how Britain, a country of kings, queens, knights and lords, has fostered a university system that is less riven with ancestral privilege.
Unfairness in American education will not be fixed by one court ruling. But it will shock a system in need of reform. Legacy admissions should be ended. Colleges claiming that alumni donations would wither without them should look to Caltech, MIT and Johns Hopkins, top-notch institutions that ditched the practice and, as The Economist went to press, still seemed reputable and solvent. Blunt racial preferences will probably need to be replaced in response to the Supreme Court. But a less socially divisive system, based on income, could take their place. That would do a better job of taking actual disadvantage into account. It would still favour non-white and non-Asian Americans, because they are more likely to be poorer, but would do so using a racially neutral method. In some ways the question of who gets into a handful of elite universities is a distraction from the deeper causes of social immobility in America. Schooling in poorer neighbourhoods was dismal even before covid-19. The long school closures demanded by teachers' unions wiped out two decades of progress in test scores for nine-year-olds, with hard-up black and Hispanic children worst affected. Efforts to help the needy should start before birth and be sustained throughout childhood. Nothing the Supreme Court says about the consideration of race in college admissions will affect the more basic problem that too few Americans from poorer families are sufficiently well-nurtured or well-taught to be ready to apply to college. However the court rules, that is a debate America needs to have. This article appeared in the Leaders section of the print edition under the headline "Making a meritocracy".
891
BAD
Americans are poorly served by their grocery stores (economist.com) Americans have long been proud of their supermarkets. The first grocer with food in aisles, to be picked up by shoppers rather than kept behind a counter, opened in Memphis, Tennessee, in 1916. In Bentonville, the hometown of Walmart in north-west Arkansas, Americans flock to a bombastic museum celebrating the firm's founder, Sam Walton, and his commitment to bringing low prices to underserved rural communities. Today Americans spend a smaller share of their income on food to be cooked at home, about 6%, than people in almost any other rich country. Low prices are reflected in low profit margins. At Kroger, America's second-biggest supermarket chain, the net margin after taxes is about 1.2%; at Walmart, the biggest, it is 2.3%. Yet Americans are finding themselves served less and less well by their supermarkets. Food prices have risen by 11% over the past year, which is sharply more than in other rich economies. As customers change their shopping habits in response, that in turn is shaking up the grocery business. A fall in the share prices of big supermarket firms last month, as several reported they had overestimated demand and stocked too much, hints at the effect. Even before the current bout of inflation, food prices in America had been rising faster than most other prices for the previous 20 years. A study in 2017 by the UN's Food and Agriculture Organisation found that the cost of eating healthily in America was 65% more than in Britain, and among the highest in the rich world (see chart). The inflation figures suggest that this will, if anything, have worsened since. Though Americans still spend a smaller proportion of their income on supermarket food than Europeans, the gap has been narrowing. In absolute terms they spend more, even though they also eat out more and eat less healthy, cheaper foodstuffs. Why are American consumers not getting a better deal? A transatlantic comparison is revealing. Walmart, which accounts for about 26% of the American market, has a gross margin (its profit before fixed costs like rent and labour are taken into account) of about 25%. For Tesco, Britain's largest chain, which has 27% of its home market, the equivalent figure is 8%. Since the two firms both have low net margins (overall profitability), this suggests that Walmart has higher fixed costs and has to charge a big mark-up. American retail is "high cost, high touch," says Simeon Gutman of Morgan Stanley, a bank: customer experience is prioritised over price. Walmart's shops are enormous, selling plenty besides groceries. Tesco operates over 4000 shops, the bulk of which are its smallest. Walmart has only 4700 shops in total, despite serving a population about five times larger. On average its supermarkets are ten times larger by floor space than Tesco's. Bigger stores increase consumer choice. A Walmart supercentre might stock 140000 different items, compared with just 40000 at Tesco's biggest branches. But it also means they may be wasting a lot of space by stocking products that do not sell, adding to costs. American supermarkets sell far less per square foot of shop space than British supermarkets, notes Bryan Roberts, a consultant based in London. Americans are used to perks such as having someone to bag their groceries for them. "There is an almost old-school service that has been a part of the model that hasn't really existed in Europe for some time," says Simon Johnstone of Kantar, a research firm.
But this does not come at no cost. Cheaper own-brand products have penetrated less, making up less than a fifth of sales by value, compared with more than 40% in Europe. "We know that American supermarkets have to start cutting down the assortment; we know that they have to start moving to more automated processes," says Mr Johnstone. Another problem is that America's supermarkets, despite often lacking competition locally, are less concentrated nationally. Regional firms may have the power to squeeze customers, but they do not have the size that European supermarkets have to resist price increases by suppliers. Their suppliers are raking it in. Last month Tyson Foods, America's biggest meat supplier, announced its latest quarterly profits were 74% higher than in the same period last year. Meat prices have spiked so much that in January the Biden administration announced a plan to try to boost competition in the industry. For retailers, however, sharper competition is now arriving. Last month Walmart's chief financial officer, Brett Biggs, told CNBC, a broadcaster, that its customers are increasingly looking for cheaper products, squeezing margins, which in turn squeeze the firm's profits. Shops such as Costco and Walmart's Sam's Club, where customers can buy in bulk more cheaply, are thriving. Among the supermarket chains expanding fastest in America is Aldi, a German discounter, which has more than 2000 shops. They are Spartan but affordable. Another is Trader Joe's, a Californian company now owned by the family of Aldi's founder, famous for its $2 range of wine ("Two Buck Chuck"). Both operate smaller supermarkets, typically around 15000 square feet (1400 square metres), with more limited ranges and far more private-label products that cost much less, on shelves that are refreshed more often. Customers adore them. Amazon, as well as owning Whole Foods, an upmarket grocer, now runs 29 Fresh supermarkets, mostly in suburbs of large cities. At these stores shoppers need not use a till at all, instead being automatically charged as they pick items off the shelves and walk out. Until recently, home delivery of groceries in America (unlike Europe) has often relied on gig-economy workers walking round ordinary supermarkets, a costly model. But that is also changing. Walmart's head of e-commerce, Tom Ward, is British and has brought ideas pioneered in Britain, such as "dark stores" where goods are packed exclusively for delivery. In Arkansas, Walmart has even experimented with dispatching groceries by drone. On May 24th it announced that the programme is expanding to five more states. How much these ideas will change how Americans shop is unclear. Cutting product lines is often unpopular; big cars and big fridges still support a model of big grocery stores with surprisingly high prices. Still, as consumers fret about inflation, the pressure on American supermarkets to innovate and cut costs is bound to grow. This article appeared in the United States section of the print edition under the headline "The not-so-super market".
894
BAD
Americans have never been so unwilling to relocate for a new job (bloomberg.com)
896
BAD
America's banks are missing hundreds of billions of dollars (economist.com) It is easy to understand how money gets destroyed in a traditional bank run. Picture the men in top hats yelling at clerks in Mary Poppins. The crowds want their cash, and bank tellers are trying to provide it. But when customers flee, staff cannot satisfy all comers before the institution topples. The remaining debts (which for banks include deposits) are wiped out. This is not what happens in the digital age. The depositors fleeing Silicon Valley Bank (SVB) did not ask for notes and coins. They wanted their balances wired elsewhere. Nor were deposits written off when the bank went under. Instead, regulators promised to make SVB's clients whole. Although the failure of the institution was bad news for shareholders, it should not have reduced the aggregate amount of deposits in the banking system. The odd thing is that deposits in American banks are nevertheless falling. Over the past year those in commercial banks have sunk by half a trillion dollars, a drop of nearly 3%. This makes the financial system more fragile, since banks must shrink to repay their deposits. Where is the money going? The answer starts with money-market funds, low-risk investment vehicles that buy short-term government and corporate debt. These saw inflows of $121bn last week as SVB failed. However, money does not actually enter such vehicles, for they are unable to take deposits. Instead, cash that leaves a bank for a money-market fund is credited to the fund's bank account, from which it is used to purchase the commercial paper or short-term debt in which the fund invests. When the fund uses money in this way, it flows to the bank account of whichever institution sells the asset. Inflows to money-market funds should thus shuffle deposits around the banking system, rather than force them out of it. And that is what used to happen. Yet there is one obscure way in which money-market funds may suck deposits from the banking system: the Federal Reserve's reverse-repo facility, which was introduced in 2013. The scheme was a seemingly innocuous change to the financial system's plumbing that may, a decade later, be having a profoundly destabilising impact on banks. In a usual repo transaction a bank borrows from competitors or the central bank and deposits collateral in exchange. A reverse repo does the opposite. A "shadow bank" such as a money-market fund instructs its custodian bank to deposit reserves at the Fed in return for securities. The scheme was meant to aid the Fed's exit from ultra-low rates by putting a floor on the cost of borrowing in the interbank market. After all, why would a bank or shadow bank ever lend to its peers at a lower rate than is available from the Fed? But use of the facility has jumped in recent years, owing to vast quantitative easing (QE) during covid-19 and regulatory tweaks which left banks laden with cash. QE creates deposits: when the Fed buys a bond from an investment fund, a bank must intermediate the transaction. The fund's bank account swells; so does the bank's reserve account at the Fed. From the start of QE in 2020 to its end two years later, deposits in commercial banks rose by $4.5trn, roughly equal to the growth in the Fed's own balance-sheet.
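The plumbing in the last few paragraphs is easier to follow as arithmetic. The toy script below is a sketch with stylised, illustrative figures (in $trn), not actual Fed accounting: a QE purchase swells commercial-bank deposits and reserves together, while cash parked in the reverse-repo facility drains both.

```python
# Toy balance-sheet arithmetic for the flows described above.
# All figures are illustrative, in $trn; "bank" aggregates all commercial banks.
bank = {"reserves_at_fed": 1.0, "deposits": 13.0}
reverse_repo = 0.0  # cash parked at the Fed by money-market funds

def qe_purchase(amount: float) -> None:
    # The Fed buys a bond from an investment fund; the fund's bank deposit and
    # the bank's reserve account at the Fed both grow by the purchase amount.
    bank["reserves_at_fed"] += amount
    bank["deposits"] += amount

def park_in_reverse_repo(amount: float) -> None:
    # A depositor moves cash to a money-market fund, which places it in the
    # reverse-repo facility: deposits and reserves both shrink.
    global reverse_repo
    bank["deposits"] -= amount
    bank["reserves_at_fed"] -= amount
    reverse_repo += amount

qe_purchase(4.5)           # pandemic-era QE: deposits rise with the Fed's balance-sheet
park_in_reverse_repo(1.7)  # funds shift cash to the Fed rather than lend it to banks
print(bank, {"reverse_repo": reverse_repo})
```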
For a while banks could cope with the inflows, because the Fed decided at the start of covid to ease a regulation known as the supplementary leverage ratio (SLR). This stopped the growth in commercial banks' balance-sheets from forcing them to raise more capital, allowing them to safely use the inflow of deposits to increase holdings of Treasury bonds and cash. Banks duly took the opportunity, buying $1.5trn of Treasury and agency bonds. Then, in March 2021, the Fed let the exemption from the SLR lapse. As a result, banks found themselves swimming in unwanted cash. They shrank by cutting their borrowing from money-market funds, which instead chose to park their cash at the Fed. By 2022 the funds had $1.7trn deposited overnight in the Fed's reverse-repo facility, compared with a few billion a year earlier. After the fall of SVB, America's small and midsized banks fear deposit outflows. The problem is that monetary tightening has made them still more likely. Gara Afonso and colleagues at the Federal Reserve Bank of New York find that use of money-market funds rises along with rates, since returns adjust faster than those from bank deposits. Indeed, the Fed has raised the rate on overnight reverse-repo transactions from 0.05% in February 2022 to 4.8%, making it much more alluring than the going bank-deposit rate of 0.4%. The amount money-market funds parked at the Fed through the reverse-repo facility, and thus outside the banking system, jumped by half a trillion dollars in the same period. For those lacking a banking licence, leaving money in the repo facility is a better bet than leaving it in a bank. Not only is the yield considerably higher, but there is simply no reason to worry about the Fed going bust. Money-market funds could in effect become "narrow banks": institutions that back consumer deposits with central-bank reserves, rather than higher-return but riskier assets. A narrow bank cannot make loans to firms or write mortgages. Nor can it go bust. The Fed has long been sceptical of such institutions, fretting that they would undermine banks. In 2019 officials denied TNB USA, a startup aiming to create a narrow bank, a licence. A similar concern has been raised about opening the Fed's balance-sheet to money-market funds. When the reverse-repo facility was set up, Bill Dudley, then the president of the New York Fed, worried it could lead to the disintermediation of the financial system. During a financial crisis it could exacerbate instability, with funds running out of riskier assets and onto the Fed's balance-sheet. There is no sign yet of a dramatic rush. For now the banking system is dealing with a slow bleed. But deposits are growing scarcer as the system is squeezed, and America's small and midsized banks could pay the price. This article appeared in the Finance & economics section of the print edition under the headline "The missing half-trillion".
897
GOOD
America's first high-volume PFAS Annihilator is up and running in W. Michigan (woodtv.com) By Matt Jaworowski. Posted: May 4, 2023 / 04:30 AM EDT. Updated: May 4, 2023 / 12:00 PM EDT. WYOMING, Mich. (WOOD) After five years of development, a four-way business partnership has created the nation's first high-volume, closed-loop PFAS destruction system, and it's right here in West Michigan. Heritage-Crystal Clean runs the system out of its wastewater treatment facility in Wyoming, using technology developed and manufactured by Battelle spin-off Revive Environmental, Allonnia and EPOC Enviro. The long-lasting chemical compounds have made headlines for years in West Michigan as more contamination sites are found and scientists learn more about the health impacts of the chemical compounds. Per- and polyfluoroalkyl substances are a large group of compounds first developed in the 1940s and incorporated into all sorts of products for waterproofing and heat resistance. Decades later, research showed that PFAS compounds take a long time to break down organically and can build up in the human body, causing serious health problems, including cancer. The chemical compounds are so widespread that the Centers for Disease Control and Prevention estimates that 99% of all Americans have a detectable level of PFAS in our bodies. The materials are still manufactured and used. As they are used or thrown away, those chemical compounds end up in our landfills and biosolids. Without treatment, those PFAS compounds recycle through the ecosystem again, in new materials or our groundwater, even in food. This system stops that cycle by destroying the compounds. Revive Environmental President and CEO David Trueba boiled the process down into three main steps. The facility takes in landfill leachate, essentially rainwater that filters through landfills, collecting chemicals and other contaminants. The raw leachate is pushed through three treatments that separate the PFAS compounds from the leachate. That concentrated material is then run through the PFAS Annihilator, which uses super-critical water oxidation, or SCWO, to break the extremely durable PFAS chemical bonds. The end result is clean water and salts, usually sodium or potassium. The water is then sent to public water treatment facilities to be put back into the water system. The partnership is called 4never. It is a play on words for the phrase "forever chemicals," a common reference for PFAS, because the compounds can take years to break down naturally. Trueba considers the 4never partnership a model to be built upon to truly tackle PFAS contamination. "I was at a company where we did a lot of great work with media handling. We could (concentrate) the PFAS and the materials, but that media had a solid waste factor. I had to send it to a landfill, a depot injection or an incinerator. That recycles PFAS," Trueba explained to News 8. "This PFAS loop was never fully broken. Now, for the first time commercially in North America, Heritage-Crystal Clean has invested in the technology. With the Annihilator and this SAFF technology, we can dispose of and eliminate that recycle chain in its tracks." SCWO is not a new concept, but using it to destroy PFAS is a recent endeavor.
Trueba says the project started in earnest in 2018, one year after major PFAS contamination was discovered in northern Kent County, connected to Wolverine Worldwide and the House Street Dump. "There was a realization in 2018, with the first real governmental regulatory action and litigation awards, that PFAS is not only a near-term problem but the ubiquitous and large challenge," Trueba said. "So the leadership at Battelle decided we are going to put a lot of investment in (research and development)." During that research phase, Battelle settled on super-critical water oxidation as the path forward. "What this does is it combines high temperature with high pressure, and that environment creates what's called super-critical water, which essentially means instead of a gas or a liquid, it is both at the same time," Trueba explained. "Oxygen is completely dissolvable in that solution, and all of the chemicals that would normally be non-compatible with water, like oils and greases and organics, are fully soluble. That allows the oxygen to attack the carbon-fluorine bonds and completely destroy, or annihilate, the material in 10 to 30 seconds."

Heritage-Crystal Clean President and CEO Brian Recatto said PFAS really came onto his radar in 2020. "Potential customers needed help with leachate, and they wanted to know if we could handle PFAS compounds, because they were starting to get pushback from publicly owned treatment centers. We obviously said no, because we didn't have any treatment capabilities for PFAS compounds," Recatto told News 8. Recatto eventually stumbled onto an article written by a Battelle scientist about SCWO. He cold-called the company trying to learn more and eventually got the ball rolling. "We had a super meeting about how we could put this partnership together with our waste treatment facility. That kind of started the dialogue, and a year later we did the pilot," he said.

The first PFAS Annihilator was tested in 2022 at the Wyoming facility, treating just five gallons at a time. As a proof of concept, the pilot program was a huge success. The past year has been spent on the logistics end, expanding the size of the process to make it cost-effective. Recatto says the plant currently takes in approximately 100,000 gallons of raw landfill leachate each day, approximately 20 truckloads. That raw leachate generates around 100 gallons of concentrated PFAS material. The leachate currently comes from three local landfills and varies depending on rainfall. Heritage-Crystal Clean's goal is to handle 200,000 gallons of raw leachate daily by the end of May. The ultimate goal, at least for the current system in place, is 500,000 gallons per day. "We can do 500,000 gallons a day. It could probably go up to 600,000, but that's with some optimization and manpower as we get more stable. But we are designing units for even larger capacity," Trueba said.

With the first system up and running, there are two new goals for 4never: tweaking and developing new technology, and putting more of these systems to work. Amy Dindal, the PFAS program manager for Battelle, says her team is turning its attention to nonliquid materials. "Right now, the technology is tuned in on an aqueous or liquid basis, so things like leachate or other liquid matrices. We're now working on slurries and solids, to be able to destroy PFAS in things like soil and other solid materials," Dindal said. Recatto said Heritage-Crystal Clean has expansion plans of its own, but the company is committed to the partnership as well. "We've got additional equipment arriving as we speak.
We've got two units out in the parking lot now that we are going to move to other locations. We have 11 of these plants scattered around North America, with the hope that we will have 20 of them in the next couple of years," Recatto said. "We are going to follow where the customers take us. If (Battelle) needs logistics help, we'll go help them in other places. We have labor, we have equipment to move their equipment. And if (Battelle) lands a project with the (Department of Defense) to treat (aqueous film forming foam, linked to firefighting foam at air bases) and needs our help, we'll certainly help."

Copyright 2023 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.
899
BAD
An Interactive Introduction to Fourier Transforms (2019) (jezzamon.com) Jez Swanson

Fourier transforms are a tool used in a whole bunch of different things. This is an explanation of what a Fourier transform does, and some different ways it can be useful. And how you can make pretty things with it, like this thing: I'm going to explain how that animation works, and along the way explain Fourier transforms! By the end, you should have a good idea about: We're going to leave the mathematics and equations out of it for now. There's a bunch of interesting maths behind it, but it's better to start with what it actually does, and why you'd want to use it, first. If you want to know more about the how, there are some further reading suggestions below!

Put simply, the Fourier transform is a way of splitting something up into a bunch of sine waves. As usual, the name comes from some person who lived a long time ago, called Fourier. Let's start with some simple examples and work our way up. First up, we're going to look at waves: patterns that repeat over time. Here's an example wave. This wavy pattern here can be split up into sine waves. That is, when we add up the two sine waves, we get back the original wave. The Fourier transform is a way for us to take the combined wave and get each of the sine waves back out. In this example, you can almost do it in your head, just by looking at the original wave. Why? Turns out a lot of things in the real world interact based on these sine waves. We usually call them the wave's frequencies. The most obvious example is sound: when we hear a sound, we don't hear that squiggly line, we hear the different frequencies of the sine waves that make up the sound. Being able to split them up on a computer can give us an understanding of what a person actually hears. We can understand how high or low a sound is, or figure out what note it is.

We can also use this process on waves that don't look like they're made of sine waves. Let's take a look at this guy. It's called a square wave. It might not look like it, but it also can be split up into sine waves. We need a lot of them this time (technically an infinite amount) to perfectly represent it. As we add up more and more sine waves, the pattern gets closer and closer to the square wave we started with. Drag the slider to play with how many sine waves there are. Visually, you'll notice that actually the first few sine waves are the ones that make the biggest difference. With the slider halfway, we have the general shape of the wave, but it's all wiggly. We just need the rest of the small ones to make the wigglyness flatten out. When you listen to the wave, you'll hear the sound get lower, because we're removing the higher frequencies. This process works like that for any repeating line. Give it a go, try drawing your own! Move the slider to see how, as we add more sine waves, it gets closer and closer to your drawing. Again, aside from the extra wigglyness, the wave looks pretty similar with just half of the sine waves.

We can actually use the fact that the wave is pretty similar to our advantage. By using a Fourier transform, we can get the important parts of a sound, and only store those, to end up with something that's pretty close to the original sound. Normally on a computer we store a wave as a series of points. What we can do instead is represent it as a bunch of sine waves. Then we can compress the sound by ignoring the smaller frequencies.
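To make that keep-only-the-big-frequencies idea concrete, here is a minimal sketch in Python with numpy (not from the original article; the tone frequencies and the number of kept frequencies are arbitrary, illustrative choices):

import numpy as np

# One second of a two-tone signal, sampled at 8 kHz.
rate = 8000
t = np.arange(rate) / rate
wave = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 880 * t)

spectrum = np.fft.rfft(wave)             # wave -> amplitudes of its sine waves
k = 20                                   # keep only the 20 strongest frequencies
cutoff = np.sort(np.abs(spectrum))[-k]   # magnitude of the k-th strongest one
spectrum[np.abs(spectrum) < cutoff] = 0  # ignore the smaller frequencies
approx = np.fft.irfft(spectrum, n=len(wave))  # rebuild the wave from what's left

print(np.max(np.abs(wave - approx)))     # close to zero: the waves nearly match

Storing the handful of surviving coefficients instead of all 8,000 samples is the compression.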
Our end result won't be the same, but it'll sound pretty similar to a person. This is essentially what MP3s do, except they're more clever about which frequencies they keep and which ones they throw away. So in this case, we can use Fourier transforms to get an understanding of the fundamental properties of a wave, and then we can use that for things like compression.

Ok, now let's dig more into the Fourier transform. This next part looks cool, but also gives you a bit more understanding of what the Fourier transform does. But mostly looks cool. Now, at the start I said it splits things into sine waves. The thing is, the sine waves it creates are not just regular sine waves, but they're 3D. You could call them "complex sinusoids." Or just "spirals." If we take a look from the side, they look like sine waves. From front on, though, these look like circles. So far, everything we've been doing has only required the regular 2D sine waves. When we do a Fourier transform on 2D waves, the complex parts cancel out, so we just end up with sine waves. But we can use the 3D sine waves to make something fun looking, like this: What's going on here? Well, we can think of the drawing as a 3D shape, because of the way it moves around in time. If you imagine the hand being drawn by a person, the three dimensions represent where the tip of their pencil is at that moment. The x and y dimensions tell us the position, and then the time dimension is the time at that moment. Now that we have a 3D pattern, we can't use the regular 2D sine waves to represent it. No matter how many of the 2D sine waves we add up, we'll never get something 3D. So we need something else. What we can use is the 3D spiral sine waves from before. If we add up lots of those, we can get something that looks like our 3D pattern.

Remember, these waves look like circles when we look at them from front on. The name for the pattern of a circle moving around another circle is an epicycle. Use the slider to control how many circles there are. Like before, we get a pretty good approximation of our pattern with just a few circles. Because this is a fairly simple shape, all the last ones do is make the edges a little sharper. All this applies to any drawing, really! Now it's your chance to play around with it. Use the slider to control how many circles are used for your drawing. Again, you'll see that for most shapes we can approximate them fairly well with just a small number of circles, instead of saving all the points.
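Here is a rough sketch of how those circle drawings can be computed, again in Python with numpy (not from the article; the square-shaped path is just an example stand-in for a drawing): treat every point of the path as the complex number x + iy, take a Fourier transform, and each coefficient you get back describes one circle.

import numpy as np

# A closed path as complex numbers (x + iy): here, a square traced point by point.
corners = np.array([1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j, 1 + 1j])
path = np.concatenate([
    np.linspace(corners[i], corners[i + 1], 64, endpoint=False)
    for i in range(4)
])  # 256 points around the square

coeffs = np.fft.fft(path)               # one complex coefficient per spiral/circle
keep = 11                               # the "slider": how many circles to use
top = np.argsort(-np.abs(coeffs))[:keep]
few = np.zeros_like(coeffs)
few[top] = coeffs[top]                  # drop everything but the biggest circles
redrawn = np.fft.ifft(few)              # trace the path again using only those

print(np.mean(np.abs(path - redrawn)))  # small: a few circles already trace a square

For an epicycle animation, coefficient k corresponds to a circle of radius abs(coeffs[k]) / len(path) that spins k times per loop, starting at angle np.angle(coeffs[k]).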
Can we use this for real data? Well, we could! In reality, we have another data format called SVG, which probably does a better job for the types of shapes we tend to create. So for the moment, this is really just for making cool little gifs. There is another type of visual data that does use Fourier transforms, however. Did you know Fourier transforms can also be used on images? In fact, we use it all the time, because that's how JPEGs work! We're applying the same principles to images: splitting up something into a bunch of sine waves, and then only storing the important ones. Now that we're dealing with images, we need a different type of sine wave. We need to have something that, no matter what image we have, we can add up a bunch of these sine waves to get back to our original image. To do that, each of our sine waves will be images too. Instead of a wave that's a line, we now have images with black and white sections. To represent the size of a wave, each image will have more or less contrast. We can also use these to represent color in the same way, but let's start with black-and-white images for now. To represent colorless images, we need some horizontal wave images, along with some vertical wave images. By themselves, just horizontal and vertical images aren't enough to represent the types of images we get. We also need some extra ones, which you get by multiplying the two together. For an 8x8 image, here are all the images we need. If we take the images, adjust their contrast to the right amount, and then add them up, we can create any image.

Let's start with this letter 'A'. It's pretty small, but we need it to be small, otherwise we'll end up with too many other images. As we add more and more of these images, we end up with something that becomes closer and closer to the actual image. But I think you'll see the pattern here: we get a reasonable approximation with just a few of them. For actual JPEG images, there are just a few extra details. The image gets broken up into 8x8 chunks, and each chunk gets split up separately. We use a set of frequencies to determine how light or dark each pixel is, and then another two sets for the color: one for red-green, and another for blue-yellow. The number of frequencies that we use for each chunk determines the quality of the JPEG. Here's a real JPEG image, zoomed in so we can see the details. When we play with the quality levels, we can see this process happen.

So let's recap: This is just scratching the surface into some applications. The Fourier transform is an extremely powerful tool, because splitting things up into frequencies is so fundamental. They're used in a lot of fields, including circuit design, mobile phone signals, magnetic resonance imaging (MRI), and quantum physics! I skipped most of the math stuff here, but if you're interested in the underlying principles of how it works, here are some questions you can use to guide your research: To learn more, some really good resources you can check out are:

An Interactive Guide To The Fourier Transform: a great article that digs more into the mathematics of what happens.
But what is the Fourier Transform? A visual introduction: a great YouTube video by 3Blue1Brown, also explaining the maths of Fourier transforms from an audio perspective.
A Tale of Math & Art: Creating the Fourier Series Harmonic Circles Visualization: another article explaining how you can use epicycles to draw a path, explained from a linear algebra perspective.
Fourier transform (Wikipedia): and of course, the Wikipedia article is pretty good too.

I'm Jez! Full time, I work at a search company in the Bay Area, and in my spare time I like making games and interactive code things like this! This webpage is open-source; you can check out the code on GitHub! If you have any feedback or want to ask any questions, feel free to email me at fourier [at] jezzamon [dot] com, or shoot me a tweet on Twitter. If you want to see more of my work, check out my homepage, and if you want to see what I'm making next, you can follow my Twitter account @jezzamonn!
958
BAD
An account was suspended (twitter.com/benthompson)
913
BAD
An epic Nessun Dorma that leaves a Royal Albert Hall audience in awe (classicfm.com) 15 May 2023, 20:11 | Updated: 19 May 2023, 15:18. An epic 'Nessun dorma' from opera star Michael Spyres at Classic FM. By Kyle Macdonald.

One of the great singers of our time brings Puccini's timeless opera masterpiece to an iconic concert hall. It's a performance not many will ever forget. "Vincerò!", or "I will win!": it's the famous and always moving climax to the aria from Puccini's opera Turandot. The aria is one of those very special moments of music that has gripped and enchanted millions over the years. It found truly global fame in the 1990s thanks to Luciano Pavarotti, Italia 90, and Three Tenors CDs that were on virtually everyone's shelf. Its drama and power, combined with that enduring public appeal, make it a natural show-stopper in live performances. And so it was on this night, before 5,000 people at London's Royal Albert Hall, during an opera-themed Classic FM Live with Viking. But there's always more than just those high notes at the very end of the aria. And on that night, there was a singer perfectly matched to reveal every glorious, moving moment of Nessun Dorma. Watch above. Read more: What are the lyrics to Puccini's Nessun Dorma?

Michael Spyres is an American opera singer. He's rare for the fact that he can sing in both baritone and tenor voice types. His range spans from the rich and resonant baritone register up to the highest notes of the most agile bel canto tenor. Alongside the Puccini, he also gave us a Rossini baritone aria that night. It's one of the reasons why this performance was so special. Puccini sets the scene with hushed orchestra, on this occasion played by the English National Opera and conductor Paul Daniel. The singer then enters, pleading "Nessun dorma", first in the middle of his register, and then low. Then that powerful lyricism starts. This is all perfect territory for Spyres to show off that sonority and voice. From these opening cries to the final "Vincerò!" on that winning high B, it's a journey like few others in music. Spyres held that hall in rapt silence before everyone erupted. Nessun dorma does it every time. Michael's latest album on Erato, Contratenor, is available now. Relive the magic of Classic FM Live on Sky Arts soon.
938
BAD
An experimental beach barrier could be key to rebuilding eroding coastlines (hakaimagazine.com) This article was originally published in High Country News, a magazine about the American West's environment and communities, and is reproduced here as part of the Climate Desk collaboration. Read more stories like this at hcn.org.

David Cottrell stands on what used to be a four-meter-high cliff at the crumbled end of Blue Pacific Drive. Just a few years ago, this was the fastest-eroding shoreline on the Pacific Coast of the United States; locals here in North Cove, Washington State, dubbed it Washaway Beach. But as Cottrell walks toward the water on a sunny November morning, he steps not off a cliff but onto soft, dry sand. Thigh-high dune grasses sprawl in all directions. The low tide laps at a flock of sandpipers a short distance away. Cottrell, a cranberry farmer and local drainage commissioner, holds up a laminated map, pointing to our location. During his childhood, this was part of a dense beachside neighborhood, but the tides have swept most of it away, a complex phenomenon related to dams and jetties that have changed the flow of sediments. "Where we're standing right now, we were losing 50 to 100 feet [about 15 to 30 meters] a year," he says. All told, North Cove has lost more than 10 square kilometers of land, plus a lighthouse, a cannery and 160 structures. By 2015, many residents had given up on saving their town. Facing predictions of continued erosion, agencies had begun talk of moving Highway 105 away from the coast, a loss that could doom this isolated rural community. An essential transportation artery, the highway serves as the dike that protects some 324 hectares of historic bogs where Cottrell and other farmers grow more than half the state's cranberries.

Cottrell felt he had to try something. "We had absolutely nothing to lose," he says. So in 2016, Cottrell dropped US $400 worth of rocks from the end of this road: one load, right off the end, just to see what would happen. He sought to mimic the cobble beaches and basalt slides that are common in the Pacific Northwest. That experiment has since grown into a more than two-kilometer-long berm of rocks and stumps that shifts with the waves and collects sand, rebuilding the beach. As a result, much of this coastline has held, putting North Cove at the forefront of a global shift in how communities protect their coastlines as sea levels rise. Engineers, who have long depended on rigid sea walls, are now closely watching this softer approach. North Cove's solution, which resembles the techniques many Indigenous communities use to cultivate shellfish, looks less like the conventional structures engineers know and more like the dunes and berms that centuries of storms and tides build on their own.

David Cottrell surveys Washaway Beach, Washington State, in November 2022, where his community's experiments with piles of loose cobble have transformed the shoreline. Whenever he walks here, he grabs fistfuls of seed heads to scatter. Photo by Sarah Trent / High Country News

Cottrell stands in the salty breeze wearing his signature black Carhartt jacket. On the back, hand-painted letters read "Washaway No More." Most days, he walks the beach, troubleshooting the remaining hotspots with landowners and explaining the still-evolving project to visitors. "The people that get this best are surfers and Buddhists," Cottrell had told me earlier. "In a situation that's in constant flux, what you want to do is position yourself to go with it."
North Cove was built on land near the Columbia River outlet that has always been at the mercy of intense waves, El Niño-driven storms, tidal currents, flowing sediment and tangles of driftwood. Over millennia, these forces built a long sandy spit at the mouth of Willapa Bay, just south of North Cove. Storms swept sand away each winter, then currents replenished it each summer, until they didn't, Cottrell says, for reasons scientists are only beginning to understand. Maps show that the trend had started by the early 1900s; researchers believe a series of jetties and the 1930s damming of the Columbia River, which both changed sediment flow in the region, contributed to it. Over decades, the spit was whittled down to a nub. The rising tides and intensifying storms of climate change only hastened its undoing. That collision of forces made Washaway Beach a terrible candidate for any protective efforts, says George Kaminsky, a coastal engineer at the Washington State Department of Ecology. But since Cottrell couldn't make anything worse, he decided to try something unorthodox, setting the stage for an experiment whose results global experts, including Kaminsky, are now researching. After Cottrell dropped that first load of rock, nature took over: when waves hit the pile, the water spread out instead of smashing against the steep, eroded bank. Stones migrated and settled. Sand collected in between.

Dune grasses and sand verbenas help to stabilize the sand that collects on the berm. Photo by Sarah Trent / High Country News

This galvanized the community, and in 2016 a group led by Charlene Nelson, chairwoman of the nearby Shoalwater Bay Tribe, expanded the project. Using a $600,000 state grant, they made a scrappy version of what engineers call a dynamic revetment: a long cobble berm along the top of the beach. Using the cheapest unsorted rock they could find, they dumped piles along more than a kilometer and a half of bank, letting the waves sort them into place. Then, lower down, near the highest average waterline, they spread the same jagged cobbles into a "speed bump" just under a meter tall. Together, these structures build back the beach: as waves trip over the speed bump and slosh through the berm, they slow and drop sand. The first year both were in place, the beach near this road-end grew by about 15 meters. The next year, it kept growing.

As climate change progresses, coastal communities nearly everywhere are searching for solutions. Hard barriers like sea walls and riprap won't cut it in many places; they do block water, but often cause further erosion. They're also so expensive that few can afford them. US climate models show sea level rise locked in at around 30 centimeters on average nationwide by 2050. In Washington State alone, that is forecast to cause billions of dollars in damage. By 2100, the state expects catastrophic land loss, including 44 percent of tidal flats and 65 percent of estuarine beaches at key sites along the coast, places that myriad coastal species, including humans, rely on for food and protection. Coastal resilience experts believe building beaches back could be enough to prevent some of this. Kaminsky's research on the berm has already influenced projects nearby and in California, Europe and Guam. Together, these experiments promise to transform the tools that agencies and communities can apply elsewhere. To create any protective structure, engineers need design standards. The data to establish them didn't exist until communities like North Cove started trying.
"If you've not been out here, it's hard to wrap your brain around what's really going on," says Lauren Bauernschmidt, a state Department of Fish and Wildlife biologist, standing on loose cobble. After working with Cottrell for five years, she is due to issue him a new maintenance permit and needs her boss's sign-off. She and Cottrell are also trying to drum up more funding and buy-in from the many agencies involved, so they have assembled a cadre of colleagues to bring them up to speed. On this breezy, blue-sky morning near the road's end, the once-threatening waterline is far offshore. The speed bump, Cottrell tells the group, is buried under almost a meter of sand. Clam beds, long absent, have returned, along with grasses and shorebird habitat. Even when winter storms pull sand away (the way of things here), the cobble remains to restart the beach-building process. Now that this section of shore seems stable, Cottrell says, "my hope is that this is hands-off forever."

But down the beach, trouble spots remain. Farther south, the beach narrows until it reaches a prominent finger of land, a single home atop it, that has so far defied the tides. Surrounded by a sea wall of giant boulders, it has become a landmark at the center of this project. Even the day's gentle waves deflect off the wall toward the banks beside it. Stronger ones have carved deeply into the adjacent shoreline, threatening to turn the point into an island: a reminder of the pitfalls of bulwark structures in a naturally ephemeral environment. The worst erosion is on the southern side. There, a vertical cliff edge flanks a narrow curve of beach. Over the previous year, seven spruce trees on that neighboring property lost their footing, toppling into the surf. An eighth leans ominously. This vulnerable strip of land, owned by Ed Borden, has become a linchpin for North Cove. "From here to the highway is about 400 feet [122 meters]," Cottrell says. "That could go in one or two nights in a big storm." With it would go the roadway, homes and cranberry bogs behind it. Cottrell hopes to drop more cobble around the wall to re-establish a beach, which would slow the waves or even prevent them from reaching the sea wall. At the edge of his land, Borden stacks hay bales with a mini excavator, hoping they too might slow the ocean's inland creep. Throughout the year, Borden and Cottrell dumped thousands of tonnes of cobble along this bank, but the wash off the sea wall was too strong. Despite (maybe because of) its impact here, that wall remains a seductive solution. Borden eyes the fortress, which stands deceptively steady. He isn't sure yet about the small cobbles; he has yet to see whether they worked as planned. "I need a bigger excavator, bigger rock," he starts to explain. "Or we could get you your sand beach back," Cottrell counters, glancing to the surf. "Nothing dissipates wave energy like a good beach."
945
BAD
An overview of modern Japanese wood construction (2004) [pdf] (forum-holzbau.com)
986
GOOD
An overview of single-purpose Linux distributions (lwn.net) Subscribers to LWN.net made this article and everything that surrounds it possible. If you appreciate our content, please buy a subscription and make the next set of articles possible. February 14, 2023. This article was contributed by Koen Vervloesem.

Many people, when they are installing a Linux distribution for a single purpose such as running containers, would prefer an install-and-forget type of deployment. At FOSDEM 2023 in Brussels, several projects of this minimal Linux distribution type were presented. Fedora CoreOS, Ubuntu Core, openSUSE MicroOS, and Bottlerocket OS all tackle this problem in their own way. The talks at FOSDEM gave an interesting overview of how these projects differ in their approaches.

Fedora CoreOS

Akashdeep Dhar and Sumantro Mukherjee, who are both members of the Fedora Council and work at Red Hat as software engineers, explained how they use Fedora CoreOS as the base operating system to run multiplayer game servers in containers. As described in its documentation, Fedora CoreOS is an "automatically updating, minimal, monolithic, container-focused operating system". Fedora CoreOS (sometimes abbreviated FCOS) provides the host operating system for these containers; it only includes those packages that are needed for a minimal, networking-enabled, and container-ready setup. At the time of this writing, the latest stable release had 415 packages. It supports the x86_64, aarch64 (including the Raspberry Pi 4), and s390x architectures; it runs on bare metal, virtualized, or on various cloud platforms. A Fedora CoreOS machine is provisioned using Ignition, which is a tool that partitions disks, formats partitions, enables systemd units, and configures users. Ignition only runs once, during the first boot of the system, from the initramfs. An Ignition configuration file is formatted as JSON, but for end users Fedora CoreOS recommends using a Butane configuration, which is a YAML file that Butane translates into an Ignition configuration. The "System Configuration" section in Fedora CoreOS's documentation shows some examples of how to configure storage, network, containers, users and groups, time zones, and more in a Butane configuration. In their talk, Dhar and Mukherjee showed a Butane configuration to set up a Minecraft server in a container, and they also published it in their GitHub repository.
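Since the speakers' configuration isn't reproduced here, the following is a minimal sketch (not their actual file) of what a Butane configuration contains, written as a Python dict and dumped as YAML; the variant, version, and passwd fields follow the Butane "fcos" schema as documented, and the SSH key is a placeholder:

import yaml  # PyYAML, assumed to be installed

# A minimal Butane config: provision an SSH key for the default "core" user.
# The butane tool would translate the YAML form of this into Ignition JSON.
butane_config = {
    "variant": "fcos",
    "version": "1.4.0",
    "passwd": {
        "users": [
            {
                "name": "core",
                "ssh_authorized_keys": ["ssh-ed25519 AAAA... user@example"],
            },
        ],
    },
}

print(yaml.safe_dump(butane_config, sort_keys=False))

Feeding the printed YAML through butane then produces the Ignition JSON that the installer consumes on first boot.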
When installing Fedora CoreOS, you choose one of three update streams. "Next" is for experimenting with new features, "testing" represents what is coming in the next stable stream, and "stable" is the stream with changes that have spent time in the testing stream. Most end users should choose the stable stream. You refer to the Ignition file with your customizations in a manner that depends on your installation type. For instance, when installing from PXE, you append the coreos.inst.ignition_url=URL option to the kernel, referring it to the location of the Ignition file on a web server. After installation, the system is updated automatically when a new release is rolled out on the chosen stream. The Zincati agent checks for operating-system updates and applies them using rpm-ostree. Zincati can be configured as well; for example, one can configure how wary it is to update (that is, how early in the phased rollout cycle it receives updates) and how eager it is to reboot after applying an update (immediately, or only within configured maintenance windows). If an update causes problems, the user is always able to manually roll back to the previous system state with:

$ sudo rpm-ostree rollback -r

There are no dnf or yum commands in Fedora CoreOS. Extending the package set is done with rpm-ostree, which layers the packages on top of the current operating-system image. But since Fedora CoreOS is a container-focused system, extra services would generally be installed as containers.

Ubuntu Core

Canonical's Valentin David talked about Ubuntu Core. According to the project's home page, it's a "secure, application-centric IoT OS for embedded devices". Ubuntu Core targets high-end embedded devices, such as industrial computers for IoT gateways, signage, robotics, and automotive applications; at home, it could be useful on a Raspberry Pi to run services such as Nextcloud or home-automation software. The distribution's software is based on Ubuntu's main operating-system builds, but without using deb packages or the dpkg and apt commands. Instead, it only uses snaps to install software. In essence, a snap package is a squashfs image with some metadata about how to install and run the software. Snaps are isolated from other snaps and the underlying operating system. If a snap is run in strict confinement, it runs in a sandbox making use of AppArmor, seccomp, and control groups. By default, snaps don't have access to resources outside of the sandbox, but they can get access to specific resources using interfaces.

David explained that there are five types of snaps in Ubuntu Core. The gadget snap contains device-specific or architecture-specific components, such as the boot loader, device tree, board-specific packages, and configurations. The kernel snap comes with the Linux kernel, modules, firmware, and systemd stubs. The base snap contains the root file system for the Ubuntu Core operating system. The snapd snap has snapd, the daemon that installs and updates all snaps. And last but not least, each application is packaged in an application snap. This includes the root file system of a base snap; it can also make services and commands available to the underlying operating system. The gadget snap also describes the disk layout. Ubuntu Core typically has four partitions. On UEFI systems, the seed partition is the EFI System Partition (ESP), containing the configuration for the first-stage boot loader and at least one recovery system. The boot partition contains the second-stage boot loader, a kernel, and an initramfs that decrypts the save and data partitions. The latter two are LUKS2 encrypted. The save partition contains a backup of the device identity and other data to facilitate recovery, while the data partition stores the user and system data. Most of the Ubuntu Core operating system is read-only. For instance, /etc and /var are read-only by default. However, specific paths are bind-mounted from the data partition, such as /etc/systemd, which allows the system to change any systemd unit files for the services and mount the snap's squashfs image. Transactional updates are handled by snapd: if an update of a snap fails, the system automatically rolls back to the previous version of the snap.

openSUSE MicroOS

Ignaz Forster, research engineer at SUSE, described the design of openSUSE MicroOS. It's a rolling-release distribution of openSUSE Tumbleweed, developed to run as a single-purpose system. A typical target would be for hosting containers, but it can even be used to create a minimal desktop.
As with Fedora CoreOS and Ubuntu Core, openSUSE MicroOS automatically updates itself and has a minimal package selection. In openSUSE MicroOS's case, these are just RPM packages from openSUSE's repositories. There's also an enterprise version, SUSE Linux Enterprise Micro, and a community version based on the latter, Leap Micro. OpenSUSE MicroOS has a read-only root file system using Btrfs. Transactional updates are handled by a SUSE-specific wrapper script around the package manager zypper, transactional-update. This creates a new Btrfs snapshot of the root file system and then performs an update of the system. If the installation was successful, the script marks the new snapshot as the default snapshot. On errors, the snapshot is discarded and the previous one remains as the default. A reboot activates the new snapshot.
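The broad shape of that snapshot, try the update, then commit-or-discard cycle can be sketched in a few lines of Python (a schematic illustration only, with hypothetical paths; the real transactional-update script also handles mounts, boot-loader entries, and Snapper metadata):

import subprocess

def run(*cmd):
    # Run a command and raise on failure, so a failed step aborts the update.
    subprocess.run(cmd, check=True)

def transactional_update(root="/", snap="/.snapshots/update"):
    # 1. Snapshot the current root file system into a writable Btrfs snapshot.
    run("btrfs", "subvolume", "snapshot", root, snap)
    try:
        # 2. Update packages inside the snapshot; the running system is untouched.
        run("chroot", snap, "zypper", "--non-interactive", "up")
    except subprocess.CalledProcessError:
        # 3a. On errors, discard the snapshot; the previous root stays the default.
        run("btrfs", "subvolume", "delete", snap)
        raise
    # 3b. On success, make the snapshot the default subvolume for the next boot.
    run("btrfs", "subvolume", "set-default", snap)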
Forster announced that, since all of the read-only parts of openSUSE MicroOS have now been moved to /usr, the upcoming 4.2.0 release of transactional-update would also be able to apply new snapshots without rebooting. MicroOS runs a health-checker systemd service that checks whether the system boots as expected after an update. This starts an automatic rollback to the previous default snapshot of the root file system if the system isn't healthy. The original transactional-update script has been evolving into a generic library for atomic system updates, libtukit. The current implementation only supports Btrfs with openSUSE's snapshot utility Snapper, as used in openSUSE MicroOS. But according to Forster, the API is developed to support other backends. In contrast to Ubuntu Core, all of /var and /etc are writable, while /usr is read-only. For instance, the default system configuration is put in /usr. Only changes made by the administrative user are in /etc. OpenSUSE's libeconf merges the configuration files placed in several locations. Most of the default MicroOS packages have been changed to work with this. Only /etc/fstab does not follow this convention yet. Forster concluded that openSUSE MicroOS takes a pragmatic approach to use existing infrastructure and packages, and that it's "a functional read-only OS in an imperfect world".

Bottlerocket

While the previous three operating systems originated from a general-purpose mother distribution, Bottlerocket is created by Amazon, tailored to host containers in its Amazon Web Services (AWS) cloud. In his talk, AWS software development engineer Sean McGinnis was quick to emphasize that the operating system is backed by AWS, but not AWS-only. As an example, the project's GitHub repository has instructions to run it on bare-metal servers. Bottlerocket was announced in March 2020 and made generally available in August 2020. To keep its footprint as small as possible, Amazon publishes variants for particular use cases. For instance, there's an aws-k8s variant with containerd and kubelet to run as a Kubernetes node on AWS, a vmware-k8s variant to do the same on VMware with Amazon Elastic Kubernetes Service (EKS), and a metal-k8s variant that supports Amazon EKS running on bare metal. Bottlerocket runs two completely separate container runtimes. One is running host containers for operational tasks. The other one is used for running containers with an orchestrator, such as Kubernetes pods. Both runtimes have different security profiles. Each container, be it a host container or a for-orchestrator container, is running an API client that talks over a Unix socket to an API server running on Bottlerocket. When Bottlerocket boots, its boot configuration (including user data) is loaded into the API server. User interaction is also typically done through this API, to make real-time changes to the system configuration. One of the host containers is the control container, which is launched on boot. This container is used to configure the Bottlerocket host. Another host container is the admin container. This isn't launched by default: it should only be launched in exceptional circumstances, to troubleshoot the host operating system. It has additional privileges and can use the root process namespace to access the other containers for troubleshooting purposes. The admin container runs an SSH server that is reachable through the host's primary network interface. A final type of host container is the bootstrap container: this bootstraps the host before services like Kubernetes or Docker start. It has additional permissions, for instance to provide access to the underlying host file system.

Security is one of the focal points of Bottlerocket. The root file system is read-only, and /etc is backed by a tmpfs file system that is regenerated on boot. For container images and volumes, a separate user partition is mounted. Moreover, there's no package manager, no shell, and no Python interpreter. If an attacker is able to escape a container, "there are not many tools to work with," McGinnis said. To check the integrity of the block devices, Bottlerocket uses dm-verity. The kernel boots in lockdown mode, which prevents the root user from modifying the kernel. McGinnis explained that this increases assurance that the running kernel corresponds to the booted kernel. Another security feature he emphasized is that Bottlerocket runs with SELinux in enforcing mode. For updates, Bottlerocket uses an image-based model. The kernel, system packages, and container runtime packages are all stored inside an operating-system image. The first block device of the host has an active and an inactive partition. An upgraded image is downloaded to the inactive partition, and upon reboot the host boots into this partition, which is then made active. The previous Bottlerocket image is still stored in the then-inactive partition and can be rolled back to if required.

Conclusion

When looking at the different approaches of these single-purpose Linux distributions, it's clear that there's no one best way. Which one you choose depends on how they align to your goals and what tools you're comfortable with. Are you heavily invested in the API-first AWS or Kubernetes world? Then Bottlerocket seems to be the best fit. Do you prefer snaps to run your services? Then Ubuntu Core is a no-brainer. If you want to run containers on a host system without too much maintenance, then Fedora CoreOS or openSUSE MicroOS are for you. Whether they are using rpm-ostree or Btrfs snapshots under the hood is probably less important when all of the workloads are running in containers anyway.

Index entries for this article: GuestArticles: Vervloesem, Koen; Conference: FOSDEM/2023
Posted Feb 15, 2023 3:24 UTC (Wed) by NightMonkey (subscriber #23051): Bottlerocket's two container zones seems reminiscent of RancherOS.

Posted Feb 15, 2023 12:35 UTC (Wed) by bluca (subscriber #118303): Shameless plug: we had a FOSDEM devroom dedicated to such distributions, slides/recordings at: https://fosdem.org/2023/schedule/track/image_based_linux_...

Posted Feb 15, 2023 13:47 UTC (Wed) by pothos (subscriber #116075): To add, there is also Flatcar Container Linux, the continuation of CoreOS Container Linux (whose parts were reused/further developed for Fedora CoreOS). It uses an A/B partition scheme for updating the /usr partition. Like Fedora CoreOS, it provides Ignition and Afterburn for configuration. See https://www.flatcar.org/ or, since I'm one of the maintainers, you can also ask here if you have questions.

Posted Feb 21, 2023 5:41 UTC (Tue) by thoeme (subscriber #2871): Hi, is there any activity from Debian regarding these MicroOS? Or rather immutable OS?
Regs, Thoeme

Posted Feb 21, 2023 6:28 UTC (Tue) by pabs (subscriber #43278): I think some of the systemd/UAPI/UKI folks are looking at applying this sort of thing to Debian, but not necessarily for images provided by Debian, more for things provided outside Debian.

Copyright 2023 Eklektix Inc. Comments and public postings are copyrighted by their creators. Linux is a registered trademark of Linus Torvalds
987
BAD
An unorthodox scholar uses technology to expose Biblical forgeries (smithsonianmag.com) History | January/February 2023. Deciphering ancient texts with modern tools: Michael Langlois challenges what we know about the Dead Sea Scrolls. By Chanan Tigay. Photographs by Franck Ferville.

A copy of a Greek inscription, made by laying wet paper or plaster over carved stone to create a mirror-image impression.

If you spotted Michael Langlois walking along the Seine in Paris, as I did one overcast morning last spring, you could be forgiven for mistaking this scholar of the ancient Middle East for the bassist in Def Leppard. He wears his long brown hair in a leonine mane, and when I caught up with him on the Pont des Arts, he was sporting a pink sweater and salmon-colored pants. As it turns out, Langlois is a professional musician, having played bass on some 20 French studio albums, from soul to gospel to pop. He had recently laid down the bass tracks on an album of Celtic music by the French composer Hélène Goussebayle, and that summer he would perform in France with the Christian rock singer Chris Christensen. But he is also perhaps the most versatile, and unorthodox, biblical scholar of his generation. That morning, he was headed to the Institut de France, a learned society founded in 1795 for the cream of French intelligentsia. At 46, Langlois is one of the institute's youngest affiliates. He led me past its luminous gold-trimmed cupola and guided me through a vaulted entryway, across a cobblestone courtyard and up several flights of stairs, where he stopped at a room with a little sign affixed out front: Corpus Inscriptionum Semiticarum. The cramped office once served as the headquarters for a group of French scholars who, beginning in the mid-19th century, endeavored to publish a sweeping study of every ancient Semitic inscription then known.

This article is a selection from the January/February 2023 issue of Smithsonian magazine.

But ancient inscriptions, scratched into stone or put onto parchment or papyrus or any other surface, including broken pieces of pottery known as ostraca, not only offer insights into the Bible's history but also paint a picture of how people lived in biblical and even prebiblical times. The ancients used ostraca the way we use paper: to record tax payments, tabulate receipts, write letters and take notes on meetings. "Instead of looking at the heroes of epic stories, we can look at very normal people, with very normal lives, struggling with jobs, food, even their marriages, kids or health," Langlois said. "That's another way of reconstructing history." A professor of Old Testament studies at the University of Strasbourg in France, Langlois is nearing completion of a book, written with a colleague, about a cache of 450 Hebrew ostraca likely dating to around 600 B.C., a time capsule of daily life in the Kingdom of Judah. For instance, he decoded notes written by a soothsayer who advised a pregnant woman worrying about her baby's health, another woman who feared her husband was lying to her, and a man who couldn't decide if he should move to a new city. But ancient inscriptions, whether sacred or mundane, don't always survive unblemished. To decipher them, Langlois draws on an impressive range of academic training. He holds three master's degrees (theology; ancient Middle Eastern languages and civilization; and archaeology and linguistics) and a doctorate in history and philology from the Sorbonne.
But his facility with sophisticated technologies, some of his own design (he briefly worked constructing simulations to chart the route of a high-speed train through a mountain tunnel), has armed him with techniques that allow him to make sense of texts so badly damaged by age, climate or human folly that they are now nearly illegible. His approach, which combines the close linguistic and paleographical analysis of ancient writings with advanced scientific tools, from multispectral imaging to artificial intelligence-assisted texture mapping, can sometimes make long-gone inscriptions come back to life. Or it can bury them for good, as in his most widely publicized feat of scholarly detective work, an exposé involving arguably the greatest archaeological discovery of the 20th century. The Dead Sea Scrolls, first uncovered by a trio of Bedouin wandering the Judean Desert in 1947, provide a fascinating glimpse into what Scripture looked like during a transformative period of religious ferment in ancient Israel. The scrolls include the oldest copies ever found of the Hebrew Bible, apocryphal texts that were never canonized, and rules and guidelines for daily living written by the community of people who lived at Qumran, where the first scrolls were found. All told, scholars have identified as many as 100,000 Dead Sea Scrolls fragments, which come from more than 1,000 original manuscripts. Experts date the scrolls between the third century B.C. and the first century A.D. (though Langlois believes several may be two centuries older). Some of them are relatively large: one copy of the Book of Isaiah, for example, is 24 feet long and contains a near-complete version of this prophetic text. Most, however, are much smaller, inscribed with a few lines, a few words, a few letters. Taken together, this amounts to hundreds of jigsaw puzzles whose thousands of pieces have been scattered over many different locations around the world. In 2012, Langlois joined a group of scholars working to decipher close to 40 Dead Sea Scrolls fragments in the private collection of Martin Schøyen, a wealthy Norwegian businessman. Each day in Kristiansand, Norway, he and specialists from Israel, Norway and the Netherlands spent hours trying to determine which known manuscripts the fragments had come from. "It was like a game for me," Langlois said. The scholars would project an image of a Schøyen fragment on the wall beside a photograph of a known scroll and compare them. "I'd say, 'No, it's a different scribe. Look at that lamed,'" Langlois recalled, using the word for the Hebrew letter L. Then they would skip forward to another known manuscript. "No," Langlois would say. "It's a different hand." Each morning, while out walking, the scholars discussed their work. And each day, according to Esti Eshel, an Israeli epigrapher also on the team, "they were killing another identification." Returning to France, Langlois examined the fragments with computer-imaging techniques he had developed to isolate and reproduce each letter written on the fragments, before beginning a detailed graphical analysis of the writing. And what he discovered was a series of flagrant oddities: a single sentence might contain styles of script from different centuries, or words and letters were squeezed and distorted to fit into the available space, suggesting the parchment was already fragmented when the scribe wrote on it. Langlois concluded that at least some of Schøyen's fragments were modern forgeries. Reluctant to break the bad news, he waited a year before telling his colleagues.
"We became convinced that Michael Langlois was right," said Torleif Elgvin, the Norwegian scholar leading the effort. After further study, the team ultimately determined that about half of Schøyen's fragments were likely forgeries. In 2017, Langlois and the other Schøyen scholars published their initial findings in a journal called Dead Sea Discoveries. A few days later, they presented their conclusions at a meeting in Berlin of the Society of Biblical Literature. Flashing images of the Schøyen fragments on a screen, Langlois described the process by which he concluded the pieces were fakes. He quoted from his contemporaneous notes on the scribe's hesitant hand. He pointed out inconsistencies in the fragments' script. And then he threw down the gauntlet: the Schøyen fragments were only the beginning. The previous year, he said, he'd seen photos of several Dead Sea Scrolls fragments in a book published by the Museum of the Bible in Washington, D.C., a privately funded complex a few blocks from the U.S. Capitol. The museum was scheduled to open its doors in three months, and a centerpiece of its collection was a set of 16 Dead Sea Scrolls fragments whose writing, Langlois now said, looked unmistakably like the writing on the Schøyen fragments. "All of the fragments published there exhibited the same scribal features," he told the scholars in attendance. "I'm sorry to say that all of the fragments published in this volume are forgeries. This is my opinion." The weight of the evidence presented that day by several members of the Schøyen team led to a re-evaluation of Dead Sea Scrolls in private collections all over the world. In 2018, Azusa Pacific University, a Christian college in Southern California that had purchased five scrolls in 2009, conceded that they were likely fakes, and it sued the dealer who had sold them. In 2020, the Southwestern Baptist Theological Seminary in Fort Worth, Texas, announced that the six Dead Sea Scrolls it had purchased around the same time were also likely fraudulent. The most stunning admission came from executives at the Museum of the Bible: they had hired an art-fraud investigator to examine the museum's fragments using advanced imaging techniques and chemical and molecular analysis. In 2020, the museum announced that its prized collection of Dead Sea Scrolls was made up entirely of forgeries. Langlois told me that he derives no pleasure from such discoveries. "My intention wasn't to be an expert in forgeries, and I don't love catching bad guys or something," he told me. But with forgeries, if you don't pay attention and you think they are authentic, then they become part of the data set you use to reconstruct the history of the Bible. The entire theory is then based on data that is false. That's why ferreting out biblical fakes is paramount, Langlois said. Otherwise, "everything we are going to do on the history of the Bible is corrupt." Langlois was raised in Voisins-le-Bretonneux, a small town near Versailles, in a devout Pentecostal Christian household. Before he could walk, he crawled from pew to pew. But when he was 11 or so, his father, a telecommunications engineer, brought home an old computer. Langlois' brother Jean-Philippe, two years his senior, tracked down code for a rudimentary computer game and drafted Langlois to type the whole thing, several thousand lines, into the machine. "That's how I learned to code," he told me. Around that time, Langlois read a book on numerology in the Bible and informed his Sunday-school instructor that her lecture on the theme was deeply flawed.
She said, "You're old enough now to attend services with the adults," and showed him the door. But the more he learned about the Bible, the more questions he had. If the holy book was perfect, why did he keep finding it was rife with contradictions? Did God create people after he created animals, as the first chapter of Genesis had it? Or did people come first, as per Chapter 2? Langlois began attending Bible study armed with a notepad and pen, and he would pepper his pastor with questions. "I wasn't trying to undermine him; I had sincere questions," Langlois said. "He probably thought I was a pain in the ass." It was more than a little rebellious of him to form a rock band at age 14 with his brother, because the family's church had long disdained drums and electric instruments; the boys' grandfather especially worried that rock music did not please God. In France, high school students are required to choose a major, and Langlois signed up for math and science, which he went on to study as an undergraduate at Paris-Sud University. He thought he might become a math teacher or maybe a computer scientist, but when he graduated he found that his faith still had a hold on him. "I had questions," he told me, "and I wanted answers." So he enrolled at the Continental Theological Seminary near Brussels, where he studied theology as well as Greek and ancient Hebrew. A course on the origins of the Bible introduced him to the cultures of the ancient Middle East and the birth of the Hebrew alphabet. "I was like, 'Wow, that is what I need to study.'" It was during this period, he told me, that his faith shifted. The more he learned about the history of Christianity, the more he came to feel that no single denomination or doctrine had a monopoly on truth, and today he feels comfortable in a variety of churches. He was working toward a graduate degree in ancient languages at the Catholic University of Paris when a professor invited him to join the group preparing a new bilingual volume of the Dead Sea Scrolls, which would include the original texts alongside a new French translation. "We had a meeting, a dozen people, and they were asking who wanted to do what," Langlois said. "I was raising my hand all the time. I wanted to do everything." But when they got to the Book of Enoch, no one's hand went up, not even his. Enoch, an apocryphal text thought to be written sometime between the third century B.C. and the second century A.D., is named for the biblical Noah's great-grandfather. One reason Langlois didn't know much about the book was that it didn't make it into the Hebrew Bible or the New Testament. Another is that the only complete copy to survive from antiquity was written in an ancient Ethiopic language called Ge'ez. But beginning in the 1950s, more than 100 fragments from 11 different parchment scrolls of the Book of Enoch, written largely in Aramaic, were found among the Dead Sea Scrolls. A few fragments were relatively large, 15 to 20 lines of text, but most were much smaller, ranging in size from a piece of toast to a postage stamp. Someone had to transcribe, translate and annotate all this Enochic material, and Langlois' teacher volunteered him. That's how he became one of just two students in Paris learning Ge'ez. Langlois quickly grasped the numerous parallels between Enoch and other books of the New Testament; for instance, Enoch mentions a messiah called "the son of man" who will preside over the Final Judgment.
Indeed, some scholars believe Enoch was a major influence on early Christianity, and Langlois had every intention of conducting that type of historical research. He started by transcribing the text from two small Enoch fragments, but age had made parts of it hard to read; some sections were missing entirely. In the past, scholars had tried to reconstruct missing words and identify where in the larger text these pieces belonged. But after working out his own readings, Langlois noticed the fragments seemed to come from parts of the book that were different from those specified by earlier scholars. He also wondered if their proposed readings could even fit on the fragments they purportedly came from. But how could he tell for sure? To faithfully reconstruct the text of Enoch, he needed digital images of the scrolls, images that were crisper and more detailed than the printed copies inside the books he was relying on. That was how, in 2004, he found himself traipsing around Paris searching for a specialized microfiche scanner to upload images to his laptop. Having done that (and lacking cash to buy Photoshop), he downloaded an open-source knockoff. First he individually outlined, isolated and reproduced each letter on Fragment 1 and Fragment 2, so he could move them around his screen like alphabet refrigerator magnets, to test different configurations and to create an alphabet library for systematic analysis of the script. Next he began to study the handwriting. Which stroke of a given letter was inscribed first? Did the scribe lift his pen, or did he write multiple parts of a letter in a continuous gesture? Was the stroke thick or thin? Then Langlois started filling in the blanks. Using the letters he'd collected, he tested the reconstructions proposed by scholars over the preceding decades. Yet large holes remained in the text, or words were too big to fit in the available space. The text of the Book of Enoch as it was widely known, in other words, was in many cases mistaken. Take the story of a group of fallen angels who descend to earth to seduce beautiful women. Using his new technique, Langlois discovered that earlier scholars had gotten the names of some of the angels wrong, and so had not realized the names were derived from Canaanite gods worshipped in the second millennium B.C., a clear example of the way scriptural authors integrated elements of the cultures that surrounded them into their theologies. "I didn't consider myself a scholar," Langlois told me. "I was just a student wondering how we could benefit from these technologies." Eventually Langlois wrote a 600-page book that applied his technique to the oldest known scroll of Enoch, making more than 100 "improvements," as he calls them, to prior readings. His next book, even more ambitious, detailed his analysis of Dead Sea Scrolls fragments containing snippets of text from the biblical Book of Joshua. From these fragments he concluded that there must be a lost version of Joshua, previously unknown to scholars and extant only in a small number of surviving fragments. Since there are thousands of authentic Dead Sea Scrolls, it appears that much still remains to be learned about the origins of early biblical texts. "Even the void is full of information," Langlois told me. Back at the Institut de France, Langlois set down a heavy bag and, from a nearby shelf, retrieved a black box that looked like it might hold a pair of shoes. Inside, protected by balls of old crumpled newspaper, were several pieces of jagged white plaster, each about the size of a fist.
Langlois removed one and traced his pinky along an inch-long line engraved on one side: the ancient letter yud. "These are from the Stele of Mesha," he said. The Mesha Stele, a three-foot-tall black basalt monument dating to nearly 3,000 years ago, bears a 34-line inscription in Moabite, a language closely related to ancient Hebrew: the longest such engraving ever found in the area of modern-day Israel and Jordan. In 1868, an amateur archaeologist named Charles Clermont-Ganneau was serving as a translator for the French Consulate in Jerusalem when he heard about this mysterious inscribed monument lying exposed in the sands of Dhiban, east of the Jordan River. No one had yet deciphered its inscription, and Clermont-Ganneau dispatched three Arab emissaries to the site with special instructions. They laid wet paper over the stone and tapped it gently into the engraved letters, which created a mirror-image impression of the markings on the paper, what's known as a squeeze copy. But Clermont-Ganneau had misread the delicate political balance among rival Bedouin clans, sending members of one tribe into the territory of another, and with designs on a valuable relic, no less. The Bedouin grew wary of their visitors' intentions. Angry words turned threatening. Fearing for his life, the party's leader made a break for it and was stabbed in the leg with a spear. Another man leaped into the hole where the stone lay and yanked up the wet paper copy, accidentally tearing it to pieces. He shoved the torn fragments into his robe and took off on his horse, finally delivering the shredded squeeze to Clermont-Ganneau. Afterward, the amateur archaeologist, who would become an eminent scholar and a member of the Institut de France, tried to negotiate with the Bedouin to acquire the stone, but his interest, coupled with offers from other international bidders, further irked the tribesmen; they built a bonfire around the stone and repeatedly doused it with cold water until it broke apart. Then they scattered the pieces. Clermont-Ganneau, relying on the tattered squeeze, did his best to transcribe and translate the stele's inscription. The result had profound implications for our understanding of biblical history. The stone Clermont-Ganneau found held a victory inscription written in the name of King Mesha of Moab, who ruled in the ninth century B.C. in what is now Jordan. The text describes his blood-soaked victory against the neighboring kingdom of Israel, and the story it told turned out to match parts of the Hebrew Bible, in particular events described in the Book of Kings. It was the first contemporaneous account of a biblical story ever discovered outside the Bible itself, evidence that at least some of the Bible's stories had actually taken place. In time, Clermont-Ganneau collected 57 shards from the stele and, returning to France, made plaster casts of each, including the one Langlois now held in his hand, rearranging them like puzzle pieces as he worked out where each of the fragments fit. Then, satisfied he'd solved the puzzle, he rebuilt the stele with the original pieces he'd collected and a black filler that he inscribed with his transcription. But large sections of the original monument were still missing or in extremely poor condition. Thus certain mysteries about the text persist to this day, and scholars have been trying to produce an authoritative transcription ever since. The end of line 31 has proved particularly thorny. Paleographers have proposed various readings for this badly damaged verse.
Part of the original inscription remains, and part is Clermont-Ganneau's reconstruction. What's visible is the letter bet, then a gap about two letters long where the stone was destroyed, followed by two more letters: a vav and then, less clearly, a dalet. In 1992, André Lemaire, Langlois' mentor at the Sorbonne, suggested that the verse mentioned Beit David, the House of David, an apparent reference to the Bible's most famous monarch. If the reading was correct, the Mesha Stele did not just offer corroborating evidence for events described in the Book of Kings; it also provided perhaps the most compelling evidence yet for King David as a historical figure, whose existence would have been recorded by none other than Israel's Moabite enemies. The following year, a stele uncovered in Israel also seemed to mention the House of David, lending Lemaire's theory further credence. Over the next decade, some scholars adopted Lemaire's reconstruction, but not everyone was convinced. A few years ago, Langlois, along with a group of American biblical scholars and Lemaire, visited the Louvre, where the reconstructed stele has been on display for more than a century. They took dozens of high-resolution digital photographs of the monument while shining light on certain sections from a wide variety of angles, a technique known as Reflectance Transformation Imaging, or RTI. The Americans were working on a project about the development of the Hebrew alphabet; Langlois thought the images might allow him to weigh in on the King David controversy. But watching the photographs on a computer screen in the moments they were taken, Langlois didn't see anything of note. "I was not very hopeful, frankly, especially regarding the Beit David line. It was so sad. I thought, 'The stone is definitively broken, and the inscription is gone.'" It took several weeks to process the digital images. When they arrived, Langlois began playing with the light settings on his computer, then layered the images on top of each other using texture-mapping software to create a single interactive 3D image, probably the most accurate rendering of the Mesha Stele ever made. And when he turned his attention to line 31, something tiny jumped off the screen: a small dot. "I'd been looking at this specific part of the stone for days; the image was imprinted in my eyes," he told me. "If you have this mental image, and then something new shows up that wasn't there before, there's some kind of shock; it's like you don't believe what you see." In some ancient Semitic inscriptions, including elsewhere on the Mesha Stele, a small engraved dot signified the end of a word. "So now these missing letters have to end with vav and dalet," he told me, naming the last two letters of the Hebrew spelling of David. Langlois reread the scholarly literature to see if anyone had written about the dot, but he said no one had. Then, using the pencil on his iPad Pro to imitate the monument's script, he tested every reconstruction previously proposed for line 31. Taking into account the meaning of the sentences that come before and after this line, as well as traces of other letters visible on RTI renderings the group had made of Clermont-Ganneau's squeeze copy, Langlois concluded that his teacher was right: the damaged line of the Mesha Stele did almost certainly refer to King David. "I really tried hard to come up with another reading," Langlois told me. "But all of the other readings don't make any sense." In the sometimes contentious world of biblical archaeology, the finding was hailed by some scholars and rejected by others.
Short of locating the missing pieces of the stele miraculously intact, there may be no way to definitively prove the reading one way or another. For many people, though, Langlois' evidence was as close as we might get to resolving the debate. But that hasn't stopped him from inviting competing interpretations. Last year, Matthieu Richelle, an epigrapher who also studied under Lemaire, wrote a paper arguing, among other things, that Langlois' dot could just be an anomaly in the stone. He presented his findings at a biblical studies conference, in a session organized by Langlois himself. "This says something about how open-minded he is," Richelle told me. After we left the institute, Langlois and I crossed the Seine on a footbridge to reach the Louvre. The tourist shops across the street carried countless varieties of Mona Lisa trinkets and an Eiffel Tower for every occasion: painted, stuffed and sculpted. But insofar as I could tell, there was no Mesha Stele swag to be had. Today the pillar is kept on a pedestal in the Department of Oriental Antiquities, Room 303, a cavernous hall with high ceilings, beige stone walls and pleasant natural light. As Langlois approached it, he immediately kneeled down and flicked on his iPhone flashlight. "It looks much smaller in reality, right?" he said. Clermont-Ganneau had done his best, but the stele looked like something out of Dr. Frankenstein's laboratory. The lighter pieces were original, the smooth dark areas an incongruous filler. Langlois arced his phone slowly over the inscription, shining light over the words from different angles. Then he stopped over line 31. "The sequence of letters is from here to here," he said. "So you can see the bet here at the beginning, then the vav and the dalet, and the dot." Together we marveled at how much seems to rest on the presence or absence of a tiny mark carved onto a stone 3,000 years ago and recovered from distant sands: nothing less than evidence suggesting the existence of King David. But it was hard to make out the mark, so I asked him if there was another on the stele that he could show me for comparison. He pointed to a better-preserved dot elsewhere. "It looks like your dot got a little damaged," I said. "It's a bit damaged, but with the correct angle," here he moved his light again, "you can see the diameter is the same and the depth is the same." And it was true. Illuminated this way, it looked like a dot, effaced by water, by fire, by time itself. But a dot.
996
BAD
Ancient civilisation under eastern Turkey estimated to be 11k-13k years old (spectator.co.uk) Sean Thomas. I am staring at about a dozen stiff, eight-foot-high, orange-red penises, carved from living bedrock and semi-enclosed in an open chamber. A strange carved head (of a man, a demon, a priest, a god?), also hewn from the living rock, gazes at the phallic totems like a primitivist gargoyle. The expression of the stone head is doleful to the point of grimacing, as if he, or she, or it disapproves of all this: of everything being stripped naked under the heavens and revealed to the world for the first time in 130 centuries. Yes, 130 centuries. Because these penises, this peculiar chamber, this entire perplexing place, known as Karahan Tepe (pronounced Kah-rah-hann Tepp-ay), which is now emerging from the dusty Plains of Harran in eastern Turkey, is astoundingly ancient. Put it another way: it is estimated to be 11,000-13,000 years old. This number is so large it is hard to take in. For comparison, the Great Pyramid at Giza is 4,500 years old. Stonehenge is 5,000 years old. The Cairn de Barnenez tomb-complex in Brittany, perhaps the oldest standing structure in Europe, could be up to 7,000 years old. The oldest megalithic ritual monument in the world (until the Turkish discoveries) was always thought to be Ggantija in Malta. That's maybe 5,500 years old. So Karahan Tepe, with its penis chamber and everything that inexplicably surrounds the chamber (shrines, cells, altars, megaliths, audience halls, et al), is vastly older than anything comparable, and plumbs quite unimaginable depths of time, back before agriculture, probably back before normal pottery, right back to a time when we once thought human civilisation was simply impossible. After all, hunter-gatherers, cavemen with flint arrowheads, without regular supplies of grain, without the regular meat and milk of domesticated animals, do not build temple-towns with water systems. Do they? Virtually all that we can now see of Karahan Tepe has been skilfully unearthed in the last two years, with remarkable ease (for reasons which we will come back to later). And although there is much more to summon from the grave, what it is already teaching us is mind-stretching. Taken together with its age, complexity, sophistication, its deep, resonant mysteriousness, and its many sister sites now being unearthed across the Harran Plains, collectively known as the Tas Tepeler, or the 'stone hills', these carved ochre-red rocks, so silent, brooding and watchful in the hard whirring breezes of the semi-desert, constitute what might just be the greatest archaeological revelation in the history of humankind. The unveiling of Karahan Tepe and nearly all the Tas Tepeler in the last two years is not without precedent. As I take my urgent photos of the ominously louring head, Necmi Karul touches my shoulder and gestures behind, across the sun-burnt and undulant plains. Necmi, of Istanbul University, is the chief archaeologist in charge of all the local digs, all the Tas Tepeler. He has invited me here to see the latest findings in this region, because I was one of the first western journalists to come here, many years ago, and write about the origin of the Tas Tepeler. In fact, under the pen-name Tom Knox, I wrote an excitable thriller about the first of the stone hills, a novel called The Genesis Secret, which was translated into quite a few languages, including Turkish. That site, which I visited 16 years back, was Gobekli Tepe. Necmi points into the distance, now hazed with heat. "Sean.
You see that valley with the roads and white buildings?" I can maybe make out a white-ish dot in one of the pale greeny-yellow valleys which stretch endlessly into the shimmering blur. "That," Necmi says, "is Gobekli Tepe. 46 kilometres away. It has changed since you were last here!" These carved ochre-red rocks constitute what might just be the greatest archaeological revelation in the history of humankind. And so to Gobekli Tepe, the 'hill of the navel'. Gobekli is pivotally important, because Karahan Tepe and the Tas Tepeler, and what they might mean today, cannot be understood without the primary context of Gobekli Tepe. And to comprehend that we must double back in time at least a few decades. The modern story of Gobekli Tepe begins in 1994, when a Kurdish shepherd followed his flock over the lonely, infertile hillsides, passing a single mulberry tree, which the locals regarded as sacred. The bells hanging on his sheep tinkled in the stillness. Then he spotted something. Crouching down, he brushed away the dust and exposed a large oblong stone. The man looked left and right: there were similar stone outcrops peeping from the sands. Calling his dog to heel, the shepherd informed someone of his finds when he got back to the village. Maybe the stones were important. He was not wrong. The solitary Kurdish man, on that summer's day in 1994, had made an irreversibly profound discovery, which would eventually lead to the penis pillars of Karahan Tepe, and an archaeological anomaly which challenges, time and again, everything we know of human prehistory. A few weeks after that encounter by the mulberry tree, news of the shepherd's find reached museum curators in the ancient city of Sanliurfa, 13km south-west of the stones. They got in touch with the German Archaeological Institute in Istanbul, and in late 1994 the German archaeologist Klaus Schmidt came to the site of Gobekli Tepe to begin his slow, diligent excavations of its multiple, peculiar, enormous T-stones, which are generally arranged in circles, like the standing stones of Avebury or Stonehenge. Unlike European standing stones, however, the older Turkish megaliths are often intricately carved, with images of local fauna. Sometimes the stones depict cranes, boars or wildfowl: creatures of the hunt. There are also plenty of leopards, foxes and vultures. Occasionally these animals are depicted next to human heads. Notably lacking were detailed human representations, except for a few coarse or eerie figurines, and the T-stones themselves, which seem to be stylised invocations of men, their arms angled to protect the groin. The obsession with the penis is obvious, more so now we have the benefit of hindsight provided by Karahan Tepe and the other sites. Very few representations of women have emerged from the Tas Tepeler so far; there is one obscene caricature of a woman, perhaps giving birth. Whatever inspired these temple-towns, it was not a benign matriarchal culture. Quite the opposite, maybe. The apparent date of Gobekli Tepe, first erected in 10,000 BC, if not earlier, caused a good deal of skepticism. But over time archaeological experts began to accept the significance. Ian Hodder of Stanford University declared that "Gobekli Tepe changes everything". David Lewis-Williams, the revered professor of archaeology at Witwatersrand University in Johannesburg, said at the time: "Gobekli Tepe is the most important archaeological site in the world." And yet in the nineties and early noughties Gobekli Tepe dodged the limelight of general public attention. It's hard to know why. Too remote?
Too hard to pronounce? Too eccentric to fit with established theories of prehistory? Whatever the reason, when I flew out on a whim in 2006 (inspired by two brisk minutes of footage on a TV show), even the locals in the nearby big city, Sanliurfa, had no conception of what was out there in the barrens. I remember asking a cab driver, the day I arrived, to take me to Gobekli Tepe. He'd never heard of it. Not a clue. Today that feels like asking someone in Paris if they've heard of the Louvre and getting a "Non". The driver had to consult several taxi-driving friends until one grasped where I wanted to go, "that German dig, out of town, by the Arab villages", and so the driver rattled me out of Sanliurfa and into the dust, until we crested one final remote hill and came upon a scene out of the opening titles of The Exorcist: archaeologists toiling away, unnoticed by the world, but furiously intent on their world-changing revelations. For an hour Klaus (who sadly died in 2014) generously escorted me around the site. I took photos of him, and the stones, and the workers; this was not a hassle, as there were literally no other tourists. A couple of the photos I snatched that hot afternoon went on to become mildly iconic, such as my photo of the shepherd who found the site, or Klaus crouching next to one of the most finely-carved T-stones. They were prized simply because no one else had bothered to take them. After the tour, Klaus and I retired from the heat to his tent, where, over dainty tulip glasses of sweet black Turkish tea, Klaus explained the significance of the site. As he put it: "Gobekli Tepe upends our view of human history. We always thought that agriculture came first, then civilisation: farming, pottery, social hierarchies. But here it is reversed, it seems: the ritual centre came first, then, when enough hunter-gathering people collected to worship, or so I believe, they realised they had to feed people. Which means farming." He waved at the surrounding hills: "It is no coincidence that in these same hills in the Fertile Crescent men and women first domesticated the local wild einkorn grass, becoming wheat, and they also first domesticated pigs, cows and sheep. This is the place where Homo sapiens went from plucking the fruit from the tree to toiling and sowing the ground." Klaus had cued me up. People were already speculating that if you see the Garden of Eden mythos as an allegory of the Neolithic Revolution, i.e. our fall from the relative ease of hunter-gathering to the relative hardships of farming (and life did get harder when we first started farming, as we worked longer hours and caught diseases from domesticated animals), then Gobekli Tepe and its environs is probably the place where this happened. Klaus Schmidt did not demur. He said to me, quite deliberately: "I believe Gobekli Tepe is a temple in Eden." It's a quote I reused, to some controversy, because people took Klaus literally. But he did not mean it literally. He meant it allegorically. Klaus told me more astonishing things: "We have found no homes, no human remains. Where is everyone? Did they gather for festivals, then disperse? As for their religion, I have no real idea; perhaps Gobekli Tepe was a place of excarnation, for exposing the bones of the dead to be consumed by vultures, so the bodies have all gone. But I do definitely know this: some time in 8000 BC the creators of Gobekli Tepe buried their great structures under tons of rubble. They entombed it. We can speculate why. Did they feel guilt? Did they need to propitiate an angry God? Or just want to hide it?"
Klaus was also fairly sure on one other thing: "Gobekli Tepe is unique." I left Gobekli Tepe as bewildered as I was excited. I wrote some articles, and then my thriller, and alongside me many other writers, academics and film-makers made the sometimes dangerous pilgrimage to this sumptuously puzzling place near the troubled Turkey-Syria border, and slowly its fame grew. Back here and now in 2022, Necmi, myself and Aydan Aslan, the director for Sanliurfa Culture and Tourism, jump in a car at Karahan Tepe (Necmi promises me we shall return), and we go to see Gobekli Tepe as it is today. Necmi is right: all is changed. These days Gobekli Tepe is not just a famous archaeological site, it is a Unesco World-Heritage-listed tourist honeypot which can generate a million visitors a year. It is all enclosed by a futuristic, hi-tech, steel-and-plastic marquee (no casual wandering around taking photos of the stones and workers). Where Klaus and I once sipped tea in a flapping tent, alone, there is now a big visitor centre, where I bump into the grandson of the shepherd who first found Gobekli. I spy the stone where I took the photo of a crouching Klaus, but I see it 20 metres away. That's as close as I can get. After lunch in Sanliurfa, with its Gobekli Tepe-themed restaurants and its Gobekli Tepe T-stone fridge-magnet souvenir shops, Necmi shows me the gleaming museum built to house the greatest finds from the region, including an 11,000-year-old statue retrieved from beneath the centre of Sanliurfa itself, perhaps the world's oldest life-size carved human figure. I recall first seeing this poignant effigy under the stairs, next to a fire extinguisher, in Sanliurfa's then titchy, neglected municipal museum. Back in 2006 I wrote about 'Urfa man' and how he should be vastly better known, not hidden away in some obscure room in a museum visited by three people a year. Urfa man now has a silent hall of his own in one of Turkey's greatest archaeological galleries. More importantly, we can now see that Urfa man has the same body stance as the T-shaped man-pillars at Gobekli (and in many of the Tas Tepeler): his arms are in front of him, protecting his penis. His obsidian eyes still stare wistfully at the observer, as lustrous as they were 11,000 years ago. As we stroll about the museum, Necmi points at more carvings: more leopards, vultures, penises. From several sites, archaeologists have found statues of leopards apparently mounting, riding or even raping humans, paws over the human eyes. Meanwhile, Aslan tells me how archaeologists at Gobekli have also, more recently, found tantalising evidence of alcohol: huge troughs with the chemical residue of fermentation, indicating mighty ritual feasts, maybe. I sense we are getting closer to a momentous new interpretation of Gobekli Tepe and the Tas Tepeler. And it is very different from the perspective Klaus Schmidt gave me in 2006 (and this is no criticism, of course: he could not have known what was to come). Necmi, as good as promised, whisks me back to Karahan Tepe, and to some of the other Tas Tepeler, so we can jigsaw together this epochal puzzle. As we speed around the arid slopes, he explains how scientists at Karahan Tepe, as well as Gobekli Tepe, have now found evidence of homes. These places, the Tas Tepeler, were not isolated temples where hunter-gatherers came a few times a year to worship at their standing stones, before returning to the plains for the life of the chase. The builders lived here. They ate their roasted game here. They slept here.
And they used, it seems, a primitive but poetic form of pottery, shaped from polished stone. They possibly did elaborate manhood rituals in the Karahan Tepe penis chamber, which was probably half flooded with liquids. And maybe they celebrated afterwards with boozy feasts. Yet still we have no sign at all of contemporary agriculture; they were, it still appears, hunter-gatherers, but of unnerving sophistication. Another unnerving oddity is the curious number of carvings which show people with six fingers. Is this symbolic, or an actual deformity? Perhaps the mark of a strange tribe? Again, there are more questions than answers. Crucially, however, we do now have tentative hints as to the actual religion of these people. In Gobekli Tepe, several skulls have been recovered. They are deliberately defleshed and carefully pierced with holes, so they could, supposedly, be hung and displayed. Skull cults are not unknown in ancient Anatolia. If there was such a cult in the Tas Tepeler, it might explain the graven vultures pictured playing with human heads. As to how the skulls were obtained, they might have come from conflict (though there is no evidence of this yet); it is quite possible the skulls were obtained via human sacrifice. At a nearby, slightly younger site, the 'Skull Building' of Cayonu, we know of altars drenched with human blood, probably from gory sacrifice. Necmi has one more point to make about Karahan Tepe, as we tour the penis chamber and its anterooms. Karahan Tepe is stupefyingly big. "So far," he says, "we have dug up maybe 1 per cent of the site, and it is already impressive." I ask him how many pillars, T-stones, might be buried here. He casually points at a rectangular rock peering above the dry grass: "That's probably another megalith right there, waiting to be excavated. I reckon there are probably thousands more of them, all around us. We are only at the beginning. And there could be dozens more Tas Tepeler we have not yet found, spread over hundreds of kilometres." In one respect, Klaus Schmidt has been proved absolutely right. After he first proposed that Gobekli Tepe was deliberately buried with rubble, that is to say, bizarrely entombed by its own creators, a backlash of scepticism grew, with some suggesting that the apparent backfill was merely the result of thousands of years of random erosion, rain and rivers washing debris between the megaliths, gradually hiding them. Why should any religious society bury its own cathedrals, which must have taken decades to construct? And yet Karahan, too, was definitely and purposely buried. That is the reason Necmi and his team were able to unearth the penis pillars so quickly: all they had to do was scoop away the backfill, exposing the phallic pillars sculpted from living rock. I have one more question for Necmi, which has been increasingly nagging at me. Did the people that built the Tas Tepeler have writing? It is almost impossible to believe that you could construct such elaborate sites, in multiple places, over thousands of square kilometres, without careful, articulate plans, that is to say: without writing. You couldn't sing, paint and dream your way to entire inhabited towns of shrines, vaults, water channels and cultic chambers. Necmi shrugs. He does not know. One of the glories of the Tas Tepeler is that they are so old, no one knows. Your guess is literally as good as the experts'.
And yet a very good guess, right now, leads to the most remarkable answer of all, and it is this: archaeologists in southeastern Turkey are, at this moment, digging up a wild, grand, artistically coherent, implausibly strange, hitherto-unknown-to-us religious civilisation, which has been buried in Mesopotamia for ten thousand years. And it was all buried deliberately. Jumping in the car, we head off to yet another of the Tas Tepeler, but then Necmi has an abrupt change of mind as to our destination. "No, let's go see Sayburc. It's a little Arab village. A few months ago some of the farmers rang us and said, 'Er, we think we have megaliths in our farmyard walls. Do you want to have a look?'" Our cars pull up in a scruffy village square, scattering sheep and hens. Sure enough, there are classic Gobekli/Karahan-style T-stones being used to buttress agricultural walls; they are probably 11,000-13,000 years old, just like everywhere else. There are so many of them I spot one on my own, on the outskirts of the village. I point it out to Necmi. He nods and says: "Yes, that's probably another." But he wants to show me something else. Pulling back a plastic curtain, we step into a kind of stone barn. Along one wall there is a spectacular stone frieze displaying animal and human figures, carved or in relief. There are leopards, of course, and also aurochs, etched in a Cubist way to make both menacing horns equally visible (you can see an identical representation of the auroch at Gobekli Tepe, so similar one might wonder if they were carved by the same artist). At the centre of the frieze is a small figure in bold relief. He is clutching his penis. Next to him, being threatened by the aurochs, is another human. He has six fingers. For a long while we stare in silence at the carvings. I realise that, a few farmers apart, we are some of the first people to see this since the end of the Ice Age.
1,019
BAD
Andy Rourke: The Smiths bassist dies aged 59 (bbc.co.uk) Andy Rourke, the bassist for rock band The Smiths, has died aged 59, the band has announced. Guitarist Johnny Marr confirmed "with deep sadness" that Rourke died after a lengthy illness with pancreatic cancer. In a statement posted on Twitter, Marr said: "Andy will be remembered as a kind and beautiful soul by those who knew him and as a supremely gifted musician by music fans." Morrissey said Rourke will never die as long as his music is heard. Rourke played on The Smiths' most famous songs, including This Charming Man and There Is a Light That Never Goes Out, as well as featuring on solo singles for singer Morrissey after the group broke up in 1987. In a tribute posted on his website, Morrissey said: "I just hope wherever Andy has gone that he's OK. He will never die as long as his music is heard. He didn't ever know his own power and nothing that he played had been played by someone else. His distinction was so terrific and unconventional and he proved it could be done. I suppose at the end of it all we hope to feel that we were valued. Andy need not worry about that." The bassist performed on all four of The Smiths' studio albums: 1984's The Smiths, 1985's Meat Is Murder, 1986's The Queen Is Dead and 1987's Strangeways, Here We Come. Rourke's bandmate, drummer Mike Joyce, tweeted: "Not only the most talented bass player I've ever had the privilege to play with but the sweetest, funniest lad I've ever met. Andy's left the building, but his musical legacy is perpetual. I miss you so much already. Forever in my heart mate." Rourke was described by Suede bassist Mat Osman as "a total one-off" and a rare bassist whose sound you could recognise straight away. "I remember so clearly playing that Barbarism break over and over, trying to learn the riff, and marvelling at this steely funk driving the track along," he recalled. The Smiths' producer Stephen Street added: "I am so saddened to hear this news. Andy was a superb musician and a lovely guy. I haven't been able to read any other news about details yet, but I send my deepest condolences and thoughts to his friends and family. RIP." Later in his career, Rourke played as part of a supergroup called Freebass with two other Mancunian bass players: Gary "Mani" Mounfield from the Stone Roses and Peter Hook from New Order. Across his decades-long career, Rourke also recorded with the Pretenders, Killing Joke, Sinead O'Connor, Aziz Ibrahim (formerly of the Stone Roses) and former Oasis guitarist Bonehead as Moondog One, a band which also included Mike Joyce and Craig Gannon. He also played with another Manchester singer-songwriter, Badly Drawn Boy, joining his touring band for two years. Writing on Twitter, Badly Drawn Boy said: "The Smiths were easily the most important band of my teens. I was beyond honoured when Andy played bass with me on tour for two years. He was the coolest, kindest, funniest person - a joy to tour with. Probably the best natural musician I've ever seen. Loved him. Gutted." Rourke was born on 17 January 1964 to an English mother and an Irish father. He was interested in music from an early age and began learning the guitar aged seven.
"I always used to get a musical instrument either for Christmas or on my birthday, so I went through plastic trumpets, saxophone, keyboard - I tried a bit of everything," Rourke said in a 2016 interview. "I played a bit of cello later on, but I made that up as I went along because it was needed on a Smiths record, so I just bought one, tuned it up like a bass and went from there." He struck up a friendship with Marr aged 11. "We were best friends, going everywhere together," Marr recalled. "When we were 15 I moved into his house with him and his three brothers, and I soon came to realise that my mate was one of those rare people that absolutely no one doesn't like." Marr and Rourke formed a band called Freak Party but did not release any music. In 1982, when Marr formed The Smiths with Morrissey and Joyce, the group initially trialled two other bassists before permanently enlisting Rourke. Marr described playing the bass as Rourke's true calling. The group became the defining Manchester act of the 1980s and an icon of British alternative rock, with hits including Heaven Knows I'm Miserable Now and Girlfriend in a Coma. Rourke was briefly sacked for two weeks in 1986 for taking heroin, but returned to record The Queen Is Dead album later that same year. In 1989, Rourke and Joyce took Morrissey and Marr to court, arguing they were owed an equal share of earnings, having only earned 10% each of the group's performance and recording royalties. Rourke settled for a lump sum of £83,000, but Joyce persisted with the legal action and was awarded around £1m in backdated royalties and 25% thereafter. Rourke later filed for bankruptcy in 1999. But Marr said: "We maintained our friendship over the years, no matter where we were or what was happening. Andy reinvented what it is to be a bass guitar player. Watching him play those dazzling basslines was an absolute privilege and genuinely something to behold." Later in his career, Rourke became a radio presenter on the radio station then known as XFM. Singer Billy Bragg tweeted: "Very sorry to hear that Smiths bassist Andy Rourke has passed away. I have great memories of him playing with Johnny Marr and myself on the Red Wedge tour. He was a lovely guy and an amazing bass player. My condolences to his family and friends." Presenter Terry Christian described Rourke as "a lovely guy", adding: "Another hole left in the history of Manchester music." The Charlatans singer Tim Burgess added: "Such sad, sad news about Andy Rourke - he was an inspirational musician with a style that made so many of us pick up a bass guitar; and the driving force for [benefit concert] Manchester Versus Cancer. Our thoughts are with everyone who knew him. Travel well x." Music journalist and fellow bassist John Robb told BBC Radio 5 Live: "In Manchester, because it's such a tightly-knit musical scene and musical family, it feels like you've lost someone from your family. You feel like you've lost one of your own. [Rourke's] talent, you have to celebrate it. He was a kind and beautiful man. He was one of those people who's good to be in the company of. He was affable, gentle, a kind spirit. But his basslines were so key to The Smiths. They were melodic - driving the songs along. He was the melodic spine to those songs and really made those songs."
1,036
BAD
Anna Karenina Principle (wikipedia.org) The Anna Karenina principle states that a deficiency in any one of a number of factors dooms an endeavor to failure. Consequently, a successful endeavor (subject to this principle) is one for which every possible deficiency has been avoided. The name of the principle derives from Leo Tolstoy's 1877 novel Anna Karenina, which begins: "All happy families are alike; each unhappy family is unhappy in its own way." In other words: happy families share a common set of attributes which lead to happiness, while any of a variety of attributes can cause an unhappy family. This concept has been generalized to apply to several fields of study. In statistics, the term Anna Karenina principle is used to describe significance tests: there are any number of ways in which a dataset may violate the null hypothesis and only one in which all the assumptions are satisfied. [citation needed] The Anna Karenina principle was popularized by Jared Diamond in his 1997 book Guns, Germs and Steel. [1] Diamond uses this principle to illustrate why so few wild animals have been successfully domesticated throughout history, as a deficiency in any one of a great number of factors can render a species undomesticable. Therefore, all successfully domesticated species are not so because of a particular positive trait, but because of a lack of any number of possible negative traits. Chapter 9 defines six groups of reasons for the failed domestication of animals. Ecologist Dwayne Moore describes applications of the Anna Karenina principle in ecology: "Successful ecological risk assessments are all alike; every unsuccessful ecological risk assessment fails in its own way. Tolstoy posited a similar analogy in his novel Anna Karenina: 'Happy families are all alike; every unhappy family is unhappy in its own way.' By that Tolstoy meant that for a marriage to be happy, it had to succeed in several key aspects. Failure on even one of these aspects, and the marriage is doomed . . . the Anna Karenina principle also applies to ecological risk assessments involving multiple stressors." [2] Much earlier, Aristotle stated the same principle in the Nicomachean Ethics (Book 2): [3] "Again, it is possible to fail in many ways (for evil belongs to the class of the unlimited, as the Pythagoreans conjectured, and good to that of the limited), while to succeed is possible only in one way (for which reason also one is easy and the other difficult: to miss the mark easy, to hit it difficult); for these reasons also, then, excess and defect are characteristic of vice, and the mean of virtue; for men are good in but one way, but bad in many." Many experiments and observations of groups of humans, animals, trees, grassy plants, stock-market prices, and changes in the banking sector have borne out a modified Anna Karenina principle: "By studying the dynamics of correlation and variance in many systems facing external or environmental factors, we can typically, even before obvious symptoms of crisis appear, predict when one might occur, as correlation between individuals increases and, at the same time, variance (and volatility) goes up.... All well-adapted systems are alike; all non-adapted systems experience maladaptation in their own way... But in the chaos of maladaptation, there is an order. It seems, paradoxically, that as systems become more different they actually become more correlated, within limits."
[4] This effect has been demonstrated for many systems: [5] from the adaptation of healthy people to a change in climate conditions to the analysis of fatal outcomes in oncological and cardiological clinics. The same effect is found in the stock market. The applicability of these two statistical indicators of stress (a simultaneous increase of variance and correlations) for the diagnosis of social stress in large groups was examined in the prolonged stress period preceding the 2014 Ukrainian economic and political crisis. There was a simultaneous increase in the total correlation between the 19 major public fears in Ukrainian society (by about 64%) and also in their statistical dispersion (by 29%) during the pre-crisis years. [6] Vladimir Arnold, in his book Catastrophe Theory, describes "The Principle of Fragility of Good Things", which in a sense supplements the Anna Karenina principle: good systems must meet a number of requirements simultaneously and are therefore more fragile: "... for systems belonging to the singular part of the stability boundary a small change of the parameters is more likely to send the system into the unstable region than into the stable region. This is a manifestation of a general principle stating that all good things (e.g. stability) are more fragile than bad things. It seems that in good situations a number of requirements must hold simultaneously, while to call a situation bad even one failure suffices." [7]
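The logical asymmetry at the heart of the principle is simple enough to state in code. The sketch below (my own toy illustration, in C++, with hypothetical names like `succeeds` and `kFactors`; it is not drawn from any of the cited works) models an endeavor as a fixed set of required factors: with n independent factors there is exactly one way to satisfy all of them and 2^n - 1 distinct ways to fail.

```cpp
#include <algorithm>
#include <array>
#include <cstddef>

// Anna Karenina principle as a predicate: an endeavor succeeds only if
// every required factor is present; any single deficiency dooms it.
// "Good in but one way, bad in many": 1 passing configuration versus
// 2^n - 1 failing ones.
constexpr std::size_t kFactors = 6;  // e.g. Diamond's domestication criteria

bool succeeds(const std::array<bool, kFactors>& factor_ok) {
    return std::all_of(factor_ok.begin(), factor_ok.end(),
                       [](bool ok) { return ok; });
}
```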
1,044
GOOD
Anna: A key-value store for any scale (muratbuffalo.blogspot.com) On distributed systems, broadly defined, and other curiosities. The opinions on this site are my own. This paper (ICDE'18) introduces Anna, a CALM/CRDT implementation of a distributed key-value system, at the data structure level as well as at the system architecture and transaction protocol levels. Anna is a partitioned, multi-mastered key-value system that achieves high performance and elasticity via wait-free execution and coordination-free consistency. Anna employs coordination-free actors that perform state update via merge of lattice-based composite data structures. I love the strongly opinionated introduction of this paper. This is what papers should be about: opinionated, challenging conventions, making bets, and doing hypothesis testing in the small. Conventional wisdom says that software designed for one scale point needs to be rewritten when scaling up by 10x. Anna sets out to disprove this by showing how a key-value storage (KVS) system can be architected to scale across many orders of magnitude. (Spoiler: Anna can give you only up to causal consistency; it cannot provide strong consistency at the key level, nor anything stronger than read-committed at the multi-key level.) The high-level goal for Anna is to provide excellent performance on a single multi-core machine while also being able to scale up elastically to geo-distributed cloud deployment. In order to achieve these goals, they take the following as design requirements. The codebase (including the lattice library, all the consistency levels, the server code and client proxy code) amounts to about 2,000 lines of C++ on top of commonly-used libraries, including ZeroMQ and Google Protocol Buffers. To store the private KVS replica at each actor, the unordered map from the C++ standard library is used. Inter-actor multicast is achieved via the pub-sub communication mechanism of ZeroMQ, a high-performance asynchronous messaging library. Anna uses consistent hashing to partition and replicate key-value pairs across actors. Following the Dynamo design, Anna applies a CRC32 hash on the id to assign the actor to a position on the hash ring, and applies the same hash function to a key in order to determine the actors responsible for storing the key. Each key-value pair is replicated N-1 times on the clockwise successor actors, where N is the user-provided replication factor. Anna handles actor joining and departure in a similar fashion to Dynamo. Anna actors support three operations: GET, PUT and DELETE. GET retrieves the value of a key from a (single) replica. Coordination-free consistency does not require a quorum, so GET need not merge values from more than one replica. The GET response may be stale; the staleness is bounded by the multicast period, which is an adjustable parameter that balances performance against staleness. PUT persists the merge of a new value of a key with a (single) replica, using the lattice merge logic. DELETE is implemented as a special PUT request with an empty value field. Actors free the heap memory of a key/value pair only when the DELETE's timestamp dominates the key's current timestamp. To completely free the memory for a key, each actor maintains a vector clock (associated with each key-value pair) that keeps track of the latest-heard timestamps of all actors, which is kept up-to-date during multicast. Client proxies interact with actors to serve user requests.
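Before getting to the proxy layer, here is a minimal sketch of the two mechanics just described: a last-writer-wins lattice whose merge is commutative, associative and idempotent, and a Dynamo-style consistent-hashing ring. The types and names (`LwwValue`, `HashRing`, `OwnerOf`) are hypothetical, my own for illustration; Anna's actual lattice library is a richer hierarchy of composite lattices that backs its different consistency levels.

```cpp
#include <cstdint>
#include <map>
#include <string>

// A minimal last-writer-wins lattice. merge() is commutative,
// associative and idempotent, so replicas can apply updates in any
// order, any number of times, and still converge -- the property
// Anna's wait-free, coordination-free actors depend on.
struct LwwValue {
  uint64_t timestamp = 0;  // logical timestamp of the write
  std::string payload;     // opaque value bytes

  void merge(const LwwValue& other) {
    if (other.timestamp > timestamp) *this = other;  // keep dominating write
  }
};

// A toy consistent-hashing ring: actors sit at hashed positions on a
// 32-bit ring (Anna hashes with CRC32); a key is owned by the first
// actor at or clockwise after its hash, and would then be replicated
// on the next N-1 clockwise successors (replication omitted here).
struct HashRing {
  std::map<uint32_t, int> positions;  // ring position -> actor id

  int OwnerOf(uint32_t key_hash) const {
    auto it = positions.lower_bound(key_hash);
    if (it == positions.end()) it = positions.begin();  // wrap around
    return it->second;
  }
};
```

Under this scheme a PUT is just a lookup of the owning replica followed by a merge; because merge tolerates reordering and duplication, the ZeroMQ pub-sub multicast that gossips updates between replicas needs no acknowledgements or quorums.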
In addition to GET, PUT, and DELETE, proxies expose two special operations to the users for consistency levels that involve transactions: BEGIN TRANSACTION and END TRANSACTION. All operations that fall in between a pair of these special operations belong to a single transaction. A transaction ID is uniquely generated by concatenating a unique actor sequence number with a local timestamp. The paper performs a comparison against popular KVSs designed for different scale points (Redis for single-node settings and Apache Cassandra for geo-replicated settings) and finds that Anna's performance is competitive at both scales while offering a wider range of consistency levels.
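The transaction-ID scheme is simple enough to sketch. The following is an illustrative guess at its shape based only on the sentence above; the clock source and field layout are assumptions, not the paper's code.

```python
# Illustrative sketch of the transaction-ID scheme described above: a unique
# actor sequence number concatenated with a local timestamp.
import time

def new_transaction_id(actor_seq_num):
    # e.g. "3:1686355200123456"; unique as long as this actor's clock never
    # hands out the same timestamp twice.
    return f"{actor_seq_num}:{time.time_ns() // 1000}"

print(new_transaction_id(3))
```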
1,045
BAD
Anti-Piracy Program Accused of Violating Citizens' Fundamental Rights in France (torrentfreak.com) Since 2010 France has monitored and stored data on millions of internet users as part of an anti-piracy scheme featuring warning letters, fines, and ISP disconnections. Europe's highest court will soon decide whether the program is permissible under EU law. Digital rights groups insist that, as a general surveillance and data retention scheme, it violates fundamental rights. When the French government formed a new anti-piracy agency called Hadopi, the mission was to significantly disrupt BitTorrent and similar peer-to-peer file-sharing networks. Hadopi was a pioneer of the so-called graduated response scheme, which consists of monitoring a file-sharer's internet activities and following up with a warning notice to deter their behavior. Any future incidents attract escalating responses, including fines and internet disconnections. Between 2010 and 2020 Hadopi issued 12.7 million warning notices at a cost to French taxpayers of 82 million euros. The program's effect on overall piracy rates remains up for debate, but according to French internet rights groups, Hadopi doesn't just take citizens' money. When it monitors citizens' internet activities, retains huge amounts of data, and then links identities to IP addresses to prevent behavior that isn't a serious crime, Hadopi violates fundamental rights. Despite its authorization under the new law, the official launch of the Hadopi agency in 2009 met with significant opposition. File-sharers had issues with the program for obvious reasons, but for digital rights group La Quadrature du Net, massive internet surveillance to protect copying rights had arrived at the expense of citizens' fundamental right to privacy. La Quadrature's opposition to the Hadopi anti-piracy program focuses on the law crafted to support it. One of the implementing decrees authorizes the creation of files containing internet users' IP addresses plus personal identification data obtained from their internet service providers. According to the digital rights group's interpretation of EU law, that is unlawful. With support from the Federation of Associative Internet Service Providers, French Data Network, and Franciliens.net, in 2019 La Quadrature filed an appeal before the Council of State (Conseil d'État) requesting a repeal of the decree that authorizes the processing of personal information. The Council of State referred the matter to the Constitutional Council, and its subsequent decision gave La Quadrature the impression that Hadopi's position was untenable. For their part, Hadopi and the government reached the opposite conclusion. The Council of State heard La Quadrature's appeal and then referred questions to the Court of Justice of the European Union (CJEU) for interpretation under EU law. EU member states cannot pass national laws that allow for the general and indiscriminate retention of traffic and location data. As a preventative measure, retention of traffic and location data on a targeted basis is permitted, but only when the purpose of retaining the data is to fight serious crime. In CJEU Advocate General Szpunar's non-binding opinion issued last October, the friction between privacy rights and the ability to enforce copyrights was on full display. AG Szpunar described Hadopi's access to personal data corresponding to an IP address as a serious interference with fundamental rights.
These data points may not be sensitive in isolation, but when combined, a person's identity finds itself attached to the IP address and the content that was accessed behind it. However, in common with criminal cases where retention is permitted when an IP address is the only means of investigation, the AG concluded that the same should apply in Hadopi's case, short of accepting general impunity for offenses committed exclusively online. Faced with an opinion that recognizes the difficulties faced by rightsholders but runs up against case-law, AG Szpunar proposed a readjustment of the case-law of the Court. This would ensure that rightsholders retain the ability to enforce their rights when an IP address is the only means by which an infringer can be identified (CJEU pdf). The first hearing in the case took place on Tuesday, with another legal opinion expected late September 2023. The CJEU is expected to hand down its ruling before the end of the year.
1,058
BAD
Anti-royal protesters are being arrested in the U.K. as 'Not My King' tag grows (npr.org) By Bill Chappell, Monday, September 12, 2022, 1:16 PM EDT. "Not my king." "He is and there is nothing you can do about it." That exchange between two commenters on Twitter encapsulates the current friction in the United Kingdom, where King Charles III's ascension to the throne has sparked anti-royalist protests. In dramatic ways the dynamic is playing out in real life: police have made a series of arrests of people protesting the monarchy in recent days, raising questions about the right to freedom of speech as the U.K. proclaims a new head of state to replace the late Queen Elizabeth II. Police in Scotland allege breaches of the peace. In widely seen photos, a woman holding a sign reading "Abolish monarchy" and "F*** imperialism" was arrested on Sunday at St. Giles' Cathedral in Edinburgh, where the queen's body is to lie at rest until Tuesday. "Let her go! It's free speech!" one man shouted, according to The Scotsman newspaper. Others yelled back, "Have some respect." The woman, 22, was arrested in connection with a breach of the peace, a Police Scotland spokesperson told NPR, adding that she was formally charged and released, and her case is now pending at Edinburgh Sheriff Court. Similar reports emerged Monday, including one case involving a man who was seen being forcibly removed from a parade barrier after shouting at the royal procession leading to the cathedral. He was reportedly heckling Prince Andrew. And in London, a woman was led away by four uniformed officers on Monday after holding up a sign near Westminster Hall reading "Not my king," a slogan that has become a trending hashtag. Protester says he didn't disrespect the late queen. The most high-profile case comes from Oxford, where author and activist Symon Hill says he was arrested for protesting Charles' ascension in what he called an outrageous assault on democracy. Hill said he objected only to Charles being proclaimed king, and that he didn't do anything to disrespect Elizabeth or disrupt those mourning her. "It was only when they declared Charles to be 'King Charles III' that I called out 'Who elected him?'" Hill said as he related his experience. "I doubt most of the people in the crowd even heard me. Two or three people near me told me to shut up." The U.K. adopted new laws on protests this year. Hill said police initially told him he was arrested under the U.K.'s recently toughened laws on protests, a change that came after advocacy groups such as Extinction Rebellion and Black Lives Matter mounted large and/or disruptive protests in recent years. The new law allows police to act in cases which they deem to be unjustifiably noisy protests that may have a significant impact on others or seriously disrupt an organization's activities. The arrests highlight individuals' troubles with the authorities, but large groups have also chosen, or been forced, to change their plans due to Elizabeth's death. Extinction Rebellion had for months planned to hold a large protest in London this past weekend, dubbed the Festival of Resistance. But after the queen died, the group postponed its plan, saying that occupying a Royal Park at this time would not be practical. Groups such as Republic (in the U.K., to be a republican means to support the end of the monarchy) are amplifying calls for change, sharing comments from people who are comparing the crackdown on protests in the U.K. to the actions of authoritarian regimes.
Republic said on Monday that it's contacting police agencies to ask about the rules for demonstrators, adding that it intends to organize protests around Charles' coronation and will expect those protests to be allowed to go ahead peacefully.
1,062
BAD
Apollo will close down on June 30th https://old.reddit.com/r/apolloapp/comments/144f6xm/apollo_will_close_down_on_june_30th_reddits/ Apollo will close down on June 30th. Reddit's recent decisions and actions have unfortunately made it impossible for Apollo to continue. Thank you so, so much for all the support over the years. Announcement (self.apolloapp), submitted 11 hours ago by iamthatis (Apollo Developer)

Hey all, it's been an amazing run, thanks to all of you. Eight years ago I posted in the Apple subreddit about a Reddit app I was looking for beta testers for, and my life completely changed that day. I had just finished university and an internship at Apple, and wanted to build a Reddit client of my own: a premier, customizable, well-designed Reddit app for iPhone. This fortunately resonated with people immediately, and it's been my full-time job ever since. Today's a much sadder post than that initial one eight years ago. June 30th will be Apollo's last day. I've talked to a lot of people and come to terms with this over the last weeks as talks with Reddit have deteriorated to an ugly point, and in the interest of transparency with the community I wanted to talk about how I arrived at this decision; if you have any questions at the end, I'm more than happy to answer. This post will be long, as I have a lot of topics to cover. Please note that I recorded all my calls with Reddit, so my statements are not based on memory but on the recorded statements by Reddit over the course of the year. One-party-consent recording is legal in my country of Canada. Also, I won't be naming names; that's not important and I don't want to doxx people. On April 18th, Reddit announced changes that would be coming to the API, namely that the API is moving to a paid model for third-party apps. Shortly thereafter we received phone calls; however, the price (the key element in an announcement of a move to a paid API) was notably missing, with the intent to follow up with it in 2-4 weeks.
The information they did provide, however, was: we will be moving to a paid API, as it's not tenable for Reddit to pay for third-party apps indefinitely (understandable, agreed), so they're looking to do equitable pricing based in reality. They mentioned that they were not looking to be like Twitter, which has API pricing so high it was publicly ridiculed. I was excited to hear these statements, as I agree that long-term, Reddit footing the bill for third-party apps is not tenable, and with a paid arrangement there's a great possibility for developing a more concrete relationship with Reddit, with better API support for users. I think this optimism came across in my first post about the calls with Reddit.

Six weeks later they called to discuss pricing. I quickly put together a small app where I could input the prices and it would output monthly/yearly cost, cost for free users, paid users, etc., so I'd be able to process the information immediately. The price they gave was $0.24 for 1,000 API calls. I quickly inputted this in my app and saw that it was not far off Twitter's outstandingly high API prices, at $12,000, and with my current usage would cost almost $2 million per month, or over $20 million per year. That is not an exaggeration; that is just multiplying the 7 billion requests Apollo made last month by the price per request. (These figures are tallied in the short script further below.) Could I potentially get that number down? Absolutely, given some time, but it's illustrative of the large cost that Apollo would be charged.

Reddit's promise was that the pricing would be equitable and based in reality. The reality that they themselves have posted data about over the years is as follows (copy-pasted from my previous post): Less than 2 years ago they said they crossed $100M in quarterly revenue for the first time ever; let's assume, despite the economic downturn, that they've managed to do that every single quarter since, and that for your best quarter you've doubled it to $200M. Let's also be generous and go far, far above industry estimates and say you made another $50M in Reddit Premium subscriptions. That's $550M in revenue per year; let's say an even $600M. In 2019 they said they hit 430 million monthly active users, and to also be generous let's say they haven't added a single active user since then (if we do revenue-per-user calculations, the more users, the less revenue each user would contribute). So at generous estimates of $600M and 430M monthly active users, that's $1.40 per user per year, or $0.12 monthly. The numbers they've given are also seemingly in line with industry estimates. Apollo's price would be approximately $2.50 per month per user, with Reddit's indicated cost being approximately $0.12 per their own numbers. A 20x increase does not seem based in reality to me.

This was a very common comment across the topics: If Apollo has an apparent opportunity cost of $20 million per year, why not just buy them and other third-party apps, as they did with Alien Blue? I believe it's a fair question. If these apps apparently cost so much, an easy solution that would likely make everyone happy would be to simply buy these apps out. So I brought that up to them during a call on May 31st, where I was suggesting a variety of potential solutions. About 24 hours after that call with Reddit, I received this odd message on Mastodon: "Can you please comment publicly about the internal Reddit claim that you tried to blackmail them for a $10,000,000 payout to stay quiet?"
Then yesterday, moderators told me they were on a call with CEO Steve Huffman (spez), and he said the following, per their transcript: "Steve: Apollo threatened us, said they'll 'make it easy' if Reddit gave them $10 million. Steve: This guy behind the scenes is coercing us. He's threatening us." Wow. Because my memory is that you didn't take it as a threat, and you even apologized profusely when you admitted you misheard it. It's very easy to take a single line and make it look bad by removing all the rest of the context, so let's look at the full context. I can only assume you didn't realize I was recording the call, because there's no way you'd be so blatantly lying if you did.

As said, a common suggestion across the many threads on this topic was: "If third-party apps are costing Reddit so much money, why don't they just buy them out like they did Alien Blue?" That was the point I brought up. If running Apollo as it stands now would cost you $20 million yearly, as you quote, I suggested you cut a check to me to end Apollo. I said I'd even do it for half that, or six months' worth: $10 million. What a deal! The bizarre thing is that - initially - on the call you interpreted that as a threat. Even giving you the benefit of the doubt that maybe my phrasing was confusing, I asked you to elaborate on how you found what I said to be a threat, because I was incredibly confused how you interpreted it that way. You responded that I said "Hey, if you want this to go away..." Which is not at all what I said, so I reiterated that I said "If you want Apollo to go quiet," as in it's quite loud in terms of API usage. What did you then say?

Me: I said 'If you want Apollo to go quiet'. Like in terms of- I would say it's quite loud in terms of its API usage.
Reddit: Oh. Go quiet as in that. Okay, got it. Got it. Sorry.
Reddit: That's a complete misinterpretation on my end. I apologize. I apologize immediately.

The admission that you mistook me and the four subsequent apologies led me to believe that you acknowledged you mistook me and you were apologetic. The fact that you're pretending none of this happened (or was recorded), and that instead of apologizing for taking it as a threat you're going in the complete opposite direction and saying "He threatened us!", is so low I almost don't believe it. But again, I've recorded all my calls with you, just in case you tried something like this. Transcript of this part of the call: https://gist.github.com/christianselig/fda7e8bc5a25aec9824f915e6a5c7014 Audio of this part of the call: http://christianselig.com/apollo-end/reddit-third-call-may-31-end.m4a (If you take issue with the call being recorded, please remember that I'm in Canada, and so long as one participant in the call (me) consents to being recorded, it's legal. If anyone would like the recording of the full call, I'm happy to provide it.) I bring this up for two reasons.

Some people are confused about this situation and don't understand what an API is. An API (Application Programming Interface) is just a way for an app to talk to a website. As an analogy, pretend Reddit is a bouncer. Historically, you can ask Reddit "Could I have the comments for this post?" or "Can you list the posts in AskReddit?". Those would be one API request each, and Reddit would respond with the corresponding data. Everything you do on Reddit is an API request: upvoting, downvoting, commenting, loading posts, loading subreddits, checking for new messages, blocking users, filtering subreddits, etc.
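To make the "one API request" idea concrete, here is a minimal editorial illustration (not from the post): a single HTTP request to Reddit's public JSON listing endpoint returns a subreddit's posts as structured data. Third-party clients like Apollo use the authenticated OAuth API instead, but each such call is exactly the kind of request being metered.

```python
# One API request: ask Reddit for the hot posts in a subreddit and get
# structured data back. (Public JSON endpoint shown for illustration; real
# clients authenticate via OAuth and are subject to rate limits.)
import requests

resp = requests.get(
    "https://www.reddit.com/r/AskReddit/hot.json",
    headers={"User-Agent": "api-example/0.1"},  # Reddit expects a UA string
    params={"limit": 5},
)
for post in resp.json()["data"]["children"]:    # one request, many posts
    print(post["data"]["title"])
```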
The situation is changing so that for each API request you make, there's a portion of a penny charged to the developer of that app. I think that is very reasonable, provided, well, that the price they charge is reasonable. Another common claim by Reddit is that Apollo is inherently inefficient, using on average 345 requests per day per user while some other apps use 100. I'd like to use some numbers to illustrate why I think this is very unfair framing. Up until a week ago, the stated Reddit API rate limit that apps were asked to operate within was 60 requests per minute per user. That works out to a total of 86,400 per day. Reddit stated that Apollo uses 345 requests per user per day on average, which is also in line with my findings. That's 0.4% of the limit Reddit was previously imposing, which I would say is quite efficient. As an analogy (can you tell I love analogies?), to scale the numbers: if I were to borrow my friend's car and he said "Please don't drive it more than 864 miles," and I returned the car with 3.4 miles driven, I think he'd be pretty happy with my low use. The fact that a different friend one week only used 1 mile is really cool, but I don't think either person is inefficient. That being said, if Reddit would like to see Apollo make further optimizations to get its existing number lower, I'm genuinely more than happy to do so! However, the 30-day limit they've given me between announcing the pricing and when I will start getting charged significant amounts of money is not enough time to deal with rewriting large parts of my app to lower total requests, while also changing the payment model, transitioning users, and ensuring this is all properly tested and gets through app review. Further, Reddit themselves said to me that the majority of the cost isn't the server, it's the opportunity cost per user, so the focus on 100 versus 345 calls, rather than the cost per user, doesn't sound genuine. At the very least, providing even a bit more time to lower usage to their new targets would be feasible, if they've historically provided it and it's not the majority of the costs anyway.

Me: Because I assume the majority of it isn't server costs. I assume the majority is the opportunity cost per user.
Reddit: Exactly.

One option many have suggested is to simply increase the price of Apollo to offset costs. The issue here is that Apollo has approximately 50,000 yearly subscribers at the moment. On average they paid $10/year many months ago, a price I chose based on operating costs I had at the time (server fees, icon design, having a part-time server engineer). Those users are owed service, as they already prepaid for a year, but starting July 1st each will (in the best-case scenario) cost an additional $1/month in Reddit fees. That's $50,000 in sudden monthly fees that will start accruing in 30 days. So you see, even if I increase the price for new subscribers, I still have those many users to contend with. If I wait until their subscriptions expire, slowly, month after month, there will be fewer of them: first month $50,000, second month maybe $45,000, then $40,000, etc., until everything has expired, amounting to hundreds of thousands of dollars. It would be cheaper to simply refund users. I hope you can recognize how that's an enormous amount of money to suddenly start incurring with 30 days' notice. Even if I added 12,000 new subscribers at $5/month (an enormous feat given the short notice), after Apple's fees that would just be enough to break even.
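For the curious, the figures in the last few sections tally as follows; this short script is an editorial check using only the post's own numbers, none of them independently verified:

```python
# Back-of-the-envelope check of the post's own figures.

# API pricing: $0.24 per 1,000 calls against ~7 billion calls/month.
monthly_cost = 7e9 / 1000 * 0.24
print(f"API cost: ${monthly_cost:,.0f}/month, ${monthly_cost * 12:,.0f}/year")
# -> $1,680,000/month, $20,160,000/year ("almost $2 million per month")

# Reddit's own revenue per user: ~$600M/year across ~430M monthly actives.
rev_per_user_month = round(600e6 / 430e6 / 12, 2)  # -> $0.12, as the post says
print(f"Reddit revenue/user: ${rev_per_user_month:.2f}/month")
print(f"markup vs Apollo's ~$2.50/user/month: {2.50 / rev_per_user_month:.0f}x")
# -> ~21x by this tally; the post rounds it to 20x

# Efficiency framing: 345 calls/user/day against the old 60/minute cap.
print(f"usage vs old cap: {345 / (60 * 60 * 24):.1%}")  # 0.4% of 86,400/day

# Subscription shortfall: 50,000 grandfathered users at ~$1/month in new fees.
print(f"new monthly fees: ${50_000:,}")
```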
Going from a free API for 8 years to suddenly incurring massive costs is not something I can feasibly make work with only 30 days. That's a lot of users to migrate, plans to create, things to test and get through app review, and it's just not economically feasible. It's much cheaper for me to simply shut down. Hopefully that illustrates why, even more than the large price associated with the API, the 30-day timeline between when the pricing was announced and when developers will be charged is a far, far, far bigger issue, and not one I can overcome. Much more time would be needed to overhaul the payment model in my app, transition existing users from existing plans, test the changes, and have users update to the new version. As a comparison, when Apple bought Dark Sky and announced a shutdown of their API, knowing that this API was at the core of many businesses, they provided 18 months before the API would be turned off. When the 18 months came, they ultimately extended it another 12 months, resulting in a total transition period of 30 months. While I'm not asking for that much, Reddit's, in comparison, is 30 days. The issue is the size of the bill, not when it will arrive. Significant, significant charges for the API will start building up, with 30 days' notice, on July 1st; the fact that the bill for those charges arrives 30 days after that is not important. If you hear that your electricity bill is going up 1000x and the company tells you "Don't worry, the bill only comes at the end of the month," I hope you understand how that isn't comforting. I hope I explained above why the 30-day time limit is the true issue. However, in a perfect world, I think lowering the price by half and providing a three-month transition period to the paid API would make the transition feasible for more developers, myself included. These concessions seem minor and reasonable in the face of the changes. That was my understanding as well, based on what they said on a call on May 4th:

Reddit: If there's an entity who's like 'Hey, I'm showing really good progress', you know, trying to, like, we're trying to get a contract in place, we're trying to do all that type of stuff, I don't think you're going to see us be, like, you know, like, overly aggressive on that timeline. And I feel pretty confident about that point, by the way, based on conversations I've heard internally.

However, when asking about more time, such as a 90-day transition period to make the changes, they said:

Reddit: On the 90-day transition, remember that billing doesn't kick in until July 1. So you won't see your first bill from July until the beginning of August, and it won't be due until the end of August (it's net-30-day billing). You do, however, have to sign an agreement to get paid-level access on July 1.

Yes, my last email to them (including Steve) said: In terms of timeline, what concerns me most is the short nature of it before I start incurring costs. I have a large amount of users at price points that I won't be able to afford to support with 30 days' notice. For instance, users who subscribed for a year for $10 six months ago, when I had no idea any of this was coming: that amounts to $0.83 per month, or $0.58 after Apple's cut. Even if I'm able to decrease my API usage down to the number in your charts, that still puts me in the red for every one of those users for a while, with no recourse. A situation like this is one that is legitimately making me lean toward shutting down the app, but one that I could salvage if given more time to transition from the free API to the paid API.
In prior calls you mentioned that, provided I kept communicating and progress was being made, the timeline wasn't an absolute. Is that still the case, or is it now the case that the date is set in stone? That was a week ago, and I've yet to receive any further contact from Reddit. To a certain extent, yes. However, I was assured this year by Reddit, not even that long ago, that no changes were planned to be made to the API Apollo uses, and I've made decisions about how to monetize my business based on what Reddit has said.

January 26, 2023, Reddit: So I would expect no change, certainly not in the short to medium term. And we're talking, like, order of years.

Another portion of the call, January 26, 2023:
Reddit: There's not gonna be any change on it. There's no plans to, there's no plans to touch it right now in 2023.
Me: Fair enough.
Reddit: And if we do touch it, we're going to be improving it in some way.

I've received so many messages from kind people offering to work with me to build a competitor to Reddit, and while I'm very flattered, that's not something I'm interested in doing. I'm a product guy, I like building fun apps for people to use, and I'm just not personally interested in something more managerial. These last several months have also been incredibly exhausting and mentally draining; I don't have it in me to engage in something so enormous. Probably not. Maybe if the perfect buyer came along who thought they could turn Apollo into something cool and sustainable, but I'd rather the app just die if it would go to a company that would turn something I worked really hard on into something that would ruin its legacy. To be clear: I am not threatening anyone in the previous paragraph. Reddit stated on the first call that they don't want to be like Twitter:

Reddit: I think one thing that we have tried to be very, very, very intentional about is we are not Elon, we're not trying to be that, we're not trying to go down that same path. [...] What we are trying to do is just use usage-based pricing that will hopefully be very transparent to you and very clear to you. Or, we're not trying to go down the same path that you may have seen some of our other peers go down.

They now state that the comparison of how close their pricing comes to Twitter's is an unfair one, and that when they said that above, they were apparently referring not to the pricing but to the decision Twitter made to ban third-party apps at a rule level, not a pricing level. I think regardless of whatever their intent/meaning behind the comparison to Twitter was, the result is the same: the pricing will kill third-party apps, just as Twitter's did. I said this to Reddit, and they responded that they don't think Twitter's pricing is unreasonable, and that if anything, if Twitter reversed the rule about third-party apps, they would probably increase the prices as well. Just to be clear about how wrong and out of touch that is, without naming names, a formerly very, very high up person at Twitter messaged me on Twitter and said: "The Reddit API moves are crazy. I'm not sure what choices you have but to move to another network. [...] That pricing is designed to prevent apps like yours forevermore." So to be clear, even this person thinks this pricing is unreasonable. I do too. I requested a call to talk to Steve about some suggestions I had; his response was "Sorry, no. You can give [name-redacted] a ping if you want."
I then emailed that person (the same person I've been talking to for months) suggestions, approximately one week ago, about how Apollo could survive this, and I've yet to receive a response. Abundantly. Unlike other social media companies like Facebook and Twitter, who pay their moderators as employees, Reddit relies on volunteers to do the hard work for free. I completely understand that when the tools they use to do their important volunteer job are taken away, there is anger and frustration there. While I haven't personally mobilized anyone to participate in the blackout, out of fear of retaliation from Reddit, the last thing I want is for that to feel like I don't support the folks speaking up. I wholeheartedly do. It's been a horrible week, and the kindness Redditors and moderators and communities have shown Apollo and other third-party apps has genuinely made it much more bearable, and I am genuinely so appreciative. I am admittedly doubtful Reddit wants to listen to folks anymore, so I don't see it having an effect. In hindsight, kinda, yeah. Many of the other developers and folks I talked to were much less optimistic than I was, but I legitimately had great interactions with Reddit for many years prior to last week (they were kind, communicative, gave me heads up of changes), so when they said they were aiming to have pricing that would be fair and based in reality, I honestly believed them. That was foolish of me in hindsight, and maybe this could have had a different outcome if I had been more aggressive in the beginning. Sorry. /canadian (And to be clear, they did indeed say this. They used the word "substantive," and I wanted to make sure we had the same definition: something "having a firm basis in reality and therefore important, meaningful, or considerable.")

Reddit: That's exactly right. And I think, thankfully, the word is exactly the right one. It's going to have a firm basis in reality. I also just looked it up. We're going to try to be as transparent as we can.

On May 31st, Reddit posted a chart of large excess usage by some unlabeled API clients and stated: "We reached out to the most impactful large scale applications in order to work out terms for access above our default rate limits via an enterprise tier." To be clear, Apollo was never contacted, and I've been told by someone internally that Apollo is indeed not one of the unlabeled API clients. The only time Apollo was reached out to by Reddit in any capacity about usage was late last year, when we received an email about a 6-minute period where Apollo's server API usage increased by 35% before lowering again. Despite 35% for 6 minutes being a comparatively small blip (the above post references clients that are over by 500,000%), we responded within 2 minutes. We offered to jump on a call with Reddit engineers if they needed an answer ASAP, identified the issue within several hours, and Reddit thanked us for the fast investigation. Full email transcript: https://gist.github.com/christianselig/6c71608cf617d2f881cd2849325494c1 On the call with moderators, Steve Huffman said:

Steve: I don't use the app, so I'll give you the best answer I can -- he does scraping so that he can deliver notifications faster, but has done NO EFFORT to be a good citizen of the internet.

First off, Apollo does no scraping; everything is done purely through authenticated calls to the API, and there are checks in place to ensure it stays within Reddit's API rate limits. I've open-sourced the server code to show this. Secondly, to say we have made no effort is categorically false.
I have so many emails where I've reached out to Reddit expressing concerns about bugs and inefficiencies in the API, or ideas on how to improve things, or significant Reddit bugs that made things hard on us. When Reddit has had questions for us, as discussed above, we immediately jumped into action to get an answer as quickly as possible. Here's an email of me giving Reddit a heads up about IP address changes on our server:

Me: With the new change it'll be maybe like one IP address. This is all obviously still within the API rate limits, as the requests are from individual user accounts that have signed in. Again, long story short, the result will be more optimized if anything, I just wanted to give a heads up and ensure that it'd be okay if Reddit suddenly saw the server go from a bunch of different IP addresses to a single one, which might cause some confusion if I didn't give a heads up.

Me wanting to make sure we were doing everything as best we could:

Me: Everything is going well, we just had a few questions about best practices, making sure we're following any suggestions your team has. Is there any way we could poke someone on your team with a few questions we've been having and have a tiny back and forth? We were just seeing some elevated response times and just thought it would be great if we could maybe describe what we're doing and see if anything seems off/suboptimal.

Me reporting to Reddit that the API has a serious bug in recording rate limits:

Me: We obviously respect the rate limit headers, and if a user comes close to approaching it (within 50 requests of the 600-every-10-minutes limit) we stop their requests until the refresh period occurs. However, we're seeing some users have very, very weird rate limit headers. Things like "requests remaining: 0, requests made: 17,483, reset: 598 seconds left," which indicates they've somehow made over 17 thousand requests in two seconds, which seems hard to believe.

Me suggesting to Reddit improvements that could help the efficiency of notification API calls:

Me: So like little stuff like that, where even if there's a streaming client or some way to minimize the calls there, I think it would help us both out enormously.

Further, your own employees have themselves expressed concern about how terrible the public API is. Call on January 26, 2023:

Reddit: I cannot tell you how painful it is to use our API. [...] The API needs to change. Like, it's just unusable. I am surprised that you're able to build a functional app on it, to be honest.

Steve: Why not work with the third party apps? Their existence is not a priority for us. We don't use them. I don't use them. It's a part of our traffic, but not a lot, and it's a lot of work on our side to keep them alive. If I have to choose where to put our effort, we're going to focus internally. I'm kind of open to it, but I haven't, and I can't convince you, but I don't get the sense that they want to work with us either.

I'm genuinely not sure where Steve got the impression that I don't want to work with him. Despite reaching out multiple times and him declining to talk, I've stated multiple times on calls, literally saying the words, "I definitely still want to talk."

Reddit: What I'm hearing is like, "Yeah, great. We have this disagreement on pricing methodology, etc." But any feasible number that we get to, any number that's even in the zip code of what we're sharing with you, is unfeasible from your perspective financially.
So it's like arguing around the edges of that price thing is like, it just won't make any sense to you. And I presume also, just given the NSFW stuff and the removal of ads, that makes it even more trickier.

Me: Yeah. I mean, to be very clear, I'm not saying I'm walking away from the negotiation table and taking my basketball and going home and just gonna kick up a storm. That's not my intention at all. I definitely still want to talk. I'm not asking you to lower the price by a hundred times or something. I don't think, depending on what you mean by zip code, I don't think I'm so unreasonable that I'm requiring you to bend over backwards here.

I've also emailed Steve and the other contact directly, stating that I'm interested in talking and including ideas for how we could come to a solution:

Me: I understand where Reddit's coming from in this. A free API, while appreciated, is not tenable for you, especially heading into an IPO, and my only goal here is to come to a solution where we both feel understood. I also hear you that killing third-party clients isn't actually the goal, and in that spirit have been working on how to address your concerns from my end: [...]

I don't know how you can say I'm not interested in talking when you haven't answered my most recent email in a week. To say it once more, I was very interested in talking. On the other side of things, per the transcript, Steve and the other admin on the call don't even know when the discussions with third-party apps began.

Steve: When did we start talking with them?
AnAbsurdlyAngryGoose: What month did you first start?
Steve: FlyingLaserTurtles? Do you remember? April or May of this year.
FlyingLaserTurtles: Maybe late March? But yes.
Steve: We've been in contact with third party apps for MONTHS talking about these coming changes.

When you announce that the API will be charging developers, the most important portion of that conversation is what will be charged, which was not available for almost two months after the initial call. From the time developers were told the price to the time developers will be subject to the price is 30 days, not months. Months would have been very helpful, in fact. I've been talking to my rep at Apple, and over the next few weeks my plan is to release something similar to what Tweetbot did (Paul has been incredibly helpful in all of this), where folks can decide if they want a pro-rated refund on any existing time left in their subscription, as Apollo will not be able to afford to continue it, or they can decline the refund if they're feeling kind and have enjoyed their time with Apollo. For the curious, refunding all existing subscriptions will, by my estimates, cost me about $250,000. Apollo got mentioned a few times during Apple's 2023 WWDC keynote, even by Craig Federighi himself, and even during the Vision Pro announcement, showing Apollo as one of the existing apps compatible with the headset (I'm sorry I won't be able to see that happen). I was lucky enough to be there in person, and it felt incredible. Some folks asked if there was any deeper meaning behind that, and while that would be cool, in all reality these things are so well produced that they've been done for a while now, so I'm sure it's just a coincidence, even if it's a really cool one. A funny amount of people have reached out wondering about all the extra monthly icons I had queued up for Apollo.
I love them, was so excited for them, and I'll make them available immediately for the short time left, but if you're curious, here's a screenshot of all of them: https://christianselig.com/apollo-end/remaining-icons.png We ended up with well over 100 custom icons created by incredibly talented designers, and I'm really sorry to those designers who didn't get to see their work launched in the app (to be clear, don't worry, I paid them all, there isn't some bs exposure agreement, but it's fun to have your icon launch and I feel bad!). In order to avoid incurring charges, I will delete Apollo's API token on the evening of June 30th PST. Until that point, Apollo should continue to operate as it has, but after that date, attempts to connect to the Reddit API will fail. I will put up an explainer in the app prior to that, which will go live on that date. I will also provide a tool to export any local data you have in Apollo, such as filters or favorites. I want to thank a lot of people who have made this last week bearable. First and foremost, the communities, Redditors, and moderators who have reached out in support of third-party apps, making Reddit's gaslighting a lot more bearable in making me feel like at least someone was understanding me and in my corner. My girlfriend's been absolutely incredible and supportive. This year was our 10th anniversary, and Monday was her 30th birthday. We're down in California for Apple's WWDC and had a bunch of things planned to do for her birthday afterward, and I feel terrible that we're flying home early to deal with all of this instead of making her 30th special. I'll make it up to her. André Medeiros worked on the Apollo server component with me for the last two years, and it's been an absolute joy to work with a professional who knows so much on that side of things. The iOS developer community has been unbelievably kind to me over the past several weeks; I've spent the last week with many of them, even staying at an Airbnb with a bunch of them (they ordered me pizza as I wrote this post!), and I've got so many hugs and condolences, haha. Specifically want to thank Paul Haddad of Tweetbot/Tapbots/Ivory, Ryan Jones, Brian Mueller, Curtis Herbert, André Medeiros, Quinn Nelson, Paul Hudson, Majd Taby, Ryan McLeod, Phill Ryu, Larry Hryb, Charlie Chapman, Mustafa Yusuf, Adrian Eves, Devin Davies, Jordan Morgan, Yariv Nassim, Will Sigmon, Barry Hershman, Joe Rossignol, Michael Simmons, Joe Fabisevich, my family, and so, so many more. Also want to thank everyone at Apple who has gone out of their way to be incredibly kind here (I don't know if I'm allowed to name names, but you know who you are). No bullshit: I'll be fine. Through pure chance last year, I spun off my silly Pixel Pals idea into a separate app, and that actually makes good revenue on the side. I also have savings. Recently (like, last week) my city had its worst wildfires in history, with over 100 homes destroyed. That's brutal; losing an app is sad, but it's been helpful to me to recognize how much worse it could be, just literally down the street from me. Honestly, Apollo had an incredible run. I met the coolest people, by my last count talked with folks over 15,000 times in our subreddit about Apollo, and raised over $80,000 for my local animal shelter through Apollo. I feel incredibly fortunate. I think I'll rewatch Ted Lasso, though. I built a second app called Pixel Pals that I spun off from Apollo; it's thankfully done pretty well, and I'll be spending more time on it going forward. If you like the idea of digital pets, it's a really fun app to check out.
https://pixelpa.ls If any media/press folks have any questions, please shoot me an email rather than messaging me on Reddit; I missed a few last week because my inbox was blowing up. My email is me@christianselig.com I think I covered everything, but if there are any questions, feel free to ask and I'll do my best to answer! In the event that this post is taken down, or you want to link somewhere else, it's also available at https://apolloapp.io Thanks for everything over these last 8 years - Christian

EDIT: A few updates: Per many requests, I also added back the Tip Jar to the top of settings if you update the app. It's incredibly kind of anyone to even think of that, but please feel no pressure. On one hand I don't want it to feel like I'm profiteering off this event, but on the other hand, I imagine people understand it would have been much more profitable/ideal if the app were able to just continue to exist in the first place (so that would be really bad profiteering), and the refund thing genuinely is daunting. I've seen a lot of questions along the lines of "What if Reddit gives you a deadline extension because of this post and posts by other developers?", and that's something I truly would have loved for them to have made an effort to communicate earlier. You can't give developers 30 days between when the pricing is announced and when they will start incurring charges, and also wait a week (25% of the time we're given) between replying to emails, without so much as a "we hear you're concerned about the short timeline and are looking into what we can do." In conjunction with your previous emails, it just appears like you've stopped any desire to communicate with developers, in a period where we have a serious, expensive deadline looming with not that much time to wind down our apps. And I also just know that if I sent another email saying "I'm going to post tomorrow that Apollo is shutting down unless you do something about the timeline," it would be construed as a threat. Even more than that, Reddit's behavior has been so appalling that for any developer I've talked to, it's completely erased the indication that they even want us around. Post a comment!

[] Mathesar 818 points 11 hours ago (43 children) Oh lawdy, there are tapes.
[] geckospots 340 points 11 hours ago (2 children) He's got the receipts.
[] Smooth-Question-3069 71 points 10 hours ago (7 children) Yeah, absolutely legendary move getting these bozos on tape. Insane spin by Reddit. Not that I expected anything less, but this is a disaster; this valuation is cratering.
[] lman2121 3723 points 11 hours ago (330 children) Craig's widgets :(
[] glenn1812 2063 points 11 hours ago (269 children) Wish Apple could intervene somehow. One of the best apps on iOS. God damn, I can't believe that in a few months I won't be browsing Reddit at the end of the day ever again.
[] BlesserBot 993 points 11 hours ago (94 children) Apple won't be able to do anything :( It's REDDIT who, with their outlandish API pricing, brought us to this stage. Fuck this, I am out.
[] fs454 816 points 10 hours ago (107 children) /u/spez I hope you enjoy the large exodus of users. Should have just worked with him to add first-party ads or something. And maybe not be greedy Elon clones.
[] Mister_Hangman 391 points 9 hours ago (52 children) Dear /u/spez: Sincerely, get fucked. What an absolutely stupid business decision...
...but then again, you clowns and your private equity bullshit. This is why Digg v4 happened. Now you'll see it happen here.
[] SkyGuy182 3312 points 11 hours ago (37 children) Apollo is an absolutely amazing app that I use to judge all other iOS apps. I will sincerely miss this app, the people who love it, and Christian's engagement with the community. My Reddit usage will almost certainly die alongside Apollo. I'm pouring one out for you, u/iamthatis. Thank you so much for such an incredible app. I'd be lying if I said I wasn't crying a little right now.
[] jerrymandarin 82 points 10 hours ago (7 children) Truly, Apollo is so intuitive and so seamless. Other iOS app developers should aspire to create something as functional as Apollo.
[] everythingisreallame 79 points 11 hours ago (12 children) Same, I'm trying to think of any other way I'd get on Reddit besides my laptop on old.reddit, but I can't think of any that I want to waste my time with on my phone. Been on with a handful of different accounts for over 11 years, but I'm probably done end of the month.
[] ImJustAverage 79 points 11 hours ago (6 children) I exclusively use Reddit on Apollo. Once it's gone, that's probably going to be the end of Reddit for me, unfortunately. This is an amazing app with a great developer that listens to his users and actually communicates with them. No other app has come close to the dedication and support.
[] changelog (Apollo Server Man) 5840 points 11 hours ago (117 children) Working with you on this has been nothing short of a blessing. Thank you for everything, /u/iamthatis <3 Edit: I'd like to remind folks that Apollo does have a tipping function. I know Christian deserves every bit of support we throw his way.
[] ap0phis 559 points 11 hours ago (12 children) Thank you.
[] iamthatis (Apollo Developer) 87 points 2 hours ago (4 children) André, it's been an absolute delight. For the uninitiated, he randomly DMed me on Twitter asking if I needed help on the server side of things, at a time when I was at wits' end about how to go any further with my server knowledge. I gleefully agreed, and it's been a delight working together since. I'm an iOS developer, so backend server design isn't one of my biggest strengths, and having someone who knew what they were doing on the server side of things was so, so nice.
[] lawofficeofbobloblaw 80 points 10 hours ago (4 children) Thank you for your service, serverman!
[] NotTheSicario 542 points 10 hours ago (58 children) This came up as a talking point during lunch among everyone on the floor I work on. We discovered that around 60% use Apollo, 15% use other third-party apps, and the remaining 25% don't use Reddit at all. The reason Reddit is doing this is because their investors want full control over what their users see and what they can do on Reddit. Considering Reddit only exists because of the content their users post, this seems like they're trying to destroy Reddit from within. During our discussion we came to a consensus that the likelihood of any of us using Reddit if the third-party apps disappeared was basically zero. Reddit's own app is lacking so many features, and their new desktop UI seems like a garbage TikTok clone. Thank you for all the hard work Apollo has been doing since its inception, and I wish you all the best in the future.
[] ElectronGuru 155 points 10 hours ago (12 children) It's worse than that. Twitter and Facebook pay their mods. Reddit mods are all volunteer. And many/most use Apollo. All that free labor is about to go poof!
[] RaDiOaCtIvEpUnK 41 points 8 hours ago (10 children) This is what seems the most shortsighted about all of this. Without subreddit mods, the quality of subs is gonna go waaaay down. This in turn will drive people elsewhere. This in turn will cut into their revenue. Seeing as Reddit is so heavily dependent on its volunteer moderators, I can't see this being anything but suicide on their part. The fact that they're going publicly traded AFTER doing this just makes this even stupider. I can't see this as anything but pure idiocy in every way from them. How can a company be run so out of touch with how it operates?
[] AkioDAccolade 74 points 10 hours ago (25 children) old.reddit.com is 100% on the block next; no way they kill something like Apollo, which has so many users, without killing old.reddit.com, which avoids a ton of the new Reddit bullshit. I don't see ads at all on my Reddit feed because of pihole+old.reddit.com.
[] Dyan654 2320 points 11 hours ago (40 children) This is legitimately heartbreaking. I downloaded Apollo on the first day it was out of beta, and it's unequivocally my favorite app. It's one of the only third-party apps I use that is so well designed, so emblematic of Apple's HIGs, that it might as well be a first-party application. As far as I'm concerned, it's literally perfect. I'm absolutely disgusted at the way Reddit has conducted themselves. It's embarrassing, unprofessional, and imo libelous. I think you'd have a good case for a libel lawsuit, especially given your documentation. I suppose this isn't a surprise given Reddit's precious corporate bullshittery, but this is a new low. /u/iamthatis - if the June 12th blackouts are effective in making Reddit change their course, and they issue a full-throated apology, would you be interested in continuing the app's operation? I get it if not - the way you've been treated is reprehensible - but it's an important question to ask. Thanks again for all your work. If Apollo indeed shuts down, I swear to god I'll never use Reddit again. May their IPO burn. IMPORTANT EDIT: As far as I can tell, this post is being suppressed. Check out the difference between when I'm logged in and logged out (same exact page, 1 second apart): https://imgur.com/a/7qHnOyM . Could be because of this feature, but it's hard to say.
[] IndexZer0 85 points 8 hours ago (4 children) I'd love for this to be higher up so as to get an answer to this. It's a well-worded question, and I'm in total agreement, the treatment of /u/iamthatis being uncalled for. Apollo came with the receipts and proper math for how Reddit is just being disingenuous with their asks. I think I like this question because I'm still holding out hope for a miracle here. I guess the only social media I use is something I'm going to have to stop using. I mean... what now? Do I have to go socialize in the real world?!? That's not what's supposed to happen in 2023!
[] Redtailcatfish 374 points 10 hours ago (10 children) This is way too far down. OP, please consider talking to a lawyer. You may have a case here. This is extremely unprofessional at best, and you have troves of evidence to support your claims.
Regardless of your decisions on Apollo, you will have to work in a world where your reputation has been damaged, potentially forever, after this. That's a big deal.
[] T-MoneyAllDey 63 points 9 hours ago (3 children) OP probably doesn't have the energy to care, which is often a seasoned dev's outlook.
[] Defying 4565 points 11 hours ago (250 children) Their fucking claims of you blackmailing them are incredibly insane. Fuck this place.
[] disgruntled_pie 1616 points 10 hours ago (148 children) Yeah, that sounds like it borders on defamation. Maybe Christian can still get $10 million out of this after all.
[] acelsilviu 1180 points 10 hours ago (138 children) They were internal statements though. It might have been possible (and funnier) if he'd waited for spez or an admin to publicly say that shit, and only then hit them with the "btw, I recorded everything you said." Edit - that was wrong, Reddit did publicly state that Christian threatened them. Hoooooo boy. Edit2 - as several people have pointed out, the distinction seems to be legally irrelevant anyway; thanks /u/Professional_Row340 /u/throwaway39402 /u/noturlawyer
[] Manny-Both-Hanz 433 points 9 hours ago (68 children) Yeah, and they even doubled down on it: "Apollo threatened us, said they'll make it easy if Reddit gave them $10 million." I am confused about this bullet point, can anyone clarify what it actually means? Apollo threatened who? Where? And what does that $10M figure have to do with anything? BuckRowdy replied: "Reddit is saying that when they approached the dev of Apollo about the changes, he asked them to buy his app for $10 million. They characterized it as a threat, which makes little sense."
[] Jakeman52 342 points 9 hours ago (61 children) Legitimately should have taken his offer and fired their entire mobile team if they were smart.
[] btk79 329 points 9 hours ago (46 children) 10 million for an app like Apollo is a FUCKING BARGAIN. They are incredibly dumb. The default Reddit app is ridiculous in comparison.
[] cougrrr 47 points 8 hours ago (1 child) But if Spez and company buy the app for $10 million, they can't then extort the creator of the app for $20 million, so (taps forehead).
[] SharkAttackOmNom 148 points 8 hours ago (12 children) Well, Apollo is useless to them at any rate. It's not built to push ads, supply algorithm data, or support test features like polls, live streams, whatever engagement optimizations they're trying this month. They would have a hard time taking this app and bending it to their capitalistic will. Would end up running as shit as their own app.
[] ElPlatanoDelBronx 76 points 8 hours ago (9 children) The dogshit UI on their app is the main issue. They would definitely make this one worse if they bought it, but it would still be miles better.
[] greenskye 57 points 8 hours ago (5 children) They've tried that before. All they did was run that app into the ground too.
[] MINECRAFT_BIOLOGIST 118 points 9 hours ago (19 children) Well, the thread's archived now, so there's a permanent record of what they said. Let's see how this shakes out. EDIT: Archive link - https://ghostarchive.org/archive/OQMGD Wayback Machine's being a little odd, so I used Ghostarchive.
[] werid 64 points 9 hours ago (16 children) spez has edited other people's comments before.
[] [deleted] 1678 points 11 hours ago (224 children) Thank you for everything. I'm deleting my 11-year-old account. So long [] wedid 1054 points 10 hours ago (118 children) Bro actually did it [] NazTehRpR 489 points 10 hours ago (80 children) A lot of people are actually following through. [] Blarghnog 305 points 10 hours ago (50 children) I'm just waiting until the 30th. But I'll be knocking out a half dozen accounts, some of which are over 10 years old. [] NazTehRpR 77 points 9 hours ago * (7 children) I've got 7 years on this platform and enjoy being on it. This app makes it possible, and it's sad that this is happening. Hopefully something happens before then. But I'm definitely seeing [deleted] in some places. [] [deleted] 78 points 9 hours ago (11 children) https://i.imgur.com/B3g5nGe.jpg [] Arvann 1879 points 11 hours ago (24 children) If Apollo goes, it is goodbye reddit for me. I never would have thought I would have to let go of this amazing app. Thank you, Christian, for working on Apollo for the last 8 years and delivering an app that so many people loved. Sad to see it go. Greed is evil. [] Masterofunlocking1 389 points 11 hours ago (0 children) I want to thank you for making honestly the smoothest app I've ever used. It's a shame they can't see the error of their ways already. I wish the best for you. [] tman612 8475 points 11 hours ago (433 children) I will stop using reddit on June 30 :( [] mininova721 3739 points 11 hours ago (154 children) There's no way I'll ever use their official app. I'd rather give up the site entirely. RIP Apollo. You were truly the best. [] GoingTibiaOK 1153 points 11 hours ago * (76 children) Same. I actually quit Reddit before, until I found Apollo. I have no problem giving it up; realistically, Reddit has been wearing me down lately, it's all fighting and political shit now. [] LordTopley 504 points 11 hours ago (20 children) Same here. While I know I'll view it occasionally, as so many of my Google questions end up as an answer on Reddit, I shall be doing that logged out and with an ad blocker on. I will provide nothing to Reddit other than an occasional viewer that provides 0 revenue. [] stupid-donkey 85 points 10 hours ago (5 children) Yeah, if /u/iamthatis had been given a reasonable cost and passed it on to users, I would have paid and Reddit would have got some money out of me. Now I'm only going to look at it on the desktop with ad blockers running when I want to look something up. [] MrC4meron 3123 points 11 hours ago (41 children) [] stoplookingusernames 393 points 11 hours ago (19 children) fuck reddit!!! I'm not gonna see my pets in my Apollo app [] theFavbot 79 points 10 hours ago (12 children) Just downloaded the Pixel Pets standalone app [] SethRavenheart 5057 points 11 hours ago (215 children) Heartbreaking, bye reddit [] [deleted] 11 hours ago (38 children) [removed] [] sumgye 107 points 11 hours ago (29 children) It is so sad. What Reddit has done is inexcusable. I only wish I could code and make my own Apollo using my own API key.
[] Nick4753 1853 points 11 hours ago * (48 children) To make it even worse, they lied about someone who, even if they lowered the price, would've been one of their single most significant sources of revenue as a company. Either they're enormously unprofessional and don't know how to run a business, or they had no interest in keeping 3rd party apps alive. Or, most likely, both. There are many reasons to leave after all of this; being malicious to the creator of one of their biggest apps is just one of the more painful ones. [] bodnast 616 points 11 hours ago (12 children) "they had no interest in keeping 3rd party apps alive." I imagine they were never really negotiating in good faith. This rate increase was their way of saying gtfo [] sigtrap 105 points 10 hours ago (4 children) Absolutely. They didn't want to outright say they're killing 3rd party apps, so they just priced them out of business. [] kilobitch 294 points 10 hours ago (14 children) They don't want 3rd party apps. They want to funnel users through their app to collect and sell their data and to show them ads. [] azermanus 1005 points 11 hours ago * (72 children) Yeah, I'm done. Mods, if you actually give a shit about this and want to effect change, don't do a two-day blackout - shut your subreddits the fuck down. Make Reddit come to the table. This isn't just Apollo. This isn't just about third party apps. This is the beginning of the end of Reddit being anything resembling what brought us all here in the first place. Reddit's execs are ready to hack this thing to hell in an attempt to make it a desirable product to sell. Maybe we make that hard, since WE are the fucking product. [] boringhistoryfan 80 points 10 hours ago (35 children) I'm not a head mod for any subreddit. But I do mod a few. It seems to me that reddit could simply replace the mods on subreddits that close down and force them open again. Mods in general aren't a monolithic bunch. I'm sure there are plenty like me who are generally quite clueless about a lot of this API stuff. Or aren't tech savvy enough to be invested. And reddit will have no problems replacing mods till they find someone willing to open a subreddit back up, I'd imagine. I'm just not sure if reddit would actually come to the table, as it were, on the issue. But I hope I'm wrong. [] azermanus 84 points 10 hours ago (20 children) "It seems to me that reddit could simply replace the mods on subreddits that close down and force them open again." I think that would prove difficult. Not that there aren't others that would gladly take over some big subs, but it could be a nightmare for Reddit to try to coordinate and moderate if the biggest subreddits suddenly shut down. Reddit is not Twitter. They don't have thousands of employees. They RELY on volunteer moderators, and if coordinated, users and mods can absolutely burn this site to the ground. [] mutt_rat 107 points 9 hours ago (7 children) "They RELY on volunteer moderators" Let's put this in real terms: they rely on unpaid labor that they're trying to leverage for an IPO so that people like Steve Huffman (spez) and all the venture capitalists that reddit took money from can cash in and fuck them out of their labor.
[] Best-Expert 210 points 11 hours ago (24 children) I hope this begins the downfall of reddit. Hopefully you or someone else creates an equivalent app for Lemmy and makes it mainstream. [] RE_Chief 1321 points 11 hours ago (60 children) Looks like June 30 will also be my last day on Reddit. Thanks for making and improving a world-class Reddit client. [] [deleted] 157 points 10 hours ago (41 children) Why wait? I'm done now [] wickedlizerd 190 points 10 hours ago (36 children) Dude was NOT joking. Good for them [] gizmola 41 points 10 hours ago (11 children) Lmao. I still need some time to say goodbye. [] arthurcarver 2503 points 11 hours ago (38 children) Fucking greedy bastards. Well, I'm sure I speak for everyone using this insanely amazing app, my absolute favourite by a long, long shot, but thank you so much, Christian. Big ups man. [] hariharan_mifsud 472 points 11 hours ago (31 children) It seems the intention since the beginning has been to follow Twitter's move: kill third-party apps, redirect all traffic to themselves, and monetize users' data. [] bodnast 300 points 11 hours ago (23 children) Yep, the 3rd party apps are first. Then old.reddit.com will come next. It's like they want us to use Reddit less [] jlusedude 214 points 11 hours ago (12 children) It will succeed. I'm not going to use Reddit after this. I don't have other social media and don't need this. [] josh_is_lame 953 points 11 hours ago (45 children) obligatory fuck u/spez [] [deleted] 1153 points 11 hours ago (69 children) BRB, deleting my Reddit account(s) [] hooplasandwiches 729 points 10 hours ago (11 children) madlad did it [] IFapToCalamity 246 points 10 hours ago (1 child) Based and (actually) red-pilled [] [deleted] 10 hours ago (28 children) [deleted] [] eggimage 602 points 11 hours ago * (21 children) well. looks like my reddit addiction is getting cured in less than a month. this really sucks. so sorry this amazing app has to end like this. reddit truly is garbage. edit: reddit, fuck you very much [] RSD94 8937 points 11 hours ago * (531 children) fuck reddit and fuck /u/spez. thank you for everything Christian & co. <3 edit: thank you for the awards, but please consider sending that money to Christian & other affected developers as a thank you for their hard work rather than reddit's soulless pockets. shoutout to r/ModCoord & r/RedditAlternatives [] anchoricex 1227 points 11 hours ago (139 children) I'm absolutely tilted at that call recording. Honestly, what a total dickhead. Hyped Christian got them receipts. [] WalkingCloud 649 points 9 hours ago * (85 children) I was already pissed off about this, but /u/spez bold-faced lying on a call to moderators to try and spin things in Reddit's favour is absolutely disgraceful. Thankfully there is a recording, because they were obviously banking on a he-said/she-said situation. It's not a good look for a company that wants to charge developers millions of dollars per month for access. Would you trust a company with a CEO like that to send millions to? I sure as shit wouldn't. Honestly, he should be hounded about this on every announcement post he ever does.
Edit: Holy shit, they're doing an AMA lmao [] Thosepassionfruits 201 points 9 hours ago (26 children) That AMA may very well break EA's record for the most downvoted post in the history of Reddit. [] Stig27 104 points 8 hours ago * (20 children) That was -167k, I'd love to see it break -200k Edit: it was -667k, let's go for a million then! [] Kami_no_mikoto 96 points 8 hours ago * (11 children) -167k? It's more like -668k /u/spez you are gonna make history tomorrow again... in a VERY bad way. [] mrm00r3 45 points 7 hours ago (1 child) INB4 u/spez realizes this stands a good chance of getting him shitcanned. All they had to do was not be dickheads, and goddamn if they didn't fail hard at that. [] urStupidAndIHateYou 74 points 9 hours ago (1 child) Somehow I bet Victoria Taylor gets fired again over this. Yes, spez, we remember this shit even when you sweep it under the rug. You run a massive ONLINE platform; you'd think you'd understand object permanence at this point. [] PC_Master-Race 102 points 9 hours ago (16 children) "Edit: Holy shit they're doing an AMA lmao" oh my god, I cannot wait. I REALLY hope more subs go dark for longer than 48hrs - and I'm sure this AMA will help that [] chauggle 79 points 9 hours ago (9 children) I predict he answers 2 pre-planned questions from plants, then auto-replies "watch rampart" and dips. [] Realtrain 43 points 8 hours ago (8 children) "Edit: Holy shit they're doing an AMA lmao" Get ready for u/spez to start editing people's comments to be softball questions. [] ArnoldVonNuehm 3013 points 10 hours ago (190 children) Hey u/spez, care to comment on the topic, you lying piece of shit? [] ColoneISanders 2436 points 10 hours ago (102 children) If u/spez could read, he'd be very upset right now. [] I_PUNCH_INFANTS 1350 points 10 hours ago (65 children) /u/spez can't hear you, he is hiding in his doomsday bunker till this is over. [] moonsun1987 781 points 10 hours ago (30 children) /u/spez I wonder if /u/spez is considering quietly editing all our comments to say we love /u/spez, because that's what he would do. Just to be clear, fuck /u/spez [] McRibs2024 319 points 10 hours ago (27 children) Nah, he will just edit the post and feel happy again. [] correcthorsestapler 296 points 10 hours ago (14 children) Yeah, u/spez, edit our comments. I fucking dare you. [] DaveedDays 50 points 9 hours ago (3 children) Dude, could u/spez be any fucking stupider??? I remember when the whole comment edit debacle happened, I remember the Ellen Pao debacle, I remember them firing Victoria. Every time I don't think u/spez could stoop lower, he does. Fuck you, Steve. From the bottom of my heart, as a decade+ user of this site, fuck you. Your greed is going to tank this site that I once truly loved and enjoyed. [] ShinShinGogetsuko 375 points 10 hours ago (22 children) Careful, he might stealth edit your comment. He's known to do that. [] Chicho_Procer 545 points 10 hours ago (18 children) He disabled pinging his username because he's a bitch who can't handle the slightest criticism. Still, FUCK Steve Huffman. [] MarijuanaFanatic420 80 points 9 hours ago (6 children) I guarantee you he's read this.
There's no way a major app developer burning all bridges and personally calling out the CEO of reddit doesn't hit his desk. He's just not going to comment on it until the lawyers have a chance. [] Chicho_Procer 48 points 9 hours ago (3 children) Good. Steve, if by any chance you stumble upon this comment: If you love prepping so much, then prepare a traffic cone with lube and stick it up your ass. [] Mylaptopisburningme 36 points 10 hours ago (2 children) Steve Huffman /r/punchablefaces [] R0hanisaurusRex 684 points 10 hours ago (37 children) I heard /u/spez is being questioned by the DOJ about his connection to Jeffrey Epstein. [] CrepeTheRealPancake 330 points 10 hours ago (12 children) He'll probably just edit your comment like he's been known to do. [] BovineJoniHimself69 266 points 10 hours ago (10 children) i love /u/spez and He is thE best and he's totalLy not Pointing a gun at me now [] Teknoeh 50 points 10 hours ago * (9 children) I heard /u/spez is and will always continue to be a gigantic piece of shit whose entire legacy will be running Reddit into the ground. Year after year, one bad business decision after another. Just another Elon, except Spez hasn't done anything even remotely useful for the planet. [] elkend 58 points 10 hours ago (7 children) Every post we post on Reddit should now end with "fuck /u/spez" like a forum signature. fuck /u/spez [] [deleted] 275 points 11 hours ago (7 children) Deleting my Reddit account now. I could wait till June 30, but it feels wrong to keep using the site now that we know for sure what they're doing to Apollo. It's been a great run, and thanks, Christian, for all the work over the years. Will miss the communities a lot. Stay awesome guys [] KaRue3 46 points 10 hours ago (0 children) So long, best to you! [] cocoiadrop_ 3244 points 11 hours ago (153 children) "I don't use the app so I'll give you the best answer I can -- he does scraping so that he can deliver notifications faster but has done NO EFFORT to be a good citizen of the internet." spez joins the r/LeopardsAteMyFace alumni. [] Runelite_HD 1220 points 10 hours ago * (36 children) LMAO spez. You're telling me the guy in this interview isn't a good citizen of the Internet? Think your fedora is on too tight, m8. [] geraldisking 307 points 9 hours ago * (23 children) Remember the article where Spez fantasized about a future dystopian world where he would be one of the leaders? Guy's a nut job. Edit: Here's the article for those asking. https://www.newyorker.com/magazine/2017/01/30/doomsday-prep-for-the-super-rich Edit edit: the part that matters, from a very, very long but good read. TLDR: Huffman has calculated that in the event of a disaster he would seek out some form of community: "Being around other people is a good thing. I also have this somewhat egotistical view that I'm a pretty good leader. I will probably be in charge, or at least not a slave, when push comes to shove." [] AhmCha 126 points 8 hours ago (9 children) Lol, he'd drink bad water and shit himself dead within days [] geraldisking 48 points 8 hours ago (5 children) I guess he fancies himself as a survivalist and someone people would follow. It's probably some of the most neckbeard shit I've ever read.
[] present_absence 54 points 10 hours ago (10 children) Being a bad citizen of the internet means using the API normally, I guess, lol [] aishik-10x 39 points 7 hours ago (3 children) Steve Huffman is NOBODY who should be talking about being a good citizen of the Internet. This is the same fucker who secretly edited other people's comments which criticized him. Nobody even knew that was possible before this. The CEO of Reddit being a prime loser and powertripper. [] Kdegeek 536 points 11 hours ago * (29 children) Welp. I guess I will no longer be using Reddit. My cakeday is October 31st, 2012, so heading towards 11 years. June 30th will be my last sign-in if this goes through. I love this site and the community on it, but I cannot support Reddit leadership on this. u/spez, the way you and the rest of Reddit's leadership team have handled this has been incredibly cruel, stupid, and tone-deaf. If this is how you want Reddit to be run, then you don't want me. *edit to add: I also signed up for the PixelPals subscription. Not really my thing, but Christian has it turned off for me to up my Apollo sub to a year (so I could refuse the refund, lol). I know this is gonna be a lot of money out of his pocket for something that wasn't his fault. If you like the idea of pixellated pets or want to help a bit in offsetting his sudden $250k out-of-pocket bill, please consider downloading and subscribing :) [] shlem90 1149 points 11 hours ago (139 children) I left Twitter when they killed third-party apps like Tweetbot. I will do the same here. Thanks for making a great app and making this site more user-friendly than Reddit ever could. [] jack0rias 1144 points 11 hours ago (43 children) Fuck /u/spez and Reddit in general. They're insane if they think their plan is going to work out. Furthermore, fuck /u/spez even more. [] doctor_who_17 154 points 10 hours ago (19 children) Fuck /u/spez. Didn't think it was possible to be an even bigger cunt [] GBLoveSZN 1961 points 11 hours ago (129 children) /u/spez is an absolute piece of shit. Hope there are bad things coming his way. Do your job, karma. [] GoatboyTheShampooer 1088 points 10 hours ago * (100 children) u/spez needs to resign. He continually fucks up over and over; and now this. EDIT: Annnd he's shitting himself: https://old.reddit.com/r/reddit/comments/144ho2x/join_our_ceo_tomorrow_to_discuss_the_api [] PinStacker 540 points 10 hours ago (40 children) Dude was caught changing messages and survived. Hence people recognizing this place is a time-waste cesspool. I am glad it's going to blow up in their faces trying to monetize it. [] MATHIL_IS_MY_DADDY 119 points 10 hours ago (4 children) lmao i remember that. the t_d fiasco was crazy. admin fiddling with messages was a wtf moment for me [] CheckOutMyPokemans 73 points 9 hours ago (11 children) Lmao, doesn't even give a time for the AMA. Amazing stuff.
[] 390TrainsOfficial 79 points 9 hours ago (7 children) It'll be during the period when Reddit has fewer visitors, and the admins will be censoring the hell out of it, because how dare the precious u/spez face the reality: he's made a terrible decision, destroyed one of the most popular third-party Reddit clients, and caused a planned temporary shutdown of a massive number of subreddits (with several more subreddits closing permanently). [] Yorktown2016 472 points 11 hours ago (11 children) /u/spez, you sound like a complete asshole and I will leave Reddit after the shutdown date. Unbelievable [] Dead_Politician 228 points 11 hours ago (13 children) Holy shit. End of an era. I hate to read this. [] WishMyHusbandHadAJar 233 points 11 hours ago (3 children) Fuck you, u/spez [] Cheechers23 222 points 11 hours ago (4 children) Man, this has really killed my mood. Reddit really is trying to villainize Apollo and it's disgusting [] Hmesce42 828 points 11 hours ago (18 children) looks like u/spez is taking this personal! will be my last weekend using the website. [] ErikElevenHag 59 points 10 hours ago (2 children) u/spez is Tobey to my Michael [] vxx 45 points 10 hours ago * (4 children) He's lying if he thinks it might help him. He has done it to me before. Even sworn. He's the type that would fake crying to dodge tough questions. He's unfit to lead a company, and if reddit is to survive, he needs to step down, in my humble opinion. I doubt he's been acting in the best interest of the company at any point as a CEO. [] iamstevesteyn 962 points 11 hours ago (9 children) My day is ruined. [] sigtrap 382 points 10 hours ago (3 children) And my disappointment is immeasurable [] Car333 1458 points 11 hours ago (32 children) Very sorry to hear this. I guess that means my Reddit consumption will go to 0 now ¯\_(ツ)_/¯ [] bodnast 367 points 11 hours ago (15 children) Yeah, my mobile reddit consumption will be zero, just like when Twitter killed Tweetbot. Once they kill old.reddit.com I'll be done with reddit on desktop. So frustrating [] anakinfan8 190 points 11 hours ago (1 child) Well, this is a post I had hoped to never read. I'm gobsmacked that the Reddit head honchos can be this inept. Christian, thanks for literally everything you've done, man. [] tao_jones 382 points 11 hours ago (5 children) Just tweeted this to you, but I'll say it again here: Apollo was truly one of the greatest apps ever made for iOS. Unparalleled in quality. I will be so incredibly sorry to see it go. That spot on my main Home Screen will be hard to fill. Thank you for everything! [] ckelley87 707 points 11 hours ago (21 children) I am so sorry to read this. :( Apollo is the only way I use Reddit on my mobile devices. Get bent, /u/spez [] SuperbBit 10.4k points 11 hours ago * (254 children) Oh no. I am so sorry for you and all of the fans of Apollo! Thank you for all your hard work on the Apollo app. This really was one of the best apps I've ever used. Take care! And please don't refund. Let Christian keep what he deserves! :) Edit: please stop giving awards! The money will go to Reddit. They don't deserve that!
[] Best-Expert 4484 points 11 hours ago (214 children) People should stop giving awards to this post. Everyone, please stop giving reddit your money. [] sumgye 1400 points 11 hours ago * (102 children) Seriously. We should all pitch in and buy Apollo and make it open source so we can use our own API keys. Or, if it's not serving /u/iamthatis anymore and he isn't making his living on it anyway, could he open source it for free? [] andylshort1 729 points 10 hours ago * (62 children) It's only a matter of time before they would take that away from you too. They don't care about third-party applications and would sooner price you out or end support for them to get you to use their own shitty, slow, buggy mess of an app. Don't give them the satisfaction. [] Chewcocca 54 points 10 hours ago (8 children) People always assume that awards = money. I have thousands of coins on my account that I didn't pay for. You used to get coins every time someone awarded you. I haven't used them because I think awards are pretty stupid, but they don't automatically mean money is being spent. [] AmishAvenger 106 points 10 hours ago * (12 children) This is insane. What they're doing is bad enough. They acted in bad faith the entire time, and now they're outright lying and attempting to slander someone. How is it that they're unable to comprehend that their entire site, their entire business model, revolves around the users? No users, no Reddit. No users, no IPO. I say all the subreddits participating in the two-day blackout period just permanently shut down. [] 7plysMaple 156 points 11 hours ago (5 children) Thanks for the great UX and your dedication. I guess I'll start reading books instead of mindless scrolling. [] soaringSpriggan 156 points 11 hours ago * (12 children) Well, that does it. I'll be deleting all my reddit content on each account and then deleting each account, so that I leave nothing behind on the site. I recommend each 3rd party app user do the same. Fuck Reddit, it's time to move on to something else and maybe overall spend less time looking at screens. I suppose this will be my last comment. I'll leave my content up until the Apollo blackout. One last edit: here is how you can delete your content: https://github.com/j0be/PowerDeleteSuite Thanks for everything, Christian. As a fellow software developer, I can only imagine how shitty this is for you. If there's somewhere like Mastodon that I can follow your future work, please link me to it. I am interested to see what you get up to in the future. One final edit (this time fr): I'll be on https://kbin.social/ instead for my Reddit fix. The fediverse is the future. I hope to see everyone here somewhere there. [] aruke- 656 points 11 hours ago (31 children) Sad. Investor greed at it again. [] spasticpat 139 points 10 hours ago (12 children) Just a reminder that /u/spez edits comments that hurt his feelings: https://www.theverge.com/2016/11/23/13739026/reddit-ceo-steve-huffman-edit-comments [] JustAnotherArchivist 602 points 11 hours ago * (38 children) Two minor things: https://github.com/christianselig/apollo-backend and https://christianselig.com/apollo-end/remaining-icons.png are currently (17:32 UTC) 404s. Edit: The GitHub repo started working at around 18:42 UTC. The icons are still down as of 19:05.
Edit: The icons have also started working as of about 19:08 UTC. Thanks for the fix, Christian! [] ownage516 203 points 10 hours ago (23 children) Really disappointed in spez. Hopefully that IPO money fills the part where his soul was [] Lenininy 606 points 11 hours ago (48 children) The subreddits should go dark indefinitely until they walk back everything. [] yeahburyme 243 points 11 hours ago (5 children) Subreddits that have alternative communities should allow only top-level posts as link posts from those communities, and no comments on reddit. Helps people move over. [] coolaaron88 119 points 11 hours ago (0 children) Wow, the fucking end of an era. Never thought it would go down like this, but thank you for being as transparent as you've been through all of this. I know it's been a back-and-forth battle trying to fight the good fight, but I understand that you can only do so much. Thank you, Christian, for everything, you've been an amazing developer. [] LordTopley 241 points 11 hours ago (7 children) Thank you for your efforts over the past 8 years. You inspired me to learn Swift and build apps. I made 2 apps that went into the store. While they aren't there anymore, as I don't have time (started my own company and also became a Dad), I was immensely proud of myself for making those apps. Making those apps reignited my enjoyment for making things, and I restarted my hobby of making websites. Today I run my own freelance web design company, and I partly owe that to being inspired by your work. Good luck in the future, and I hope to see new apps from you in the future. [] iamthatis Apollo Developer [ S ] 71 points 2 hours ago (1 child) You're crushing it, congratulations and well done. It's sometimes kind of isolating being an indie developer, but comments like these really make you realize how we can all affect one another in really cool ways. Thanks for commenting :) [] _jamieb 112 points 11 hours ago (3 children) Fuck, this is sad to read. Two of my favourite apps (Tweetbot was the other) going in such a short space of time is a kick in the balls. I've not read the comments yet, but if by some miracle Reddit did a 180 after the blackouts on June 12th, would you keep Apollo going? I know if I was in your shoes I wouldn't, as it seems the relationship has become a bit toxic thanks to their actions, and I'd be constantly worried about the rug being pulled from under me again. [] pouwerkerk 114 points 11 hours ago (1 child) Thank you /u/iamthatis for all the incredible work you put into Apollo, for showing what independent iOS developers are capable of building, and for so clearly explaining why the Reddit API changes are unsustainable for businesses that use them. I have loved using Apollo and will miss it dearly. [] Kirihuna 112 points 10 hours ago (3 children) I refuse to refund. You can keep my money and Reddit can lose my money. I'll find a new platform for sports, porn, programming, and other subjects. Thank you for your hard work over these years. Your app has helped me in many areas of my life, from mental health (being able to access subreddits for it without having to use old.reddit on Safari or using the shitty app) to changing careers and everything in between. And wasting a shitload of my time that I could have been productive, lol. I hope you find peace in the next step of your journey.
[] Ninjaguy5700 103 points 11 hours ago (1 child) I cannot express how sorry I feel for you and all of the hard work you put into Apollo. Thank you for creating the best third-party Reddit app! [] xezrunner 108 points 11 hours ago (2 children) It is incredibly heart-breaking to see that call being misinterpreted by the CEO, who then immediately apologized, yet continued to frame you as the one that threatened them... Reddit has become an incredibly hostile company in my eyes, and no doubt them going IPO is probably the reason for their latest actions. This is simply a clear-cut example of what happens when a company shifts their focus from the product onto money. I'm really sorry to see Apollo disappear, especially because I was thinking of subscribing the past few days, even if purely to support the development of the app, since it provides so much of a better experience to me on mobile. [] ulkesh 305 points 10 hours ago (16 children) If Reddit allows this to continue, my 15-year account will be deleted and I will never return. Fuck /u/spez and fuck anyone at Reddit who is allowing this to happen. I helped Reddit become what it is today, as did millions of other users. Christian's app also helped Reddit become what it is today. Reddit will die the death it deserves because of this, and good riddance. Make no mistake, charging for the API isn't an issue. Egregiously charging for the API is, and the timeline they have pushed is. They are greedy pieces of shit, plain and simple. They could have handled this so much better with logic and reason. Instead they decided to screw everyone over. What a way to completely ruin your brand, Reddit. I feel bad for any decent people working there who will be losing their jobs from all the lost revenue this will clearly incur. But in the end they will be better off not working anymore for complete asshats. Thank you, Christian, for the best application iOS has ever seen. I wish you well. [] JusticeBonerOfTyr 383 points 11 hours ago (9 children) This is the best app by far for Reddit. I understand this decision, but it freakin' sucks that it had to come to this. Bullshit. Reddit wouldn't be anything or make any money if it wasn't for the users creating and sharing content. Reddit is also going to be much worse now that moderators aren't going to have the proper tools to moderate, since the official app is such dog shit. [] DragonDropTechnology 78 points 11 hours ago (6 children) This was my assumption, that Reddit tolerated (even supported) third party apps because they generate a significant amount of FREE content. It's similar to the no-annual-fee credit cards that can offer cashback. Credit card companies charge merchants; Reddit can serve ads (on the aforementioned FREE content) to some large percentage of users. Fuck these venture capitalistic scum. [] gforce216 94 points 11 hours ago (1 child) This was a wonderful app and the reason I even got into Reddit. I wish you best of luck in the future! And just as a light-hearted joke in this time: Any update on when we'll be getting that iPad update? [] SecuredMirrors 91 points 10 hours ago (1 child) And to the surprise of absolutely fucking nobody, Reddit took a good thing, ruined it, then made it worse by trying to make u/iamthatis out to be the bad guy. Fuck u/spez and fuck Reddit.
[] redburn22 92 points 10 hours ago (2 children) Reddit should address these serious allegations. This post makes it very clear that a lot of what they said has been a lie, assuming that the recordings are authentic, which I have absolutely no reason to doubt. The Apollo developer has done such a good job of demonstrating how disingenuous they've been throughout this process. If Reddit had openly declared a ban on third-party apps, despite the negative impacts on this developer and their customers, I could respect that decision. Companies sometimes make such moves, and Reddit has the right to explore profit-making avenues, including consolidating usage to their app to potentially enhance user experience. What I can't accept is how they have chosen to lie to their customers (and defame their partners) to avoid PR blowback against their API changes. This not only shows a total disrespect to their customers but also shows how incompetent they are. When implementing an unpopular policy, a well-run company considers amending the policies to be less disagreeable, or they just make the change openly and transparently and accept and weather the backlash. They don't attempt to dodge criticism by lying. And that's not even a moral thing. How incompetent are they to not realize that these public lies would be exposed? To me it's not just wrong but also weak, given the transparency of the lie. Worse still is Reddit's blatant defamation of the Apollo developer, which seems to have been a calculated move to defend their narrative. Such behavior, whether it was intentional or it was a repeated mistake that they've chosen not to correct, is inexcusable. As a result, I've decided to limit my Reddit usage, resorting to searching for information with an ad blocker turned on. I'll reconsider my stance when they confront these claims, reverse these policies, or offer a truthful explanation of their actions. [] Mudkip-Mudkip-Mudkip 94 points 9 hours ago (2 children) I don't use Apollo, but I respect your transparency. Considering the state of Reddit right now, I took the initiative to archive your post text in case the admins decide to delete it: [] DinkyDonahue 180 points 11 hours ago (13 children) COMPLETE BOYCOTT OF BUYING COINS must occur. Not a one from anyone going forward. When you buy an award, you award greed. [] MatchkeY 73 points 11 hours ago (1 child) Fair play. Thanks for all the hard work. Sorry it ended this way for you. [] rafaqueque 72 points 10 hours ago (1 child) Thank you for the amazing work, /u/iamthatis . I wish you success. Fuck Reddit, fuck /u/spez, and fuck every greedy investor. Please just go fuck yourself, honestly. [] toolman10 726 points 11 hours ago (41 children) Also, as an Apollo Ultra Lifetime member, I will not be requesting a refund. If anyone else does, they are not helping this situation for Christian. What a sad fucking day. [] matttopotamus 160 points 10 hours ago (9 children) Definitely no refund needed. I'm in the camp that would have actually paid the monthly fee to continue using this app. [] mrhindustan 57 points 9 hours ago (6 children) I wish I could input my own API access token to continue. I bought Ultra Lifetime, but wish I could just pay Christian monthly, as Apollo is just that good.
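Several commenters in this stretch of the thread wish they could run a client like Apollo on their own personal API key or access token. For a rough sense of what that would involve, here is a minimal sketch in Python of authenticating against Reddit's public OAuth endpoints with a personal "script"-type app credential. The client ID, secret, account values, and user agent below are all placeholders, and a real client would also need token refresh, rate-limit handling, and every endpoint a full app touches; this sketches only the auth flow, not how Apollo itself is built.

import requests

# All values here are hypothetical placeholders; a personal "script" app
# created at https://www.reddit.com/prefs/apps provides the ID and secret.
CLIENT_ID = "your-client-id"
CLIENT_SECRET = "your-client-secret"
USERNAME = "your-username"
PASSWORD = "your-password"
USER_AGENT = "ios:example.personal-client:v0.1 (by /u/your-username)"

# Step 1: exchange account credentials for a bearer token using the
# password grant, which Reddit permits for personal script apps.
resp = requests.post(
    "https://www.reddit.com/api/v1/access_token",
    auth=(CLIENT_ID, CLIENT_SECRET),  # HTTP Basic auth with the app credential
    data={"grant_type": "password", "username": USERNAME, "password": PASSWORD},
    headers={"User-Agent": USER_AGENT},
)
resp.raise_for_status()
token = resp.json()["access_token"]

# Step 2: authenticated requests then go to oauth.reddit.com with the token.
front_page = requests.get(
    "https://oauth.reddit.com/best",
    headers={"Authorization": f"bearer {token}", "User-Agent": USER_AGENT},
    params={"limit": 5},
)
front_page.raise_for_status()
for post in front_page.json()["data"]["children"]:
    print(post["data"]["subreddit"], "-", post["data"]["title"])

Whether Reddit's terms and rate limits would allow a shipped app to run on thousands of individual user keys like this is exactly the open question the commenters raise.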
[] Eminem_McFlurry 40 points 9 hours ago (4 children) Yeah, if Reddit was legitimately interested in charging reasonable amounts for API usage, the price would be per user and Apollo would just act as a middleman. If I go over some threshold, charge me like $5/month or something. [] Zachrdoodle 67 points 11 hours ago (5 children) The Reddit API changes seem more and more like they're just obviously and blatantly trying to exterminate 3rd party apps. Probably so they can maximize ad revenue and app traffic on the official app? :/ [] RobertoPaulson 67 points 10 hours ago (3 children) I'd respect them a lot more if they just came out and said they were killing API access for third party apps for financial reasons. Killing them via pricing them out and then playing the victim is just fucking slimy. [] toolman10 69 points 11 hours ago (1 child) Apollo is one of the best apps I have on my phone, Christian. One I use several times per day--it just works and it's a downright beautiful app. I understand why you are shutting it down (thanks for the huge post above) but it's a travesty. Greed is taking one of the best social media/forum apps ever created and killing it dead. The Reddit app is horrific. I won't use it. The smart move would have been to buy Apollo and let you continue development of it. June 30: RIP Reddit mobile. [] dan-80 295 points 11 hours ago * (37 children) "Will you build a competitor? Move to one of the existing alternatives? I've received so many messages of kind people offering to work with me to build a competitor to Reddit, and while I'm very flattered, that's not something I'm interested in doing. I'm a product guy, I like building fun apps for people to use, and I'm just not personally interested in something more managerial. These last several months have also been incredibly exhausting and mentally draining, I don't have it in me to engage in something so enormous." So sorry to hear that. I hope that one day you will consider a client for a Reddit alternative like Kbin or Lemmy . So long, and thank you for the last 6 years. [] intercommie 57 points 11 hours ago (0 children) There's nothing to say other than thank you. You've done a great job with the app. [] HippolyteClio 55 points 11 hours ago * (0 children) Reddit: "I think one thing that we have tried to be very, very, very intentional about is we are not Elon, we're not trying to be that, we're not trying to go down that same path. [...] What we are trying to do is just use usage-based pricing that will hopefully be very transparent to you and very clear to you. Or we're not trying to go down the same path that you may have seen some of our other peers go down." Sounds like we know who to compare them to even more. [] djaiss 199 points 11 hours ago (3 children) NOO GOD! NO. GOD. PLEASE. NO. NO!!! NO!!! NOOOOOO!!! [] Spartan-417 144 points 11 hours ago (7 children) Hey Christian, people have floated the idea of setting up Apollo to use personal API keys. I wish we could get that kind of thing set up so we can keep using this, the best client for Reddit, full stop. But I have absolutely no idea how to implement such a thing into the app, or if it is even feasible. In any case, good luck for the future. Your skill in making this should serve you well in finding future jobs. [] PazuzuOvBabel 49 points 11 hours ago (3 children) Am sad.
Never thought I would feel this sad about losing an app. Thanks for the ride, u/iamthatis [] Uber_Nerd 89 points 11 hours ago * (3 children) Thank you for everything. Fuck Reddit! Edit: Sucks too, since Apple just featured Apollo in the WWDC presentation. [] wosmo 49 points 9 hours ago (2 children) Not for the first time, either. [] bedstuybk 51 points 11 hours ago * (0 children) I've been a beta tester for Apollo since the start, and I've always loved your app, Christian. I can't imagine my day without it, but I'll have to now. Thanks for the years of tireless service, and I hope we meet again. So long, and thanks for all the fish. Edit: as for Reddit: I don't know who you are anymore. I think it's time we see other people. [] MudhornsRefuge 47 points 10 hours ago (4 children) Shit, I would be willing to pay up to $20 a month if it meant I could keep using Apollo. As a recovering addict, there are a few subs that have been very beneficial and helpful to me. But I can't continue to use them in good conscience after all this. Not to mention the Reddit app crawls a fuckload of your private info. July 1st I'll be deleting my account and all my throwaways. Some being almost 10 years old. Sad day. [] Silmar9 43 points 10 hours ago (0 children) fuck you u/spez [] pm326 44 points 10 hours ago (0 children) What a narcissistic piece of dogshit you are, /u/spez [] tyrannosaurus_r 44 points 10 hours ago (1 child) Reddit admins, but particularly /u/spez: I know you're not going to see this, but it makes me feel better to say, and will hopefully inspire others. This could have just been a shitty business move, but the dirty games here are just genuinely remarkable for how uniquely shitty they are. Huffman, you're a massive cunt, and I look forward to your IPO failing hilariously as reddit plunges from the last successful Web 2.0 social media site to a failed competitor to TikTok. Thanks for embodying enshittification, and remember that there are so very many people who will actively celebrate your failure-- I among them. If this is ever a just society, I look forward to your suffering the same fate you're inflicting on the Apollo team. [] _your_face 39 points 10 hours ago (3 children) u/spez has always been a piece of shit. Confirmed when I met him in person. 100% turd burglar [] _garyboy 43 points 9 hours ago * (1 child) Just wanted to throw in my experience that might get lost in the fray of this popular thread: I am a product manager for a mobile app now, and I 100% entered this industry because of how much Apollo inspired me years ago when I first downloaded it, Christian. Apollo is the essence of everything a native iOS app should be. It integrates seamlessly into every experience on my phone, but only as much as I want it to. It empowers me to configure it with as much (or as little!) granularity as I want. It follows HIG beautifully and provides an experience that feels truly native. It's full of delightful flourishes that make it a joy to use (mourning the death of my Apollo pet widgets right now most of all). It's the gold standard, without a doubt. Not to mention the fact that you've taken so much time to listen to your users and break all of your decisions down in such detail that we all trust the choices you make for the app.
I think it's often underestimated by folks outside of this world just how hard it is to get a user base to trust an app developer. It's an amazing feat that you've pulled off for years. Thanks for the amazing work, and for truly, genuinely being my product idol, dude. I hope to one day make an app that's half as good as what you've done here. [] catsupatree 76 points 10 hours ago (3 children) Fuck u/spez. Go ahead and edit my comment [] [deleted] 11 hours ago (28 children) [removed] [] iamthatis Apollo Developer [ S M ] 457 points 9 hours ago (19 children) While I appreciate the passion about Apollo, I don't want this to escalate into threats of violence against anyone (implied or otherwise, and even if joking), so I'm removing this. Let's please steer clear of that, everyone. [] [deleted] 8 hours ago (3 children) [removed] [] [deleted] 8 hours ago (1 child) [removed] [] dingdongfootballl 49 points 10 hours ago (0 children) Eat shit, u/spez [] PurpleEsskay 74 points 10 hours ago (3 children) Looks like June 30th is my last day on reddit then. Fuck you, /u/spez. And a special shoutout to the reddit admins who are complicit in killing the platform. Make no mistake, your IPO is now utterly fucked. /u/reddit /u/Sporkicide /u/Acidtwist /u/sodypop /u/redtaboo /u/Chtorrr /u/skwitz /u/sticksfigured /u/TheOpusCroakus /u/enthusiastic-potato /u/Significant-Otters /u/agoldenzebra /u/lift_ticket83 /u/redditcma /u/Goldennuggets-3000 /u/SingShredCode /u/CorrectScale /u/CookiesNomNom /u/EilenTheCrow /u/SpookyQubit /u/JabroniRevanchism /u/RyeCheww /u/Why_So_Sagittarius /u/PossibleCrit /u/ashtena7 /u/cerastes /u/Cosmic-Catz /u/flyonaWarhol /u/Cardioclaw /u/BurritoJusticeLeague /u/batballoon
null
BAD
Appalachian Apple hunter who rescued 1k lost varieties (2021) (atlasobscura.com) As Tom Brown leads a pair of young aspiring homesteaders through his home apple orchard in Clemmons, North Carolina, he gestures at clusters of maturing trees. A retired chemical engineer, the 79-year-old lists varieties and pauses to tell occasional stories. Unfamiliar names such as Black Winesap, Candy Stripe, Royal Lemon, Rabun Bald, Yellow Bellflower, and Night Dropper pair with tales that seem plucked from pomological lore. Take the Junaluska apple. Legend has it the variety was standardized by Cherokee Indians in the Smoky Mountains more than two centuries ago and named after its greatest patron, an early-19th-century chief. Old-time orchardists say the apple was once a Southern favorite but disappeared around 1900. Brown started hunting for it in 2001 after discovering references in an Antebellum-era orchard catalog from Franklin, North Carolina. Detective work helped him locate the rural orchard, which closed in 1859. Next he enlisted a local hobby-orchardist and mailman as a guide. The two spent days knocking door-to-door, asking about old apple trees. Eventually an elderly woman led them to the remains of a mountain orchard that'd long since been swallowed by forest. Brown returned during fruiting season and used historic records to identify a single gnarled Junaluska tree. He clipped scionwood for his new conservation orchard and set about reintroducing the apple to the world. Brown has dozens of apple-hunting tales like these from the nearly 25 years he's spent searching for Appalachia's lost heirloom apples. To date he has reclaimed about 1,200 varieties, and his two-acre orchard, Heritage Apples, contains 700 of the rarest. Most haven't been sold commercially for a century or more; some were cloned from the last known trees of their kind. "These apples belong to the [foodways] of my grandparents' and great-grandparents' generations," says Brown, who was raised in western North Carolina. Thousands of varieties probably still exist, but saving them is a race against time. The people who hold clues about their locations are typically in their 80s or 90s. Each year trees are lost to storms, development, beetles, and blights. Brown has devoted his later years to beating the clock. Ironically, Brown didn't know what a heritage apple was until he stumbled on them at a historic farmers market in 1998. "There was a little stand with a bunch of strange-looking apples laid out in baskets," says Brown. Colors ranged from bright green to yellow-streaked sunset pink and purplish black. Some were plum-sized, others as big as softballs. They had names like Bitter Buckingham, White Winter Jon, Arkansas Black, and Billy Sparks Sweetening. Tasting trays brought a smorgasbord of flavors and textures. Brown tasted Jonathans that had rosé wine-colored flesh. Rusty Coats were soft like pears and sweet like honey. The mammoth Twenty Ounce was crisp with a tart, peachy finish. Semi-firm Etter's Gold brought peony bouquets and grape flavors. Grimes Golden were sweet with a hint of nutmeg and white pepper. Brown's enthusiasm led to a conversation with the vendor, late orchardist Maurice Marshall. The varieties of apples he was selling were standardized in the 1700s and 1800s and had vanished from commercial circulation by 1950. Marshall had obtained most of the scionwood for them from elderly mountain homesteaders. But two or three varieties came from clippings taken during apple-hunting expeditions at the ruins of old orchards.
What's more, hundreds of lost apples could likely be reclaimed at similar sites throughout Appalachia. "That part stayed with me," says Brown. "I kept thinking: How neat would it be to find an apple nobody's tasted in 50 or 100 years?" Then it struck him: Had so many interesting, great-tasting fruits really just disappeared? It seemed impossible. Brown threw himself into researching the history of Appalachia's heritage apples. What he learned was awe-inspiring and devastating. Commercial orchards in the U.S. grew about 14,000 unique apple varieties in 1905, and most of them could be found in Appalachia, says William Kerrigan, author of Johnny Appleseed and the American Orchard and a professor of American history at Muskingum University. The diversity was rooted in early colonial precautions. Water wasn't always safe to drink, and "episodes of sickness from contaminated water gave that substance a questionable reputation," says Kerrigan. Fermented beverages were the go-to alternative. Importing wine was expensive, and native pests killed Old World grapes. Apple orchards were easier to maintain and more utilitarian than growing fields of barley for beer, so cider became the colonists' choice beverage. By the mid-1700s, virtually every East Coast farm and homestead had an apple orchard. The settlement of Virginia's Blue Ridge Mountain region spurred an innovation boom. High-but-not-too-high elevations, hot, humid summers, and rich, deep soil nurtured by consistently rainy winters produced ideal growing conditions, writes Kerrigan in Johnny Appleseed and the American Orchard . By the early 1800s, the Shenandoah Valley had become the top U.S. growing region. Commercial orchards were proliferating throughout eastern Appalachia. Experimentation was relentless. Growers did things like cross tannin-rich indigenous crabapples with Old World cider staples, writes Kerrigan. The efforts produced new varieties such as the Taliaferro, which Thomas Jefferson championed as the world's greatest cider apple. But apple varieties were cultivated for more than cider. For Appalachian farmers and homesteaders, a diverse orchard was fundamental to survival and good eating alike, says Brown. Residents were expert gardeners and developed varieties that matured at different intervals, tasted unique, and catered to specific culinary functions. "The goal was to be able to pick fresh apples from June to November and have a diverse supply of fruit throughout the year," says Brown. Thick-skinned, late-ripening varieties provided wintertime pomaceous treats. Others were tweaked for applications such as frying, baking, dehydrating, making vinegar, and finishing livestock. "Apples were the garden's crown jewels," says Appalachian Food Summit co-founder and renowned chef Travis Milton. "People took pride in having something unique to brag about to their neighbors." But Appalachian traditions around heritage apples were eroded and ultimately destroyed by urban migration, factory farming, and corporatized food systems. Conglomerates negotiated national contracts and switched to apples that matured fast and were suited to long-distance shipping. By 1950, most smaller orchards had been forced out of business; Milton's grandfather, for instance, sold the family's Wise County, Virginia, orchard to a coal company to save his cattle farm. Gardens began to disappear. By the late 1990s, U.S. commercial orchards grew fewer than 100 apple varieties, and just 11 of them accounted for 90 percent of grocery-store sales. Experts estimated 11,000 heirloom varieties had gone extinct.
"It upset me to learn about that," says Brown. "Two hundred fifty years of culinary culture had been squandered. These were foods that people had once cared about deeply, that'd been central to their lives. It felt wrong to just let them die." But if Marshall was right, some of Appalachia's heritage apples could still be recovered. And Brown was looking for a retirement hobby. His experience as a scientist would bring calculated organization to searches. The project would let him explore and learn more about the history of rural Appalachian communities. Brown realized he'd stumbled onto what could only be described as a calling. Becoming the world's most accomplished heirloom apple hunter brought a steep learning curve. Marshall introduced Brown to a network of aging small-scale heritage orchardists (none kept more than 20 varieties) who taught him the basics of identifying, cloning, grafting, and maintaining trees. He discussed lost apple varieties and made lists of names, including characteristics, former growing locations, and rumors of where trees still existed. Connecting with regional historical societies yielded old orchard maps, fruit-grower association newsletters, and names of former owners and workers. Pomological historians helped Brown track down vintage orchard catalogs with drawings and descriptions for thousands of lost varieties. His early search-and-rescue attempts centered around former hotbeds of production such as North Carolina's Brushy Mountains. The two-county region was home to more than 100 commercial orchards in 1900. Brown advertised in area newspapers seeking information about old apple trees. "The response was exciting but also kind of [a reality check]," says Brown. He fielded dozens of calls, but few brought concrete information. Most callers were in their 80s and 90s, says Brown, and told childhood stories where "old man such-and-such had a tree with 20 different types of apples grafted onto it." "Up to then I hadn't grasped how much detective work [this] was going to require," says Brown. Years of ad-hoc efforts helped him develop central strategies for hunts. First he gathers clues about trees' possible whereabouts. For instance, discovering the address of someone's great-grandparents who once kept a large orchard can pinpoint a rural community where special trees may still exist. Brown then draws a radius around the property and canvasses nearby homes. He stops at local businesses to make inquiries. "When I explain what I'm doing, most people are really receptive," says Brown. For instance, a conversation with an 80-year-old at a country store in northeast Georgia led Brown to amateur orchardist Johnny Crawford. Crawford put Brown in touch with elders in the Speed family, who ultimately helped him locate a treasure trove of heirlooms in a rural area, including the Royal Lemon, Neverfail, Candy Stripe, and Black Winesap. When Brown finds a tree, he takes clippings and returns during fruiting season to identify them. He compares leaves and apples to catalog entries and uses photos to correspond with experts for further verification. Brown drives about 30,000-plus miles a year and devotes around three days a week to apple-hunting. His partnerships with municipalities and non-profits such as the Southern Foodways Alliance help establish reclaimed varieties at additional orchards and ensure their survival. "Saving an apple from the brink of extinction is a miraculous feeling," says Brown. "It's incredibly rewarding, and incredibly addictive!"
Today Browns orchard is filled with clones of trees recovered in Georgia South Carolina North Carolina Kentucky Tennessee Virginia West Virginia Maryland and Pennsylvania. He divides time between apple-hunting tending trees donating scionwood to nonprofit heritage orchards and selling about 1000 saplings annually. Browns work has been commended by conservationists and culinary professionals alike. Chefs like Travis Milton are stoked to have hundreds of new flavors to experiment with. Craft cidermakers say reintroduced heirlooms are inspiring a cider renaissance. Tom has helped redefine whats possible says Foggy Ridge Cider owner Diane Flynt who won a James Beard Foundation award in 2018. She says heirlooms such as Hewes Virginia Crab and Arkansas Black are for Appalachia what noble grape varieties like Merlot or Cabernet Sauvignon are for Bordeaux. Brown is thrilled the apples are being put to good use. But hes quick to note that many still need saving. And theyre getting harder to find. It takes me probably 20-30 times more work and a lot more driving to locate one new tree says Brown. But that doesnt deter him. Brown has come to think of restoring Appalachias heritage apples as his true lifes work. While he hopes to recover another 100 varieties or more in his lifetime experiencing just one more find would be reward enough. Gastro Obscura covers the worlds most wondrous food and drink. Sign up for our email delivered twice a week . We depend on ad revenue to craft and curate stories about the worlds hidden wonders. Consider supporting our work by becoming a member for as little as $5 a month. Follow us on Twitter to get the latest on the world's hidden wonders. Like us on Facebook to get the latest on the world's hidden wonders.
1,081
BAD
Apple 'created decoy labor group' to derail unionization (theregister.com)
1,101
BAD
Apple, Atari, and Commodore: Explore a deluxe home vintage computer den (arstechnica.com) Benj Edwards - Mar 11, 2023, 12:00 pm UTC

In a world where millions of people carry a 1990s-grade supercomputer in their pockets, it's fun to revisit tech from a time when a 1-megahertz machine on a desktop represented a significant leap forward. Recently, a collector named Brian Green showed off his vintage computer collection on Twitter, and we thought it would be fun to ask him about why and how he set up his at-home computer lab.

By day, Green works as a senior systems engineer based in Arkansas. But in his off hours, "Ice Breaker" (as he's often known online) focuses his passion on a vintage computer collection that he has been building for decades, and a bulletin board system (BBS) called Particles he has been running since 1992.

Green's interest in computers dates back to 1980, when he first used an Apple II+ at elementary school. "My older sister brought home a printout from a BASIC program she was working on, and I was fascinated that you could tell a computer what to do using something that resembled English," recalls Green. "Once I realized you could code games, I was hooked." Despite his early encounters with the Apple II, 1982's Commodore 64 truly won his heart. As his first computer with a disk drive, it came at a dear price for a kid, so he spent an entire summer saving money from his paper route to buy one. "Most of my friends had one at the time," he says.

Today, Green's vintage computer collection spans a wide range of machines, with the rarest one being a Commodore B128-80 from 1982. As part of the failed Commodore B Series of computers, the model barely made it out of the door before the plug was pulled, according to Green. "Of the B-Series, this one is the most common, with about 10,000 made," says Green. "Whereas other models had as few as just a few hundred."

We asked him which computer was the hardest to track down, and he pointed to the ill-fated Apple III, which Apple launched in 1980 as a business-capable follow-up of its more famous prequel: "I probably hunted for an Apple III the longest. Most computers are obtainable if you're willing to spend the money on eBay, but that's not as fun as picking something up at a show or a flea market. I found a working Apple III at the last Vintage Computer Festival Midwest for a good price and have it displayed proudly."

From these pictures, it's clear that Green's home computer lab is an exercise in weapons-grade tech nostalgia. His goal is to re-create the computing experience of the 1980s, when he grew up reading magazines like Family Computing. "Every month there was a new computer being announced or reviewed," he says. "I was a kid then and couldn't afford any of these computers, but I was always fascinated by all the different hardware. I wanted to try them all! I try to use as much 'period correct' hardware as I can, though there is a smattering of newer hardware in these machines too."
1,090
BAD
Apple Big Pink #3 (1990) [pdf] (uni-stuttgart.de)
1,093
BAD
Apple Books digital narration (authors.apple.com) More and more book lovers are listening to audiobooks, yet only a fraction of books are converted to audio, leaving millions of titles unheard. Many authors, especially independent authors and those associated with small publishers, aren't able to create audiobooks due to the cost and complexity of production. Apple Books digital narration makes the creation of audiobooks more accessible to all, helping you meet the growing demand by making more books available for listeners to enjoy. Apple Books digital narration brings together advanced speech synthesis technology with important work by teams of linguists, quality control specialists, and audio engineers to produce high-quality audiobooks from an ebook file. Apple has long been at the forefront of innovative speech technology and has now adapted it for long-form reading, working alongside publishers, authors, and narrators. For information on how you can take advantage of this new technology, see how to get started. Digitally narrated titles are a valuable complement to professionally narrated audiobooks and will help bring audio to as many books and as many people as possible. Apple Books remains committed to celebrating and showcasing the magic of human narration and will continue to grow the human-narrated audiobook catalog. Our digital voices are created and optimized for specific genres. We're starting with fiction and romance and are accepting ebook submissions in these genres. Hear samples of voices available for these genres below, or check out the full books in our audiobooks store. Our nonfiction and self-development narration program is kicking off and will be available more widely in the future. You can work with our preferred partners to get updates on the timing of when these genres and others will be available for use in digital narration. Hear samples of our nonfiction and self-development voices below, or check out the full books in our audiobooks store.
1,095
BAD
Apple Car: Bad Idea After All (mondaynote.com) by Jean-Louis Gassée

Who wouldn't want to drive a vehicle built by a company whose sense of fit and finish, whose attention to the user's experience, is second to none? A vehicle we could facilely call The iPhone of EVs, although Apple Car is a powerful enough moniker. That was my conclusion in the August 21 Monday Note, Apple Car: Software and Money. But there's another side to the story. As the sages insist, we don't understand a problem, an idea, a case unless we're able to see, to plead both sides. So I'll attempt to argue that the Apple Car is a bad idea.

In that Monday Note, I asked the money question: Why would Apple, with its 54% Gross Margin (more for services, a little less for hardware), wade into an auto industry swamp that has notoriously low Gross Margins, around 7% worldwide, with a little more for premium brands? However, when we took a closer look at Tesla's financial statements, we found that the preeminent EV company's Gross Margin has fluctuated between 28% and 33% for the past five quarters. Surely an even more organized company such as Apple could do better and achieve its customary Gross Margin level in a $3T (as in trillion) industry. Furthermore, whereas Tesla has to build its own factories, Apple could operate in its usual Asset Light (and software-heavy) fashion. As the company does for all its products, from iPods to iPhones and Macs, subcontractors managed by Apple would build Apple Cars.

But while the Asset Light business model (where the company focuses on reducing the amount of capital that is invested in assets) allows financial flexibility, it's not free; there's no magic. Apple must provide financial support for the contractors who build their devices. If you have the time and inclination, take a look at Apple's Q2 FY (Fiscal Year) 2022 quarterly statement. In the always instructive Management's Discussion and Analysis of Financial Condition and Results of Operations section, there is a subsection titled Manufacturing Purchase Obligations that details advance payments made to its manufacturing contractors. In Q2 FY 2022, the amount was a respectable $40.6B, soberly labeled as "primarily non-cancelable". Apple would surely do the same, at the appropriate scale, for a car manufacturing contractor.

Then there's the price challenge. Today, EVs cost $40K and up, where up means $150K for a Porsche Taycan or $110K for a Mercedes EQS. The entry-level Tesla Model 3 starts at $46K for the two-wheel drive model and more than $60K for the four-wheel drive version, to say nothing of the infamous Full Self-Driving package, an additional $15K. (I've often wondered if Elon Musk's regrettable (and privately admitted) exaggerations will ever finally catch up with him.) But is this a challenge or an opportunity? Would Apple try to undercut existing EV makers by selling an Apple Car for significantly less than $50K? No, history tells us that Apple would vie for a premium spot by trading on its reputation and top-grade UI. I've tried several EVs (besides our own Tesla) and have found their UI lacking. I even watched from the rear seat of a German EV as a salesperson unsold my spouse, confusing her with the many ways to accomplish a simple task.

Assuming the price hurdle can be surmounted, how would Apple sell its vehicles? You can walk into an Apple Store and walk out with a new Watch, a new iPhone, or a new Mac (or order it; anyway, delivery times may vary). Cars are different: they won't fit in an Apple Store. Tesla had to build a network of stores where people could come in, kick the tires, and get a test-drive, but not in states that prohibit direct sales (ten of them at last count). There are other areas where Tesla's activities, such as service and repair, are limited. In addition to having to build a dedicated infrastructure to sell and service its vehicles, Apple would have to find ways to overcome the same obstacles that stand in the way of Tesla's direct sales. This hasn't prevented Musk from achieving close to 15% market share in California (versus 5.6% for the US at large), but we should remind ourselves that it took Tesla (founded in 2003) nearly two decades to achieve its current market acceptance.

And what about software, an area where Apple (nearly) always shines? An Apple Car would likely need a dedicated operating system rather than a version of macOS or iOS. Too many different requirements: power, UI, reliability (don't reboot in the middle of a freeway or downtown New York City), and then there are the demands made by full self-driving autonomy, aka Level 5 Automation. My August 21 conclusion was that the Cupertino company would forgo the fantasy of Level 5 in order to take its share of the $3T worldwide auto market and gain a substantial revenue stream. But shorn of the need to achieve Level 5, the Apple Car OS would still have to achieve Level 2, if not Level 3, in order to keep pace with Mercedes' recently announced Drive Pilot (obligingly pointed to by a sharp-eyed MN reader).

Even for Apple, software is hard. Just see the camera bugs in the thoroughly-tested iOS 16: "Earlier this week Apple confirmed it would roll out an update next week aimed at addressing a handful of bugs that have been plaguing users, including an annoying copy and paste permissions-related bug and another that saw the camera shake when recording video in a number of third-party apps." What happens when car software has a runtime glitch? Volkswagen CEO Herbert Diess lost his job because of EV software problems at the Cariad software operation. "But this never happens at Apple!" you'll object. Tell that to the esteemed Scott Forstall, or to the Titan principals who have come and gone during the project's eight-year history, a revolving-door phenomenon that saps confidence in the company's ability to make the project converge towards a state of true productization.

To be sure, Apple is known for its patience and discipline. The company is immensely well-financed and wouldn't have to go through the travails of Tesla's early years, with a hat tip to Elon Musk's fortitude and improvisation skills. But by the time Apple enters the EV market (in 2025?), there will be a crowd of competitors from Germany, Korea, China, and the US itself, where once-skeptical domestic automakers are now fully committed and spending big sums to be fully in the game.

Ultimately, I'm skeptical, and that makes me uncomfortable. Personally, it's jarring to think that I've joined the chorus of doomsayers who have repeatedly predicted Apple's failure with the Mac, the iPhone, the iPad: whatever Apple comes up with, it's going to crash and burn. But my honest view is that the Apple Car project could be a bad idea for reasons of price, sales and service infrastructure, and technical challenges. The pessimism is countered by my great respect for Tim Cook and his team. They've brought Apple to its leading market position and financial strength. As a result, if Cook's team continues to pour an estimated $1B per year, again, for eight years, into an EV project, they must have done their sums and weighed the odds. So many billions invested against an uncertain but appetizing new revenue stream. A great win against an affordable loss if the EV planets don't align the Apple way.

JLG@mondaynote.com
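The margin comparison at the heart of the piece is easy to make concrete. Below is a minimal sketch using only the figures the article cites (Apple roughly 54%, auto industry roughly 7%, Tesla 28-33% in recent quarters); the $75,000 sticker price is my illustrative assumption, not a number from the article:

```python
# Gross profit per car at the margins the article quotes.
# The $75,000 price is a hypothetical premium-EV sticker, not a sourced figure.
price = 75_000
for label, margin in [("auto industry", 0.07), ("Tesla (recent)", 0.30), ("Apple corporate", 0.54)]:
    gross = price * margin
    print(f"{label:>15}: {margin:.0%} margin -> ${gross:,.0f} gross profit, "
          f"${price - gross:,.0f} left for parts, assembly, and logistics")
```

At a 7% margin, a $75K car returns about $5,250 before operating costs; at Apple's customary 54%, the same car would have to be built, shipped, and serviced for under $34,500, which is the arithmetic behind the author's skepticism.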
1,098
BAD
Apple Detection of Flashing Lights (github.com/apple) Purveyors of media content might want to detect the amount of flashing lights in video content in order to inform consumers of the media in which risk of flashing lights occurs. This project provides a reference implementation of the algorithm for calculating risk of flashing lights in video content and reducing the flashing, according to the Video flashing reduction technical summary (EPUB) (PDF). Important: This sample code project includes a sample video that contains sequences of flashing effects.
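For a sense of what such an analysis involves, here is a deliberately simplified sketch. It is not the algorithm from Apple's technical summary (which works on perceptual luminance with screen-area criteria); it just counts opposing large luminance swings per one-second window, and every threshold here is an assumption:

```python
# Illustrative sketch only -- NOT Apple's published flashing-risk algorithm.
import numpy as np

def peak_flash_rate(frames: np.ndarray, fps: float, thresh: float = 0.1) -> float:
    """frames: (n, h, w, 3) RGB floats in [0, 1]. Returns worst-case flashes/second."""
    # Mean luma per frame (Rec. 709 weights), then frame-to-frame deltas.
    luma = (frames @ np.array([0.2126, 0.7152, 0.0722])).mean(axis=(1, 2))
    deltas = np.diff(luma)
    big = np.abs(deltas) > thresh                    # transitions large enough to count
    signs = np.sign(deltas)
    # A flash cycle: a large transition immediately reversed by another.
    flashes = np.flatnonzero(big[:-1] & big[1:] & (signs[:-1] != signs[1:]))
    window = int(round(fps))                         # ~1 second of frames
    return float(max(np.sum((flashes >= s) & (flashes < s + window))
                     for s in range(max(1, len(deltas)))))
```

Accessibility guidelines such as WCAG flag content that flashes more than three times in any one-second window; a production implementation also weighs how much of the screen area participates in each flash.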
1,102
BAD
Apple Discontinues macOS Server (support.apple.com) As of April 21, 2022, Apple has discontinued macOS Server. Existing macOS Server customers can continue to download and use the app with macOS Monterey. The most popular server features (Caching Server, File Sharing Server, and Time Machine Server) are bundled with every installation of macOS High Sierra and later, so that even more customers have access to these essential services at no extra cost. To help you migrate from Profile Manager to other MDM solutions, learn about choosing an MDM solution and planning your MDM migration. Learn about which services are available in macOS Server and which services have been migrated from macOS Server to macOS. The following services are included and fully supported in macOS Server 5.12 and later: The following services are included and fully supported in macOS Server 5.7.1 through 5.11.1: Command-line tools for Xsan management are included in macOS Big Sur and later. These services are now available in macOS High Sierra and later and can be found in System Preferences > Sharing. This table shows the status of each macOS Server service and available alternatives. [Of the original table, only the Mail Server alternatives survived extraction: dovecot/Postfix, Courier, Kerio Connect.] Information about products not manufactured by Apple, or independent websites not controlled or tested by Apple, is provided without recommendation or endorsement. Apple assumes no responsibility with regard to the selection, performance, or use of third-party websites or products. Apple makes no representations regarding third-party website accuracy or reliability. Contact the vendor for additional information.
1,103
BAD
Apple Execs on iMessage for Android (2013) (twitter.com/techemails)
1,106
BAD
Apple GPU drivers now in Asahi Linux (asahilinux.org) Hello everyone! We're excited to announce our first public Apple Silicon GPU driver release! We've been working hard over the past two years to bring this new driver to everyone, and we're really proud to finally be here. This is still an alpha driver, but it's already good enough to run a smooth desktop experience and some games. Read on to find out more about the state of things today, how to install it (it's an opt-in package), and how to report bugs!

This release features work-in-progress OpenGL 2.1 and OpenGL ES 2.0 support for all current Apple M-series systems. That's enough for hardware acceleration with desktop environments like GNOME and KDE. It's also enough for older 3D games like Quake3 and Neverball. While there's always room for improvement, the driver is fast enough to run all of the above at 60 frames per second at 4K. Please note: these drivers have not yet passed the OpenGL (ES) conformance tests. There will be bugs!

What's next? Supporting more applications. While OpenGL (ES) 2 suffices for some applications, newer ones (especially games) demand more OpenGL features. OpenGL (ES) 3 brings with it a slew of new features, like multiple render targets, multisampling, and transform feedback. Work on these features is well under way, but they will each take a great deal of additional development effort, and all are needed before OpenGL (ES) 3.0 is available.

What about Vulkan? We're working on it! Although we're only shipping OpenGL right now, we're designing with Vulkan in mind. Most of the work we're putting toward OpenGL will be reused for Vulkan. We estimated that we could ship working OpenGL 2 drivers much sooner than a working Vulkan 1.0 driver, and we wanted to get hardware-accelerated desktops into your hands as soon as possible. For the most part, those desktops use OpenGL, so supporting OpenGL first made more sense to us than diving into the Vulkan deep end, only to use Zink to translate OpenGL 2 to Vulkan to run desktops. Plus, there is a large spectrum of OpenGL support, with OpenGL 2.1 containing a fraction of the features of OpenGL 4.6. The same is true for Vulkan: the baseline Vulkan 1.0 profile is roughly equivalent to OpenGL ES 3.1, but applications these days want Vulkan 1.3 with tons of extensions and optional features. Zink's layering of OpenGL on top of Vulkan isn't magic: it can only expose the OpenGL features that the underlying Vulkan driver has. A baseline Vulkan 1.0 driver isn't even enough to get OpenGL 2.1 on Zink! Zink itself advertises support for OpenGL 4.6, but of course that's only when paired with Vulkan drivers that support the equivalent of OpenGL 4.6, and that gets us back to a tremendous amount of time and effort.

When will OpenGL 3 support be ready? OpenGL 4? Vulkan 1.0? Vulkan 1.3? In community open source projects, it's said that every time somebody asks when a feature will be done, it delays that feature by a month. Well, a lot of people have been asking. At any rate, for a sneak peek, here is SuperTuxKart's deferred renderer running at full speed, making liberal use of OpenGL ES 3 features like multiple render targets.

Modern GPUs consist of many distinct, layered parts, and this layered hardware demands a layered graphics driver stack. That's a lot of work, calling for a team effort! Fortunately, that layering gives us natural boundaries to divide work among our small team. Meanwhile, Ella Stanforth is working on a Vulkan driver, reusing the kernel driver, the compiler, and some code shared with the OpenGL driver.

Of course, we couldn't build an OpenGL driver in under two years just ourselves. Thanks to the power of free and open source software, we stand on the shoulders of FOSS giants. The compiler implements a NIR backend, where NIR is a powerful intermediate representation, including GLSL-to-NIR translation. The kernel driver uses the Direct Rendering Manager (DRM) subsystem of the Linux kernel to minimize boilerplate. Finally, the OpenGL driver implements the Gallium3D API inside of Mesa, the home for open source OpenGL and Vulkan drivers. Through Mesa and Gallium3D, we benefit from thirty years of OpenGL driver development, with common code translating OpenGL into the much simpler Gallium3D. Thanks to the incredible engineering of NIR, Mesa, and Gallium3D, our ragtag team of reverse-engineers can focus on what's left: the Apple hardware.

To get the new drivers, you need to run the linux-asahi-edge kernel and also install the mesa-asahi-edge Mesa package. Since only one version of Mesa can be installed at a time, pacman will prompt you to replace mesa with mesa-asahi-edge. This is normal! We also recommend running Wayland instead of Xorg at this point, so if you're using the KDE Plasma environment, make sure to install the Wayland session. Then reboot, pick the Wayland session at the top of the login screen (SDDM), and enjoy! You might want to adjust the screen scale factor in System Settings > Display and Monitor (Plasma Wayland defaults to 100% or 200%, while 150% is often nicer). If you have Force font DPI enabled under Appearance > Fonts, you should disable that (it is saved separately for Wayland and Xorg, and shouldn't be necessary on Wayland sessions). Log out and back in for these changes to fully apply. Xorg and Xorg-based desktop environments should work, but there are a few known issues.

The linux-asahi-edge kernel can be installed side-by-side with the standard linux-asahi package, but both versions should be kept in sync, so make sure to always update your packages together! You can always pick the linux-asahi kernel in the GRUB boot menu, which will disable GPU acceleration and the DCP display driver. When the packages are updated in the future, it's possible that graphical apps will stop starting up after an update until you reboot, or they may fall back to software rendering. This is normal. Until the UAPI is stable, we'll have to break compatibility between Mesa and the kernel every now and then, so you will need to reboot to make things work after updates. In general, if apps do keep working with acceleration after any particular Mesa update, then it's probably safe not to reboot, but you should still do it to make sure you're running the latest kernel!

Since the driver is still in development, there are lots of known issues, and we're still working hard on improving conformance test results. Please don't open new bugs for random apps not working! It's still the early days and we know there's a lot of work to do. Here's a quick guide of how to report bugs: We hope you enjoy our driver! Remember, things are still moving quickly, so make sure to update your packages regularly to get updates and bug fixes!
1,111
BAD
Apple Google and Microsoft Commit to Expanded Support for FIDO Standard (fidoalliance.org)
1,110
BAD
Apple, Google, and Microsoft commit to expanded support for FIDO standard (apple.com) Text of this article: May 5, 2022. PRESS RELEASE. Apple, Google, and Microsoft commit to expanded support for FIDO standard to accelerate availability of passwordless sign-ins. Faster, easier, and more secure sign-ins will be available to consumers across leading devices and platforms.

Mountain View, California. In a joint effort to make the web more secure and usable for all, Apple, Google, and Microsoft today announced plans to expand support for a common passwordless sign-in standard created by the FIDO Alliance and the World Wide Web Consortium. The new capability will allow websites and apps to offer consistent, secure, and easy passwordless sign-ins to consumers across devices and platforms. Password-only authentication is one of the biggest security problems on the web, and managing so many passwords is cumbersome for consumers, which often leads consumers to reuse the same ones across services. This practice can lead to costly account takeovers, data breaches, and even stolen identities. While password managers and legacy forms of two-factor authentication offer incremental improvements, there has been industry-wide collaboration to create sign-in technology that is more convenient and more secure.

The expanded standards-based capabilities will give websites and apps the ability to offer an end-to-end passwordless option. Users will sign in through the same action that they take multiple times each day to unlock their devices, such as a simple verification of their fingerprint or face, or a device PIN. This new approach protects against phishing, and sign-in will be radically more secure when compared to passwords and legacy multi-factor technologies such as one-time passcodes sent over SMS.

An Expansion of Passwordless Standard Support. Hundreds of technology companies and service providers from around the world worked within the FIDO Alliance and W3C to create the passwordless sign-in standards that are already supported in billions of devices and all modern web browsers. Apple, Google, and Microsoft have led development of this expanded set of capabilities and are now building support into their respective platforms. These companies' platforms already support FIDO Alliance standards to enable passwordless sign-in on billions of industry-leading devices, but previous implementations require users to sign in to each website or app with each device before they can use passwordless functionality. Today's announcement extends these platform implementations to give users two new capabilities for more seamless and secure passwordless sign-ins. In addition to facilitating a better user experience, the broad support of this standards-based approach will enable service providers to offer FIDO credentials without needing passwords as an alternative sign-in or account recovery method. These new capabilities are expected to become available across Apple, Google, and Microsoft platforms over the course of the coming year.

"Simpler, stronger authentication is not just FIDO Alliance's tagline; it also has been a guiding principle for our specifications and deployment guidelines. Ubiquity and usability are critical to seeing multi-factor authentication adopted at scale, and we applaud Apple, Google, and Microsoft for helping make this objective a reality by committing to support this user-friendly innovation in their platforms and products," said Andrew Shikiar, executive director and CMO of the FIDO Alliance. "This new capability stands to usher in a new wave of low-friction FIDO implementations alongside the ongoing and growing utilization of security keys, giving service providers a full range of options for deploying modern, phishing-resistant authentication."

"The standards developed by the FIDO Alliance and World Wide Web Consortium, and being led in practice by these innovative companies, is the type of forward-leaning thinking that will ultimately keep the American people safer online. I applaud the commitment of our private sector partners to open standards that add flexibility for the service providers and a better user experience for customers," said Jen Easterly, Director of the U.S. Cybersecurity and Infrastructure Security Agency. "At CISA, we are working to raise the cybersecurity baseline for all Americans. Today is an important milestone in the security journey to encourage built-in security best practices and help us move beyond passwords. Cyber is a team sport, and we're pleased to continue our collaboration."

"Just as we design our products to be intuitive and capable, we also design them to be private and secure," said Kurt Knight, Apple's Senior Director of Platform Product Marketing. "Working with the industry to establish new, more secure sign-in methods that offer better protection and eliminate the vulnerabilities of passwords is central to our commitment to building products that offer maximum security and a transparent user experience, all with the goal of keeping users' personal information safe."

"This milestone is a testament to the collaborative work being done across the industry to increase protection and eliminate outdated password-based authentication," said Mark Risher, Senior Director of Product Management, Google. "For Google, it represents nearly a decade of work we've done alongside FIDO, as part of our continued innovation towards a passwordless future. We look forward to making FIDO-based technology available across Chrome, ChromeOS, Android, and other platforms, and encourage app and website developers to adopt it, so people around the world can safely move away from the risk and hassle of passwords."

"The complete shift to a passwordless world will begin with consumers making it a natural part of their lives. Any viable solution must be safer, easier, and faster than the passwords and legacy multi-factor authentication methods used today," says Alex Simons, Corporate Vice President, Identity Program Management at Microsoft. "By working together as a community across platforms, we can at last achieve this vision and make significant progress toward eliminating passwords. We see a bright future for FIDO-based credentials in both consumer and enterprise scenarios and will continue to build support across Microsoft apps and services."

Press Contacts: Apple Media Helpline, media.help@apple.com

About the FIDO Alliance: The FIDO (Fast IDentity Online) Alliance, www.fidoalliance.org, was formed in July 2012 to address the lack of interoperability among strong authentication technologies and remedy the problems users face with creating and remembering multiple usernames and passwords. The FIDO Alliance is changing the nature of authentication with standards for simpler, stronger authentication that define an open, scalable, interoperable set of mechanisms that reduce reliance on passwords. FIDO Authentication is stronger, private, and easier to use when authenticating to online services.

About Apple: Apple revolutionized personal technology with the introduction of the Macintosh in 1984. Today, Apple leads the world in innovation with iPhone, iPad, Mac, Apple Watch, and Apple TV. Apple's five software platforms (iOS, iPadOS, macOS, watchOS, and tvOS) provide seamless experiences across all Apple devices and empower people with breakthrough services including the App Store, Apple Music, Apple Pay, and iCloud. Apple's more than 100,000 employees are dedicated to making the best products on earth and to leaving the world better than we found it.

About Google: Google's mission is to organize the world's information and make it universally accessible and useful. Through products and platforms like Search, Maps, Gmail, Android, Google Play, Google Cloud, Chrome, and YouTube, Google plays a meaningful role in the daily lives of billions of people and has become one of the most widely-known companies in the world. Google is a subsidiary of Alphabet Inc.

About Microsoft: Microsoft (Nasdaq: MSFT, @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.
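The mechanism underneath all of this is an asymmetric challenge-response in place of a shared secret. Here is a minimal sketch of that core idea in Python (using the pyca/cryptography package); it deliberately omits everything that makes real passkeys production-grade: attestation, origin binding, WebAuthn/CBOR encoding, and the biometric user-verification step.

```python
# Minimal sketch of the FIDO/WebAuthn idea: a per-site key pair plus a signed
# server challenge replaces the password. Not a real passkey implementation.
import os
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Registration: the authenticator mints a key pair; only the public key
# ever leaves the device.
device_key = Ed25519PrivateKey.generate()
server_stored_pubkey = device_key.public_key()

# Sign-in: the server sends a fresh random challenge; the device signs it
# after the user unlocks with Face ID / fingerprint / PIN (not modeled here).
challenge = os.urandom(32)
assertion = device_key.sign(challenge)

# The server verifies against the stored public key; raises InvalidSignature
# if the assertion is forged or replayed against a different challenge.
server_stored_pubkey.verify(assertion, challenge)
print("challenge verified -- no shared secret ever crossed the wire")
```

Because the server stores only a public key and every signature covers a fresh random challenge, a database breach leaks nothing reusable and a phishing site has nothing worth capturing, which is the phishing-resistance the release describes.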
1,109
BAD
Apple II Desktop Updated (callapple.org) The Apple II Desktop has been updated. Version 1.2 Alpha 48 includes a ton of updates. The program is now available in eight languages. Updated features include: You can download the program and the source code from the Apple II Desktop GitHub page at: https://github.com/a2stuff/a2d More information and documentation is available on the main Apple II Desktop page at: https://www.a2desktop.com You can try out the Apple II Desktop in your browser by clicking here.
1,114
BAD
Apple Interview 1995 (engineersneedart.com) In the 1990s, I had access to the early internet (before the web) and made use of a public forum (Usenet) where I could ask and respond to questions about programming the Macintosh computer. I've mentioned before how this caught the attention of engineering at Apple (Tom Dowdy, in any event). Perhaps, too, the game I had written for the Macintosh, Glider, had afforded me some name recognition.

Glider was a computer game I wrote originally as shareware. I wrote it while I was in college and trying to teach myself how to program the Macintosh. A few years later, I reached out to a software publisher in California and they agreed to publish my game commercially (for publication I rewrote Glider in color and added a lot of other niceties to take it to the next level, so to speak). The games editor for MacWorld magazine chose Glider as the best Macintosh arcade game of 1991, so it was perhaps unsurprising that a few of the engineers on the graphics programming team at Apple knew about the game.

When I had started to make money from royalties on Glider, I bought the least expensive laser printer from Apple and proceeded to print the entire source code for my game. There was a print shop in Lawrence, Kansas, that had some means of binding thick print jobs like mine, hardbound and everything. So for a little bit more of my royalty money, I had the printed sources for Glider bound up. The thing looked like a volume from a big black encyclopedia set.

I should point out the reasons I printed out the sources to Glider were two-fold. One, it was a sort of ultimate backup during a time when floppy disks and even hard drives were untrustworthy. I would have hated to do it, but in a week or two I probably could re-type all the code in by hand using this book. But the other reason I printed it out was as some sort of token, or perhaps as a testament, to this thing I had just accomplished. The end product of months and months of programming is, in the end, a stream of bytes that exists invisibly on some sort of magnetic media (in those days). Like a novelist having written a book, printing the code out provided the coding effort more of a sense of gravity (literally, ha ha). Additionally, since programming can go on and on (a program is never truly done), printing the thing also gave a kind of finality to the project.

As I flew out to Cupertino, California, for the job interview, I thought to grab my Glider book and bring it along as well, like: Here's an example of my programming. I found out, though, that bringing in a bound book of your source code to the interview is rather unusual. When I sat down in a small conference room across from two Apple engineers and put my Glider book between us, there were smiles and looks of surprise from the interviewers. That it turned out to be a conversation starter was a happy accident. From my vantage point in the present, having worked at Apple for over twenty-five years and having interviewed a couple dozen candidates myself, I see now how rather eccentric of a thing that was. And perhaps in no small part due to this bit of ice-breaking, I felt relaxed enough to talk very casually to my interlocutors.

I should pause here and point out that an interview at Apple was an all-day affair. Pairs of engineers would meet with and interview me for perhaps an hour. The first pair of engineers might grill me on some esoteric topic like code design; afterwards, the next pair of engineers would sit down and maybe cover programming-language-specific questions.

And so it was with each round of interviews. I heard more than once from the engineering pair interviewing me how hard an interviewer Andy was and how I was going to enjoy (ha ha) my coming interview with him. Andy (and this is not his real name) would be one of the last interviews, just him, no partner, and I understood this was to be the real no-nonsense interview. Up to that point in the day I had found the interviews to be pretty relaxed, and my confidence that at least the team liked me was pretty good. I don't recall having stumbled much during the Q&A, but no doubt I showed some level of juniorness; perhaps they would allow for that. This upcoming interview with Andy, though, was causing me some anxiousness.

When Andy sits down across from me, it is clear from his natural scowl that he is all business. I'm sure my Glider book was still there on the table, but I don't think he showed any interest. To break the ice I offered, "So, I am a little nervous about this interview with you."

"And why is that?" he replied, deadpan.

"Oh, I guess I heard from some of the other engineers that you can be a tough interviewer."

"Why would they say that?" Again with a quick question right back. I began to suspect that he was trying to give me enough rope to hang myself. And to be sure, I was squirming a little.

"I don't know," I said, "maybe you are going to ask me a question about Information Theory, entropy, something like that, and I don't have a Computer Science degree, so..."

"So tell me about Information Theory, entropy."

Now, I bet Andy thought he had cornered me, but I am going to confess that I had been the one laying the trap. I actually had read a little about entropy with regard to programming (Information Theory) in Scientific American magazine or somewhere, and it had fascinated me. So while establishing my ignorance of lofty topics like Information Theory, I had planted the one thing I could at least talk in layman's terms about. And so I did. As I said, though, I came about it from a layman's point of view, and I never pretended to know any more than that with Andy. But as it turned out, Andy was happy to talk about the topic, and the rest of the interview began to feel more like I was attending a lecture on Information Theory, with Andy as the lecturer.

In the end, I have no idea how all of my efforts that day played out. The engineers that interviewed me would meet at the close of the day, or sometime during the next day, and each would give a thumbs up or thumbs down. Democratically, if I got more up than down votes, I was in. It goes without saying I passed the test, as I was offered the position some weeks later. But I'll never know if my sparring with Andy won him over, or if instead I had enough thumbs up from the other engineers who were charmed by my Glider book.

In the end, I think Apple got an engineer for the next twenty-five years that, though not the cleverest engineer, was one that worked quickly to prototype new ideas and took on some of the gruntwork that not every engineer wanted to work on. Maybe I can recall some of those times and blog about them in the future.
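For readers who want the layman's version of the entropy Andy and the author discussed: Shannon entropy measures the average information, in bits per symbol, of a source. A tiny illustrative sketch (my example, not anything from the interview):

```python
# Shannon entropy of a string: -sum(p * log2(p)) over symbol frequencies p.
import math
from collections import Counter

def shannon_entropy(text: str) -> float:
    counts = Counter(text)
    n = len(text)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

print(shannon_entropy("aaaa"))   # 0.0 bits: perfectly predictable
print(shannon_entropy("abcd"))   # 2.0 bits: four equally likely symbols
```

A perfectly predictable source carries zero bits per symbol, which is also why "highly compressible" and "low entropy" are two names for the same property of data.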
1,116
BAD
Apple Lightning (2020) (nyansatan.github.io) [Only fragments of this page survived extraction: the command examples "tristar -p", "astrisctl setenv debug 3", "tristar --prov_stat", and "tristar --esn", plus an embedded tweet captioned "Example of Tristar CRAM exchange #appleinternal" (pic.twitter.com/x9ffWWTlAb).]
1,125
BAD
Apple M1 Ultra (apple.com) Text of this article: March 8, 2022. PRESS RELEASE. Apple unveils M1 Ultra, the world's most powerful chip for a personal computer. Available in the all-new Mac Studio, M1 Ultra brings unprecedented performance to the desktop.

Cupertino, California. Apple today announced M1 Ultra, the next giant leap for Apple silicon and the Mac. Featuring UltraFusion, Apple's innovative packaging architecture that interconnects the die of two M1 Max chips to create a system on a chip (SoC) with unprecedented levels of performance and capabilities, M1 Ultra delivers breathtaking computing power to the new Mac Studio while maintaining industry-leading performance per watt. The new SoC consists of 114 billion transistors, the most ever in a personal computer chip. M1 Ultra can be configured with up to 128GB of high-bandwidth, low-latency unified memory that can be accessed by the 20-core CPU, 64-core GPU, and 32-core Neural Engine, providing astonishing performance for developers compiling code, artists working in huge 3D environments that were previously impossible to render, and video professionals who can transcode video to ProRes up to 5.6x faster than with a 28-core Mac Pro with Afterburner. (1)

"M1 Ultra is another game-changer for Apple silicon that once again will shock the PC industry. By connecting two M1 Max die with our UltraFusion packaging architecture, we're able to scale Apple silicon to unprecedented new heights," said Johny Srouji, Apple's senior vice president of Hardware Technologies. "With its powerful CPU, massive GPU, incredible Neural Engine, ProRes hardware acceleration, and huge amount of unified memory, M1 Ultra completes the M1 family as the world's most powerful and capable chip for a personal computer."

Groundbreaking UltraFusion Architecture. The foundation for M1 Ultra is the extremely powerful and power-efficient M1 Max. To build M1 Ultra, the die of two M1 Max are connected using UltraFusion, Apple's custom-built packaging architecture. The most common way to scale performance is to connect two chips through a motherboard, which typically brings significant trade-offs, including increased latency, reduced bandwidth, and increased power consumption. However, Apple's innovative UltraFusion uses a silicon interposer that connects the chips across more than 10,000 signals, providing a massive 2.5TB/s of low-latency inter-processor bandwidth, more than 4x the bandwidth of the leading multi-chip interconnect technology. This enables M1 Ultra to behave, and be recognized by software, as one chip, so developers don't need to rewrite code to take advantage of its performance. There's never been anything like it.

Unprecedented Performance and Power Efficiency. M1 Ultra features an extraordinarily powerful 20-core CPU with 16 high-performance cores and four high-efficiency cores. It delivers 90 percent higher multi-threaded performance than the fastest available 16-core PC desktop chip in the same power envelope. Additionally, M1 Ultra reaches the PC chip's peak performance using 100 fewer watts. (2) That astounding efficiency means less energy is consumed and fans run quietly, even as apps like Logic Pro rip through demanding workflows, such as processing massive amounts of virtual instruments, audio plug-ins, and effects. For the most graphics-intensive needs, like 3D rendering and complex image processing, M1 Ultra has a 64-core GPU, 8x the size of M1, delivering faster performance than even the highest-end PC GPU available while using 200 fewer watts of power. (3)

Apple's unified memory architecture has also scaled up with M1 Ultra. Memory bandwidth is increased to 800GB/s, more than 10x the latest PC desktop chip, and M1 Ultra can be configured with 128GB of unified memory. Compared with the most powerful PC graphics cards, which max out at 48GB, nothing comes close to M1 Ultra for graphics memory to support enormous GPU-intensive workloads, like working with extreme 3D geometry and rendering massive scenes. The 32-core Neural Engine in M1 Ultra runs up to 22 trillion operations per second, speeding through the most challenging machine learning tasks. And with double the media engine capabilities of M1 Max, M1 Ultra offers unprecedented ProRes video encode and decode throughput. In fact, the new Mac Studio with M1 Ultra can play back up to 18 streams of 8K ProRes 422 video, a feat no other chip can accomplish. (4) M1 Ultra also integrates custom Apple technologies, such as a display engine capable of driving multiple external displays, integrated Thunderbolt 4 controllers, and best-in-class security, including Apple's latest Secure Enclave, hardware-verified secure boot, and runtime anti-exploitation technologies.

macOS and Apps Scale Up to M1 Ultra. Deep integration between hardware and software has always been at the heart of the Mac experience. macOS Monterey has been designed for Apple silicon, taking advantage of M1 Ultra's huge increases in CPU, GPU, and memory bandwidth. Developer technologies like Metal let apps take full advantage of the new chip, and optimizations in Core ML utilize the new 32-core Neural Engine, so machine learning models run faster than ever. Users have access to the largest collection of apps ever for Mac, including iPhone and iPad apps that can now run on Mac, and Universal apps that unlock the full power of the M1 family of chips. Apps that have not yet been updated to Universal will run seamlessly with Apple's Rosetta 2 technology.

Another Leap Forward in the Transition to Apple Silicon. Apple has introduced Apple silicon to nearly every Mac in the current lineup, and each new chip (M1, M1 Pro, M1 Max, and now M1 Ultra) unleashes amazing capabilities for the Mac. M1 Ultra completes the M1 family of chips, powering the all-new Mac Studio, a high-performance desktop system with a reimagined compact design made possible by the industry-leading performance per watt of Apple silicon.

Apple Silicon and the Environment. The energy efficiency of Apple's custom silicon helps Mac Studio use less power over its lifetime. In fact, while delivering extraordinary performance, Mac Studio consumes up to 1,000 kilowatt-hours less energy than a high-end PC desktop over the course of a year. (5) Today, Apple is carbon neutral for global corporate operations, and by 2030 plans to have net-zero climate impact across the entire business, which includes manufacturing supply chains and all product life cycles. This means that every chip Apple creates, from design to manufacturing, will be 100 percent carbon neutral.

About Apple: Apple revolutionized personal technology with the introduction of the Macintosh in 1984. Today, Apple leads the world in innovation with iPhone, iPad, Mac, Apple Watch, and Apple TV. Apple's five software platforms (iOS, iPadOS, macOS, watchOS, and tvOS) provide seamless experiences across all Apple devices and empower people with breakthrough services including the App Store, Apple Music, Apple Pay, and iCloud. Apple's more than 100,000 employees are dedicated to making the best products on earth and to leaving the world better than we found it.

Press Contacts: Apple Media Helpline, media.help@apple.com
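The release's headline interconnect numbers are easy to sanity-check with arithmetic. A quick sketch; the dual-channel DDR5-4800 figure I use for a "typical PC desktop" is my own comparison point, not Apple's:

```python
# 2.5 TB/s spread across "more than 10,000 signals" -> per-signal data rate.
total_bw_bytes = 2.5e12
signals = 10_000
print(f"{total_bw_bytes * 8 / signals / 1e9:.1f} Gbit/s per signal")  # ~2.0

# 800 GB/s unified memory vs. an assumed dual-channel DDR5-4800 desktop:
# 4800 MT/s x 16 bytes = 76.8 GB/s, so roughly a 10x gap, as claimed.
print(f"{800 / (4800 * 16 / 1000):.1f}x")                             # ~10.4x
```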
1,126
BAD
Apple M2 Die Shot and Architecture Analysis: Big Cost Increase and A15-Based IP (semianalysis.substack.com) Apple announced their new 20 billion transistor M2 SoC at WWDC. Unfortunately, it's quite a minor uplift in performance in some areas, such as CPU. Apple's gains mostly came from the GPU and video editing side of things. The overall performance gains are quite disappointing when you factor in the raw cost increase that comes with this new M2, and the fact that it has been nearly two years since the M1's introduction. The cost increase story is similar to the one we wrote about on the A16, where Apple is being forced to diverge SoC choices (A16 on Pro iPhone models, A15 on normal iPhone models) due to bill-of-materials concerns. Today we will be discussing details related to the architecture of M2 and Apple's future designs, including M2 Pro/Max and M3, that weren't discussed at WWDC. We will also do a die area analysis of the images of M2 that Apple released, with the help of Locuza. If you prefer to listen to this rather than read, we have created a YouTube video.

It's very odd that we have seen some pundits talk about this being M1.5 or M1+. That is just nonsense. M1 was generally based on the same IP blocks as the Apple A14, outside of a few deviations. M2, codenamed Staten, is generally based on the same IP blocks as A15, codenamed Ellis. The codenames are based on some of New York's most well-known islands, which should be a hint to how closely related these architectures are. A lot of the disappointment in performance uplift comes from weak gen-on-gen gains given the nearly two-year gap versus M1. Many people expected more out of the M2. We discussed this in the past, but a lot of the slowdown stems from Apple losing leagues of amazing engineers to firms such as Nuvia and Rivos. The bleeding hasn't stopped in recent years, as Apple's work culture simply isn't the best, and other firms, namely hyperscalers such as Google, Microsoft, Amazon, and Meta, are paying more than Apple was to poach talent. Lastly, there has also been an exodus of non-money-motivated engineers who feel they succeeded in transitioning Apple off of Intel silicon and onto the in-house silicon. These engineers have also left to work on what they believe are more interesting projects elsewhere in the industry, whether at the hyperscalers or traditional firms.

These departures have culminated in A15 and M2, as well as potentially the upcoming A16, delivering more tepid CPU gains. We have heard that the A16 will not utilize the next-generation Armv9-based core, which is quite sad if true, given Apple was the first to implement Armv8. We have heard this next-generation Armv9 core will only come in M3, which will be Apple's first product on TSMC's N3 node. Apple has already designed and taped out the M2 Pro and M2 Max, which are still on N5 and A15 base IP. Let's dive into the die shot.

Apple presented an unmarked image of both M1 and M2. This showcased that M2 was 141.7mm2, but we believe Apple modified the die image. This wouldn't be the first time. Apple did the same with the M1 Max, where they hid the die-to-die connection used in the M1 Ultra. They also shifted the size dimensions. Apple's image seems to be off scale with M2 as well. One could make out SRAM cells and PHYs, which should be identical across the chips, and see that M2 seems smaller than it should be. Apple's presented M2 also appears to have higher transistor density than the A15, which also is out of whack: it should have lower density due to the smaller total area dedicated to high-density SRAM cells and higher total area dedicated to IO and other logic. For that reason, Locuza scaled the M2 die. This scaling brought SRAM cells and identical PHYs equivalent to what Apple has on the M1 and A15. The funkiness of Apple's marketing image does mean there is an error window of about 3% after the die was scaled in size. The numbers are presented as measured, despite the error bars.

Let's move on to how Apple spent the increase in die size. First, let's start with Apple's P-Core. It is based on Apple's Avalanche core, which showed up in A15, although there are some minor differences. This follows how M1 Pro and M1 Max also had a modified Firestorm that could handle larger memory sizes by implementing a larger PA. The M-based cores also have a few modifications which help with the varied page sizes that must be supported in macOS. The core itself is 21% larger than in M1 and 7% larger than in A15. The big area of gen-on-gen growth is the shared L2 cache, which has gone from 12MB to 16MB compared to both M1 and A15. The AMX unit looks identical across A15 and M1 as well. The shared logic plane is also significantly larger, which is an indication that there is more bandwidth between the cores and the L2 cache and SLC. Overall, Apple spent 5.2mm2 on the big P-Cores, but the performance increase from them comes mostly from clock speeds. The IPC increase is quite small, as documented in reviews.

One very interesting change is that the ROB appears smaller in the Avalanche core found in A15 and M2 versus the Firestorm core found in M1 and A14. This is especially interesting because Apple has the largest ROB in the industry, in order to achieve the widest and highest-IPC core in the industry.

The E-Core was the main unit of change from a CPU perspective from the A14 to A15, and that holds true here. After scaling the Apple-provided die shot, the E-Cores look nearly identical between the A15 and M2, which is a good sign that the scaling was accurate. Apple does little or no modification of the E-Cores for Mac silicon, whereas on the P-Cores they do some tweaking and different physical design for those changes. There isn't much to say here about the E-Core, as it is pretty clearly the same as the A15's, which has been extensively tested. The E-Core complex as a whole is only 1mm2 larger gen-on-gen, while the entire CPU complex is 6.2mm2 larger.

The GPU, after scaling, also appears to be nearly the same size per core versus the M1 at 128 ALU. This is very interesting because it is one of the areas where M1 diverged from the A14: there was an architecture change there, even though they are in the same generation. Apple has precedent for the X SoC to have changes versus the A SoC; A6 and A6X had different GPU architectures many years ago, for example. Given the M-series SoCs are just a rebrand of the X series, this tracked. This generation, the GPU core itself seems unchanged, but the shared logic and miscellaneous area is larger, so there could be changes in some fixed-function aspects. The core count is the major change, where Apple bumped it up to a 10-core GPU. We can exclusively detail the GPU clock speed, which went up from 1.27 GHz to 1.406 GHz. In total, the new GPU adds nearly 7mm2 gen-on-gen. This is a worthwhile performance increase, although Apple indicated that power consumption was up slightly at the maximum performance level. At the same power level, Apple still gets nice performance increases, though, due to better memory and an overall wider/slower design.

We also included the NPU and SLC figures here. The NPU figures come across as a little odd, so we will skip over those. The SLC is where things are interesting. Each 2MB data array is generally the same size across M1, A15, and the scaled M2 die, which makes sense and justifies our scaling based on identical PHY size. There is no SRAM shrink from the first-generation to second-generation N5 process node. Despite this, the SLC does grow somewhat in size on M2, likely to provide more bandwidth to the various IP blocks, such as the larger GPU.

The last IP block to compare is the memory controller + PHY. Apple increased the area here significantly to support LPDDR5-6400. The image above is for one unit, but the memory controller is multiple channels, of course. The total area dedicated to the 128-bit LP5 bus is about 14mm2, versus 8.1mm2 on M1 with its 128-bit LP4X and 4.3mm2 on A15 with its 64-bit LP4X. The real kicker from a cost perspective is the fact that LPDDR5-6400 is significantly more expensive than LPDDR4X-4266. This is also a big part of the equation for Apple doing a split A15/A16 lineup on this year's upcoming iPhones; we wrote about that decision here. Overall, Apple has to contend with similar issues on the M2, which is why they are keeping M1-based models around for the low end. The combination of minor wafer price increases, larger dies (from 118.91mm2 to 155.25mm2), and more expensive memory hurts a lot.

The last IP block, which we did not explicitly measure, is the much larger media engine that enables the enhanced media capabilities. Apple's M series is by far the best silicon for creative professionals. There is no question here: if you work in the Adobe suite, M-series silicon is the best. If you liked this work, consider subscribing to the free or paid newsletter. Also, please head over to Locuza's Twitter, Patreon, and YouTube to support his work.
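A few of the article's headline numbers can be cross-checked with simple arithmetic. A quick sketch; the 16-billion M1 transistor count comes from Apple's original M1 announcement, not from this article:

```python
# Back-of-the-envelope checks on the cited die and memory figures.
m1_area, m2_area = 118.91, 155.25          # mm^2 (M2 as scaled by Locuza)
m1_xtors, m2_xtors = 16e9, 20e9            # transistors (M1 figure assumed)

print(f"die area growth: {m2_area / m1_area - 1:.1%}")   # ~30.6%
print(f"density: M1 {m1_xtors / m1_area / 1e6:.0f} vs "
      f"M2 {m2_xtors / m2_area / 1e6:.0f} MTr/mm^2")     # ~135 vs ~129

# Peak DRAM bandwidth = transfer rate x bus width, both on a 128-bit bus.
for name, mts in (("M1 LPDDR4X-4266", 4266), ("M2 LPDDR5-6400", 6400)):
    print(f"{name}: {mts * 128 / 8 / 1000:.1f} GB/s")    # 68.3 vs 102.4

print(f"GPU clock uplift: {1.406 / 1.27 - 1:.1%}")       # ~10.7%
```

The slightly lower computed density for M2 is consistent with the article's point that extra IO and analog area dilutes density, and the roughly 50% memory bandwidth jump is where much of the real-world GPU gain comes from.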
1,127
BAD
Apple M2 Pro to use new 3nm process (cultofmac.com) By Ed Hardy, 9:04 am, August 21, 2022. Multiple sources say Apple's upcoming M2 Pro chip will be made with an enhanced manufacturing process that wasn't yet available for the recently released M2. That will make Apple's next processor more efficient than the one in the new MacBook Air. Even better, the M2 Pro could launch in updated MacBook Pro models before the end of 2022.

Apple designs Mac processors, but TSMC produces them. And the Taiwanese chipmaker has generally been able to shrink the components of Apple's chips every year or two, allowing them to work faster while generating less heat. That didn't happen with the M2, though, so the new chip was made with the same 5nm process as the M1 series from 2020. But Taiwan's Commercial Times reports that the M2 Pro will be made with a 3nm process. That will make the chip more energy-efficient. And the processor is expected to go into production before the end of 2022. A separate source said essentially the same in June.

And TSMC won't stop with M2 Pro. "Application processors as well as M2- and M3-series processors will be imported into TSMC's 3nm process," reports the Commercial Times, according to a machine translation. That apparently includes the M2 Max and M2 Ultra.

But nothing is guaranteed. Analyst Ming-Chi Kuo said August 22 via Twitter, "Given TSMC's guidance that the 3nm will contribute revenue starting in 1H23, processors of 14 and 16 MacBook Pro models may still adopt the 5nm advanced node."

There are a range of Apple Mac processors for a variety of devices, of course. The basic M2 went into the consumer-oriented 2022 MacBook Air. The M2 Pro and Max are for more powerful models. 2021's M1 Pro and Max are scaled-up versions of the original M1 with more CPU and GPU cores. They are the heart of the 14- and 16-inch MacBook Pro, top-tier macOS notebooks for professionals. And they're reportedly about to get better. Upgraded versions with the M2 Pro and Max are coming as early as autumn 2022. They will be nearly the same models first released in 2021, but with the newer processors. And if the chips are made with TSMC's 3nm process, they will be more energy-efficient than if they'd been made with the original M2's 5nm one.

Even better, the late-2022 MacBook Pros are just the start. Apple is supposedly testing a Mac Pro running the M2 Ultra. Plus, the A17 chip going into the 2023 iPhone 15 will reportedly be made with the 3nm process.

The article was updated August 22 with the comment from Ming-Chi Kuo of TF International Securities.
1,128
BAD
Apple Mac Studio (apple.com) Mac Studio is an entirely new Mac desktop. It packs outrageous performance, extensive connectivity, and new capabilities into an unbelievably compact form, putting everything you need within easy reach and transforming any space into a studio. And it all starts with your choice of the ferociously fast M1 Max or the all-new M1 Ultra, the most powerful chip ever in a personal computer.

Apple silicon has transformed our Mac lineup, bringing unbelievable performance and power efficiency with its breakthrough system on a chip (SoC) architecture. Combining the CPU, GPU, I/O, Neural Engine, and more into a single SoC with unified memory, this architecture means all Mac computers are way faster, cooler, and more power efficient. To complete the M1 family, we took another giant leap with Apple silicon to bring even more extreme levels of performance for the desktop. The challenge is that there are physical limits to creating a larger die than M1 Max. Enter UltraFusion.

The incredible M1 Ultra chip actually starts with M1 Max. From its inception, M1 Max has held the ability to connect to another M1 Max die using a custom-built packaging architecture called UltraFusion. With twice the connection density of any technology available, UltraFusion provides a massive 2.5TB/s of low-latency interprocessor bandwidth between the two dies, using very little power. The final member of the M1 family, M1 Ultra, joins two M1 Max dies, yet it looks like a single piece of silicon to software, so apps benefit from its extraordinary capabilities without requiring any additional work from developers. The result is the most powerful chip ever in a personal computer.

M1 Max: up to 64GB unified memory, up to 400GB/s memory bandwidth, 10-core CPU, up to 32-core GPU, 16-core Neural Engine.
M1 Ultra: up to 128GB unified memory, up to 800GB/s memory bandwidth, 20-core CPU, up to 64-core GPU, 32-core Neural Engine.

M1 Max CPU performance vs. power: M1 Max delivers similar multithreaded CPU performance to the latest 10-core PC desktop chip while using 65 percent less power.1
M1 Ultra CPU performance vs. power: Compared with the fastest 16-core PC desktop chip, M1 Ultra delivers up to 90 percent higher performance in the same power envelope. And M1 Ultra can deliver the PC chip's peak performance while using 100 watts less power.1
M1 Max GPU performance vs. power: At its peak, M1 Max delivers similar performance to one of the most popular discrete GPUs while using one-third the power.2
M1 Ultra GPU performance vs. power: At its peak, M1 Ultra delivers faster performance than the highest-end GPU available while using 200 watts less power.2

M1 Max brings power to tackle challenges of almost any size. Whether you're running multiple apps, sorting and editing thousands of photos, recording and mixing professional-quality music, or discovering a new exoplanet, the screaming-fast M1 Max has your back. Built from two M1 Max chips, M1 Ultra lets you power through workflows on an unprecedented scale. So you can run complex particle simulations or work with massive 3D environments that were previously impossible to render. And with twice the media engine resources, M1 Ultra can support up to 18 streams of 8K ProRes 422 video playback, something no other personal computer can do. The incredible power and efficiency of Apple silicon has allowed us to completely reimagine the high-performance desktop.
The design inspiration for Mac Studio was simple: to let M1 Max or M1 Ultra and your creativity run completely unrestrained within the smallest possible footprint. It fits perfectly under most displays, putting prodigious power and connectivity at arm's reach, right on your desk. Within the 7.7-inch-square Mac Studio enclosure lies a groundbreaking thermal system designed to let M1 Max or M1 Ultra fly through heavy workloads at screaming-fast speeds and stay whisper quiet.

Mac Studio lets you create the studio of your dreams with an array of 12 high-performance ports located front and back for convenient access. Easily connect and disconnect your camera, external storage, and other devices from the front. See the bigger picture by plugging in up to four Pro Display XDRs and one 4K TV. There's also a headphone jack with advanced support for high-impedance headphones, or line out for amplified speakers.

We designed Mac Studio together with the new Studio Display for the ultimate studio setup. With its breathtaking 27-inch 5K Retina display, 12MP Ultra Wide camera with Center Stage, studio-quality three-mic array, and six-speaker sound system with Spatial Audio, Studio Display completes the full Mac experience.

Transform your studio. With its strikingly compact form factor and abundance of ports, Mac Studio lets you reimagine your workspace and unleash your creativity. Connect up to four Pro Display XDRs and one 4K TV. Magic accessories are wireless, rechargeable, precise, comfortable, and now come in a silver-and-black option to complete your setup.

macOS Ventura was designed to scale with Apple silicon, so the system automatically benefits from the increased graphics, massive memory, and powerful machine learning of M1 Max or M1 Ultra. With over 15,000 optimized apps and plug-ins, and features like Universal Control, Shortcuts, and Focus, you can work, play, and create in ways you never imagined.

Mac Studio was engineered to minimize its environmental impact, from how it's built to how it's packaged and shipped, even how it's used. Thanks to the efficiency of Apple silicon, Mac Studio raises the bar for desktops in its class by requiring far less energy than its competitors. Over the course of a year, Mac Studio will use up to 1,000 kilowatt-hours less energy than a high-end PC desktop. It uses 100 percent recycled rare earth elements in its magnets and recycled tin in the solder of the main logic board. All virgin wood fiber in its packaging comes from responsibly managed forests.19
1,129
BAD
Apple Makes Major Progress on No-Prick Blood Glucose Tracking for Its Watch (bloomberg.com)
1,130
BAD
Apple Maps location scan spikes WiFi latency every 60 seconds (twitter.com/benskuhn)
1,131
BAD
Apple Music Classical (applemusic.apple) Find it on the App Store on iPhone. Apple Music Classical elevates your listening experience like never before. Enjoy the world's largest classical music catalog, a powerful search designed specifically for its nuances, and the highest audio quality available, up to 192kHz/24-bit Hi-Res Lossless.1 All included with your Apple Music subscription.2

Two extraordinary apps. Just one subscription. Apple Music Classical is included in your Apple Music subscription at no additional cost. If you've already joined Apple Music, simply download the Apple Music Classical app to start listening. If you're not a subscriber yet, join today and start exploring the world of Apple Music Classical.

The world's largest classical music catalog. Over 5 million tracks. Discover the very best recordings of Beethoven's Sixth Symphony. Or outstanding performances from the world's most renowned orchestras. Just about anything you'd ever want to hear, all of it just a few taps away.

Easily search by work, composer, conductor, and more. Apple Music Classical features a search created specifically for the nuances of the music. You can find the exact recording you want, whether you search by composer, opus number, or key. When you do find what you're looking for, the app's unique interface ensures you'll always see essential information from each work, including complete movement titles and a full list of contributing artists.

Experience the highest audio quality. With classical music, every nuance matters, so Apple Music Classical offers the highest audio quality available, up to 192kHz/24-bit Hi-Res Lossless. Plus thousands of recordings available in immersive Spatial Audio featuring Dolby Atmos.3

Benefit from expert recommendations. Apple Music Classical is curated by classical music experts who are passionate about helping both beginner and seasoned music lovers discover great works. Users can view definitive recordings through Editor's Choice selections and enjoy exclusive playlists covering key composers, instruments, or periods.

Browse endlessly and learn as you go. The Browse tab helps you discover more about classical music, guiding you through composers, periods, genres, instruments, and more. Explore artist and composer bios, or insightful guides to works, all written by expert writers and reviewers of classical music.

In case you missed anything. Everyone. Beginners can begin their exploration with amazing content from Apple Music Classical editors, including The Story of Classical audio companion guide and Essentials playlists for classical subgenres, composers, instruments, and more. Classical experts can use Apple Music Classical's powerful search to find any desired recording with just a few keywords.

Classical music often involves multiple musicians recording works that have been recorded many times before and are referred to by different names. For example, from the formal Beethoven's Piano Sonata No. 14 to the popular byname of Moonlight Sonata, or in multiple languages, such as Mondschein Sonata in German. Such complexities mean that classical music fans have been ill-served by streaming platforms. Until now. A distinct app included with an Apple Music subscription gives these classical music lovers the editorial and catalog content they've been missing.
Only a brand-new app with specialized features and a beautiful interface designed for the genre could remove the complexity and make classical music easily searchable, browsable, and accessible for beginners and experts alike.

Start by checking that Apple Music is installed on your smartphone and that you have an active Apple Music subscription. Then simply download and install the Apple Music Classical app on the same device, and you're all set.

No. Apple Music Classical is completely focused on classical music.

Yes. Apple Music Classical is the new way to explore and enjoy classical music. But Apple Music will also offer the world's largest classical audio catalog, along with hundreds of handpicked classical playlists.

Yes. Your classical collection is synced between the two apps because they share the same music library. So any classical tracks, albums, and playlists saved in Apple Music will also be available in Apple Music Classical, and vice versa.

Apple Music Classical was built exclusively for mobile and is available on iOS, with Android coming soon. Classical fans who want to listen on their MacBook, iPad, or in their car can open Apple Music to enjoy the tracks, albums, and playlists they saved in Apple Music Classical, thanks to the shared music library.

There's no separate subscription for Apple Music Classical. It's included at no extra cost with all Apple Music subscriptions except the Apple Music Voice Plan. So one subscription gives listeners access to two extraordinary music apps featuring more than 100 million tracks spanning every genre.

Available worldwide wherever Apple Music is offered, excluding China, Japan, Korea, Russia, Taiwan, and Turkey.
1,132
BAD
Apple Music Sing (apple.com) December 6, 2022. PRESS RELEASE. Apple introduces Apple Music Sing. Apple Music expands its world-class lyrics experience with a new feature for fans to easily sing along to tens of millions of songs.

CUPERTINO, CALIFORNIA: Apple today announced Apple Music Sing, an exciting new feature that allows users to sing along to their favorite songs with adjustable vocals1 and real-time lyrics. Apple Music Sing offers multiple lyric views to help fans take the lead, perform duets, sing backup, and more, all integrated within Apple Music's unparalleled lyrics experience. Coupled with an ever-expanding catalog that features tens of millions of the world's most singable songs, Apple Music Sing makes it fun and easy for anyone to participate, however and wherever they choose. Apple Music Sing will be available later this month to Apple Music subscribers worldwide and can be enjoyed on iPhone, iPad, and the new Apple TV 4K.

"Apple Music's lyrics experience is consistently one of the most popular features on our service," said Oliver Schusser, Apple's vice president of Apple Music and Beats. "We already know our users all over the world love to follow along to their favorite songs, so we wanted to evolve this offering even further to enable even more engagement around music through singing. It's really a lot of fun, and our customers are going to love it."

Apple Music Sing includes: Apple Music will also be launching a suite of more than 50 dedicated companion playlists featuring all of the epic songs, duets, choruses, and anthems that have been compelling people all around the world to sing, fully optimized for the Apple Music Sing experience.

Press Contacts: Jessica Bass, Apple, jessica_bass@apple.com. Apple Media Helpline, media.help@apple.com.

Apple revolutionized personal technology with the introduction of the Macintosh in 1984. Today Apple leads the world in innovation with iPhone, iPad, Mac, Apple Watch, and Apple TV. Apple's five software platforms (iOS, iPadOS, macOS, watchOS, and tvOS) provide seamless experiences across all Apple devices and empower people with breakthrough services including the App Store, Apple Music, Apple Pay, and iCloud. Apple's more than 100,000 employees are dedicated to making the best products on earth and to leaving the world better than we found it.

Apple loves music. Apple revolutionized the music experience with iPod and iTunes. Today, the award-winning Apple Music celebrates musicians, songwriters, producers, and fans with a catalog of over 100 million songs, expertly curated playlists, and the best artist interviews, conversations, and global premieres with Apple Music Radio. With original content from the most respected and beloved people in music, autoplay, time-synced lyrics, lossless audio, and immersive sound powered by Spatial Audio with Dolby Atmos, Apple Music offers the world's best listening experience, helping listeners discover new music and enjoy their favorites while empowering the global artist community. Apple Music is available in over 165 countries and regions on iPhone, iPad, iPod touch, Mac, Apple Watch, Apple TV, HomePod mini, CarPlay, and online at music.apple.com, plus popular smart speakers, smart TVs, and Android and Windows devices. Apple Music is ad-free and never shares consumer data with third parties. More information is available at apple.com/apple-music.
1,134
BAD
Apple Music is the most buggy and annoying software I use (nephics.se) There is no other software that I use and pay for that can compare to Apple Music when it comes to severe bugs and annoyances. The worst of the Apple Music clients is the one on macOS. Here is my list of the top 10 most annoying things about Apple Music on macOS:

Adding a currently playing song to a playlist will stop the music and clear the play queue.

Hitting play will start the music at some sound level; after about a second or two, the sound level is suddenly reduced (and stays at that level until hitting pause and play again).

The UI is really lacking. Browsing for music to play is insanely slow.

Pausing the playing music, leaving the computer, and coming back some time later, say a day, then hitting play will not always resume the music. Often the song/playlist just disappears (after hitting play), and all you get is silence and a blank play queue.

Many of Apple's apps have the ability to hand over between devices: e.g., when reading a web page on my phone, I get a Safari icon marked with a phone in the dock on my Mac, and clicking the icon allows me to see the same web page on my Mac. That is nice, but it also makes it even more annoying that Apple Music can't hand over between devices.

Why can't I see the playlist of upcoming songs when playing a station? (What is a station anyway, and why can't I bookmark/save my stations like a playlist!?)

Why can't I click the name of an artist or album in a playlist?

Clicking the dots menu for a song in a playlist, I can choose to: View the album in the library, Show in Apple Music, Show in iTunes Store. What is even all this? How about just letting me see the album? I really don't care in which of Apple's differently branded places the album is!

If I choose to like a song, why is it not automatically saved to the so-called library?

Clicking the little album cover will transform the window into a miniplayer, but how to restore the main window? The (very non-obvious) answer is: click the close window button! (There are lots of other weird or inconsistent UI choices like this.)

Literally WTF Apple!!1!? Why is the Apple Music software so bad? Here are some suggestions for Apple to improve Apple Music on macOS:

Jun 04 2022
1,133
BAD
Apple Passkey (developer.apple.com)
1,136
BAD
Apple Publishes First-Ever App Store Transparency Report (macrumors.com) Apple today published an inaugural App Store Transparency Report, something that the company agreed to provide to developers as part of a 2021 lawsuit settlement. Under the terms of the settlement, Apple promised to provide developers with meaningful statistics about the app review process, including the number of apps that were rejected, the number of customer and developer accounts deactivated, the number of apps removed from the App Store, data regarding search queries and results, and more. All of the information that developers requested is in the 2022 App Store Transparency Report and the supplemental data that accompanies it, with the report available for download from Apple's legal site.

In 2022 there were 1,783,232 apps on the App Store, with 6,101,913 total app submissions received and 1,679,694 apps rejected for various reasons like safety, performance, design, and legal. Apple provides numbers on the specific App Store guidelines that were violated by rejected apps, with the highest number of single-rule rejections due to violations of the Design 4.0 rule (149,378) and the DPLA 3.2 Fraud rule (32,009). A total of 253,466 app submissions were approved after rejection when developers worked with Apple to resolve issues, and 186,195 apps were removed from the App Store for breaking the App Store rules. The majority of apps removed from the App Store were games, followed by Utilities, Business, and Education.

Apple outlines the total number of apps removed from the App Store due to government takedowns, and China is at the top of the list. The Chinese government asked Apple to remove 1,435 apps, but 1,276 of those apps were games that were removed for not having the GRN license that China requires. Apple removed 14 apps at the request of India's government, 10 apps for Pakistan, and seven apps for Russia. In other countries, including Türkiye, Bulgaria, Cyprus, Hong Kong, Italy, Latvia, and Nigeria, fewer than two apps were removed at the government's request. Developers appealed 18,412 app removals in total, and Apple restored just 616. Apple says that apps that are appealed were typically pulled from the App Store for fraud or illegality, which is why the rejected appeal number is so high.

There are 36,974,015 registered developers, and in 2022 Apple terminated 428,487 developer accounts. According to Apple, developers are removed from the Apple Developer Program for a number of reasons, but most commonly because of accounts that are connected with other terminated developer accounts. 3,338 developers appealed their App Store bans, and Apple reinstated just 159 accounts. Again, Apple says that this is because most developer account terminations that are appealed stem from fraud, so Apple rejects most of the appeals.

282,036,628 customer accounts were terminated, but that number does incorporate all accounts created, even those made on the website by non-iPhone and iPad users. There were 656,739,889 average weekly visitors to the App Store and 747,873,877 average weekly app downloads. Customer accounts searched the App Store 373,211,396 times on average, and 1,399,741 apps appeared in the top 10 results of at least 1,000 searches. Additional breakdowns can be found in Apple's supplemental data. Apple plans to provide these App Store Transparency Reports to developers on an annual basis going forward.
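A quick bit of arithmetic puts the report's headline figures in proportion; the sketch below uses only the numbers quoted above, and the derived percentages are inferences, not figures from the report itself.

```swift
// Headline figures from the 2022 App Store Transparency Report, as quoted above.
let submissions = 6_101_913.0
let rejections  = 1_679_694.0
let approvedAfterRejection = 253_466.0

let rejectionRate  = rejections / submissions              // ≈ 0.275, about 27.5% of submissions
let recoveredShare = approvedAfterRejection / rejections   // ≈ 0.151, about 15% of rejections later approved

// Developer ban appeals tell a similar story: 3,338 appeals, 159 reinstated.
let banAppealSuccess = 159.0 / 3_338.0                     // ≈ 0.048, about 4.8%
```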
1,141
BAD
Apple Rankings (applerankings.com) This list is 100% accurate. Copyright 2023 Apple Rankings by The Appleist Brian Frange | https://www.brianfrange.com
1,142
BAD
Apple Reports Fourth Quarter Results (apple.com) Consolidated Financial Statements. This press release contains forward-looking statements within the meaning of the Private Securities Litigation Reform Act of 1995. These forward-looking statements include, without limitation, those about the payment of the Company's quarterly dividend, its environmental initiatives, and its plans for new features, products, and long-term growth. These statements involve risks and uncertainties, and actual results may differ materially from any future results expressed or implied by the forward-looking statements. Risks and uncertainties include, without limitation: effects of global and regional economic conditions, including as a result of government policies, war, terrorism, natural disasters, and public health issues; risks relating to the design, manufacture, introduction, and transition of products and services in highly competitive and rapidly changing markets, including from reliance on third parties for components, technology, manufacturing, applications, and content; risks relating to information technology system failures, network disruptions, and failure to protect, loss of, or unauthorized access to, or release of, data; and effects of unfavorable legal proceedings, government investigations, and complex and changing laws and regulations. More information on these risks and other potential factors that could affect the Company's business, reputation, results of operations, financial condition, and stock price is included in the Company's filings with the SEC, including in the Risk Factors and Management's Discussion and Analysis of Financial Condition and Results of Operations sections of the Company's most recently filed periodic reports on Form 10-K and Form 10-Q and subsequent filings. The Company assumes no obligation to update any forward-looking statements or information, which speak as of their respective dates.

Apple revolutionized personal technology with the introduction of the Macintosh in 1984. Today Apple leads the world in innovation with iPhone, iPad, Mac, Apple Watch, and Apple TV. Apple's five software platforms (iOS, iPadOS, macOS, watchOS, and tvOS) provide seamless experiences across all Apple devices and empower people with breakthrough services including the App Store, Apple Music, Apple Pay, and iCloud. Apple's more than 100,000 employees are dedicated to making the best products on earth and to leaving the world better than we found it.

Apple Media Helpline, media.help@apple.com. 2022 Apple Inc. All rights reserved. Apple and the Apple logo are trademarks of Apple. Other company and product names may be trademarks of their respective owners.
1,144
BAD
Apple Restricts Employee Use of ChatGPT Joining Other Companies Wary of Leaks (wsj.com)
1,145
BAD
Apple Shipped Me a 79-Pound iPhone Repair Kit to Fix a 1.1-Ounce Battery (theverge.com) By Sean Hollister, a senior editor and founding editor of The Verge who covers gadgets, games, and tech toys. He's spent 15 years editing the likes of CNET, Gizmodo, and Engadget.

Apple must be joking. That's how I felt again and again as I jumped through hoop after ridiculous hoop to replace the battery in my iPhone Mini. Part of that was the repair process; mostly it was how difficult Apple makes it to even get there.

Last month, Apple launched its Self-Service Repair program, letting US customers fix broken screens, batteries, and cameras on the latest iPhones using Apple's own parts and tools for the first time ever. I couldn't wait. I'd never successfully repaired a phone, and my wife has never let me live down the one time I broke her Samsung Galaxy while using a hair dryer to replace the screen. This time, armed with an official repair manual and genuine parts, I'd make it right.

That Apple would even let me buy those parts, much less read its manuals and rent its tools, is a major change of pace for the company. For years, Apple has been lobbying to suppress right-to-repair policies around the country, with the company accused of doing everything it can to keep customers from repairing their own phones. It's easy to see this as a huge moment for DIY advocates. But having tried the repair process, I actually can't recommend it at all, and I have a sneaking suspicion that Apple likes it that way.

The thing you should understand about Apple's home repair process is that it's a far cry from traditional DIY, if you opt for the kit, which I did once I saw the repair manual only contains instructions for Apple's own tools. (You can just buy a battery if you want.) I expected Apple would send me a small box of screwdrivers, spudgers, and pliers; I own a mini iPhone, after all. Instead, I found two giant Pelican cases, 79 pounds of tools, on my front porch. I couldn't believe just how big and heavy they were, considering Apple's paying to ship them both ways. I lugged those cases onto a BART train to San Francisco and dragged them down the streets to our office. Then I set everything out on a table and got started.

Step one of opening an iPhone is basically using a hefty machine to suck the screen off the top. Here, I wasn't microwaving a jelly-filled sock to loosen the Apple goop holding my screen down! Apple lets you rent an industrial-grade heat station that looks like a piece of lab equipment, right down to the big red safety dial you twist to release the emergency-off button and the suction-cup-tipped mechanical lifting arm. It looks pretty cool. I slip my phone in a perfectly sized heating pocket that clamps a ring of copper around the iPhone's band to evenly distribute the heat and melt the seal around the screen, realize in horror that I've invited the "Hot Pockets!" jingle to live in my head rent-free, then spin a dial to raise the arm that separates the iPhone's screen from its body.

Or that's how it's supposed to work, anyhow. The heating machine threw an error code partway through my first attempt, and Apple's manual didn't explain what to do if that happens after you've stuck your phone inside. So I wound up heating it twice in a row. And yet that still wasn't quite enough for my screen to immediately pop up when the suction cup arm began to lift the glass.
The manual did cover that situation, making me spin a second hidden knob to put more pressure on the suction cup, but I started freaking out when I saw what looked like cracks spider across the screen. (It turned out it was just suction cup residue.) Once the screen was loose, I cut through the softened glue holding it to the iPhone's frame with Apple's single tiny adhesive cutter, which also gave me a little trouble. The blade got caught when I wedged it under the corners of the screen, and I had to yank it out without accidentally sending my phone clattering to the ground. The kit comes with a perfect-fit tray to hold your phone steady and extra suction cups to hold the screen without stretching the fragile ribbon cables, but nothing to hold the tray itself.

Apple also provides a set of fancy torque drivers to make sure you don't screw down the phone's tiny screws too tightly, but it's a bit of a chore. I must have dropped Apple's incredibly tiny fasteners a dozen times while removing the slivers of metal that hold the screen's ribbon cables in place, as well as the bottom speaker that Apple makes you yank to get at the battery. Presumably just to make it more difficult to repair, Apple requires three different screwdriver bits just to remove the screen, and none of Apple's bits are magnetized to keep the screws from slipping.

At this point, there was still a bunch of goopy adhesive around the sides of my iPhone's frame. While the instructions suggest it'll just peel off in a few big pieces if you pull with tweezers (which didn't come in the box), I gave up after 10 minutes of picking away at tiny fuzzy blobs of glue. I was just going to be adding more adhesive anyway, after all. Later, I discovered this was not my best idea.

When it finally lay open on the table, I couldn't help but gawk in wonder at my iPhone 13 Mini's precisely packed guts, and I realized I was having fun! Slicing open my phone was a thrill. But a lot of that thrill came from not knowing whether my phone would survive surgery, Apple tools or no.

From there, it was time to swap out the battery. Once I finished cursing at the far-too-easy-to-tear tabs that held in the original lithium pack, I used Apple's fancy battery press with a rolling arm to seat, but not squish, the new battery down. But I could have done that with my fingertips; I'd have much rather had a tool to properly align the battery, which I had to yank and reposition after plopping it down a millimeter too far south, or a tool to test whether you've properly reseated the battery and display connectors. But I'll get to that.

Next, the instructions had me apply an actually helpful precut adhesive sheet designed to stick my screen back to the frame, which was easy to slot into exactly the right place and press down with my fingers. Then came a huge spring-loaded press (with a veritable slot machine of an arm) to close the phone once more. But even with the press, my screen wasn't perfectly flush with the frame afterwards, perhaps due to the extra glue I didn't manage to remove.

With my phone closed up again, I held down the power button. Nothing. No bright white Apple logo, no response at all. For one horrible moment, I realized Apple gave me no way to test whether the battery and display connectors were actually seated (they probably weren't) and had me close up the phone anyhow. Then a forlorn hope: maybe the replacement battery shipped empty? I scrambled around the office for a Lightning cable, and my iPhone 13 Mini finally lit up. But I wasn't done yet.
The single most frustrating part of this process, after using Apple's genuine parts and Apple's genuine tools, was that my iPhone didn't recognize the genuine battery as genuine. "Unknown Part," flashed a warning. Apparently that's the case for almost all of these parts: you're expected to dial up Apple's third-party logistics company after the repair so they can validate the part for you. That's a process that involves having an entirely separate computer and a Wi-Fi connection, since you have to reboot your iPhone into diagnostics mode and give the company remote control. Which, of course, defeats a bunch of the reasons you'd repair your own device at home! And if I'm telling you the truth, the second most frustrating part didn't occur during the repair either. If it were just me, I'd have aborted the entire process before Apple ever shipped 79 pounds of equipment to my home.

It would be an understatement to say that Apple has a history of resisting right-to-repair efforts. Before the iPhone, replacing a battery was typically as easy as inserting a thumbnail to pop off your phone's back cover; afterwards, phones largely became tricky to even open without specialized tools, which arguably pushed customers to replace their perfectly good devices when they might have only needed a new screen or battery. Also see: batterygate. In recent years, the company has actively lobbied against right-to-repair legislation in at least 20 states, sneakily pushing California, as one example, to postpone its bill. (The bill died in committee again this very week.) Apple cracked down on unauthorized repairs by throwing warnings or even disabling features if you repair phones with non-genuine parts, though it walked some of that back after an outcry. And it put together a contract for indie repair shops that was reportedly so invasive many refused to sign it.

So it didn't surprise me when Apple's press release about the program warned the vast majority of customers away from their own repairs, or when I needed to enter my phone's IMEI to prove I owned my phone, or how I had to enter a six-digit code to prove I read the repair manual, which not only suggests you need three pages' worth of tools but also a jar of sand in case your battery catches fire, one of many not strictly necessary items that don't come with the kit. Apple also only includes instructions on how to use its own special tools for repairs, so you're on your own if you want to try a more low-key or inexpensive DIY approach. Yeah, none of that surprised me. What surprised me was the price tag.

Let's be clear: this is a ridiculous amount of risk for the average person who just wants to put a new battery in their phone. And it's frankly weird for Apple to insist on you covering the full value of the tools. "It's not like when you rent a car they make you put down $20,000 as a safety deposit," my colleague Mitchell Clark points out. I should also mention the Pelican cases landed at my door two days before the battery arrived, so I only had five days to do the job before that $1,200 deadline.

The more I think about it, the more I realize Apple's Self-Service Repair program is the perfect way to make it look like the company supports right-to-repair policies without actually encouraging them at all. Apple can say it's giving consumers access to everything, even the same tools its technicians use, while scaring them away with high prices, complexity, and the risk of losing a $1,200 deposit.
This way, Apple gets credit for walking you through an 80-page repair instead of building phones where, say, you don't need to remove the phone's most delicate components and two different types of security screws just to replace a battery.

To me, those giant Pelican cases are the proof. It would cost Apple a fortune to ship 79 pounds of equipment to individual homes all over the country, even with corporate discounts. The Verge is obviously far, far smaller than Apple, but it would cost us upwards of $200 just to return those cases to their sender. Yet Apple offers free shipping both directions with your $49 rental, plus a dedicated support team to validate your parts and facilitate returns. (Though apparently it doesn't do the latter anywhere near its Silicon Valley HQ: when I took the support team up on its offer of picking up my battery, they told me they didn't have a driver within 250 miles of my location and I should just drop it off at the nearest Home Depot.)

I don't think Apple expects anyone to seriously take it up on the offer of self-service repair kits. It stacked the deck in favor of taking your phone to an Apple Store, where it can tempt you to buy something new instead. The real victory will come months or years down the road, though. That's when Apple can tell legislators it tried to give right-to-repair advocates what they wanted, but that consumers overwhelmingly decided Apple knows best.
1,149
BAD
Apple TV prompt requires another Apple device (twitter.com/hugelgupf)
1,155
BAD
Apple Unveils M2 (apple.com) June 6, 2022. PRESS RELEASE. Apple unveils M2, taking the breakthrough performance and capabilities of M1 even further. M2 delivers new levels of power-efficient performance and capabilities to the completely redesigned MacBook Air and updated 13-inch MacBook Pro.

CUPERTINO, CALIFORNIA: Apple today announced M2, beginning the next generation of Apple silicon designed specifically for the Mac. Built using second-generation 5-nanometer technology, M2 takes the industry-leading performance per watt of M1 even further with an 18 percent faster CPU, a 35 percent more powerful GPU, and a 40 percent faster Neural Engine.1 It also delivers 50 percent more memory bandwidth compared to M1, and up to 24GB of fast unified memory. M2 brings all of this, plus new custom technologies and greater efficiency, to the completely redesigned MacBook Air and updated 13-inch MacBook Pro.

"M2 starts the second generation of M-series chips and goes beyond the remarkable features of M1," said Johny Srouji, Apple's senior vice president of Hardware Technologies. "With our relentless focus on power-efficient performance, M2 delivers a faster CPU, GPU, and Neural Engine. And along with higher memory bandwidth and new capabilities like ProRes acceleration, M2 continues the tremendous pace of innovation in Apple silicon for the Mac."

More Transistors, More Memory. The system-on-a-chip (SoC) design of M2 is built using enhanced second-generation 5-nanometer technology and consists of 20 billion transistors, 25 percent more than M1. The additional transistors improve features across the entire chip, including the memory controller that delivers 100GB/s of unified memory bandwidth, 50 percent more than M1. And with up to 24GB of fast unified memory, M2 can handle even larger and more complex workloads.

Faster, Power-Efficient Performance. The new CPU features faster performance cores paired with a larger cache, while the efficiency cores have been significantly enhanced for even greater performance gains. Together they deliver 18 percent greater multithreaded performance than M1, so M2 can rip through CPU-intensive tasks using very little power, like creating music with layers of effects or applying complex filters to photos.1 Compared with the latest 10-core PC laptop chip, the CPU in M2 provides nearly twice the performance at the same power level. And M2 delivers the peak performance of the PC chip while using just a quarter of the power.2 When compared to the latest 12-core PC laptop chip, which needs dramatically more power to deliver an increase in performance and is therefore found in thicker, hotter, noisier systems with less battery life, M2 provides nearly 90 percent of the peak performance of the 12-core chip while using just one-fourth the power.3

M2 also features Apple's next-generation GPU with up to 10 cores, two more than M1. Combined with a larger cache and higher memory bandwidth, the 10-core GPU delivers a big boost in graphics performance, bringing up to 25 percent higher graphics performance than M1 at the same power level, and up to 35 percent better performance at its max power.1 Compared with the integrated graphics of the latest PC laptop chip, the GPU in M2 delivers 2.3x faster performance at the same power level, and matches its peak performance using a fifth of the power.2 The higher performance per watt from M2 enables systems to have exceptional battery life and run cool and quietly, even when playing graphics-intensive games or editing massive RAW images.
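The relative claims in the release imply absolute M1 baselines, which are easy to back out. The sketch below does only that arithmetic; the derived figures are inferences from the stated percentages, not numbers from the release itself.

```swift
// Back out the implied M1 baselines from M2's stated gen-on-gen gains.
let m2Transistors = 20.0e9
let impliedM1Transistors = m2Transistors / 1.25    // "25 percent more" implies ~16 billion on M1

let m2BandwidthGBs = 100.0
let impliedM1BandwidthGBs = m2BandwidthGBs / 1.5   // "50 percent more" implies ~66.7 GB/s on M1
```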
Apple's Next-Generation Custom Technologies. M2 brings Apple's latest custom technologies to the Mac, enabling new capabilities, better security, and more.

macOS, M2, and Apps. macOS has been designed for Apple silicon, and the combination of macOS Monterey and the powerful new M2 delivers breakthrough performance and productivity for users. Mac computers powered with Apple silicon have access to the largest collection of apps ever for Mac, including iPhone and iPad apps that can run on Mac, and universal apps that unlock the full power of M-series chips. macOS Ventura, coming this fall, will also take full advantage of M2, bringing new features including Stage Manager and powerful new capabilities with Continuity Camera and Handoff in FaceTime. macOS Ventura also includes big updates to Safari, Mail, Messages, Spotlight, and more.

Apple Silicon and the Environment. The power-efficient performance of M2 helps the new MacBook Air and 13-inch MacBook Pro meet Apple's high standards for energy efficiency. Today, Apple is carbon neutral for global corporate operations, and by 2030 plans to have net-zero climate impact across the entire business, which includes manufacturing supply chains and all product life cycles. This means that every chip Apple creates, from design to manufacturing, will be 100 percent carbon neutral.

About Apple. Apple revolutionized personal technology with the introduction of the Macintosh in 1984. Today Apple leads the world in innovation with iPhone, iPad, Mac, Apple Watch, and Apple TV. Apple's five software platforms (iOS, iPadOS, macOS, watchOS, and tvOS) provide seamless experiences across all Apple devices and empower people with breakthrough services including the App Store, Apple Music, Apple Pay, and iCloud. Apple's more than 100,000 employees are dedicated to making the best products on earth and to leaving the world better than we found it.

Press Contacts: Apple Media Helpline, media.help@apple.com
1,157
BAD
Apple acquires UK open banking startup Credit Kudos (ffnews.com)
1,084
BAD
Apple announces full Swift rewrite of the Foundation framework (2022) (infoq.com) Dec 14, 2022, 2 min read, by Sergio De Simone.

The Swift team has started to work on a new open-source implementation of the Foundation framework. To be written entirely in Swift, the new Foundation aims to improve performance by getting rid of conversion costs between Objective-C and Swift, as well as to provide the opportunity to modularize and clean it up.

With a native Swift implementation of Foundation, the framework no longer pays conversion costs between C and Swift, resulting in faster performance. As Apple engineer Tony Parker explained on the Swift forum, the performance benefit can be significant: "A reimplementation of Calendar in Swift is 1.5x to 18x as fast as the C one (calling from Swift in various synthetic benchmarks like creation date calculation)."

The Foundation framework is a cornerstone of most macOS and iOS apps, providing a large number of ubiquitous abstractions, including NSObject, NSString, NSArray, NSDictionary, and many more. Originally written in Objective-C and largely drawing on its dynamic properties, Foundation can currently be used in Swift apps that run on Apple OSes thanks to Swift's interoperability with the Objective-C runtime. This is not an option, though, for other platforms where Swift is supported but only the original GNUstep Objective-C runtime is available, such as Linux. To circumvent this limitation, when Apple open-sourced Swift, they built swift-corelibs-foundation, a Swift wrapper around Core Foundation, a lower-level C API lying at the heart of Foundation and providing toll-free bridged abstractions mapping to Foundation, including CFString, CFDictionary, and so on.
As mentioned, the Swift rewrite will be the occasion for rearchitecting the Foundation framework. At the moment, five distinct packages are devised, including FoundationEssentials, FoundationInternationalization, FoundationNetworking, FoundationXML, and FoundationObjCCompatibility. This will allow developers to bring in only the dependencies they actually need for a specific app. For example, a developer starting from scratch and wanting to cut all ties to the Darwin Foundation layer could skip the Obj-C compatibility layer altogether.

With the decision to rewrite Foundation, Apple is swapping the ordering between the C, Objective-C, and Swift layers, and making the Swift layer the underpinning of Foundation across all platforms. This will bring the opportunity to remove a number of features that are now provided directly by the Swift language and standard library, such as NSCoding and NSKeyedArchiver, replaced by Codable, or Lock, OperationQueue, and other concurrency primitives that are replaced by Swift structured concurrency. Those features will be provided by a separate package, available on the Darwin platform, to ensure compatibility with existing code.

The announcement prompted many comments on the Swift Forums, most of them hinting at the convenience of giving several existing APIs a rethink, as well as of ensuring Swift types such as Codable or structured concurrency are made powerful enough to fully replace the Obj-C types that are slated for removal from the core Foundation layer. The Foundation rewrite is foreseeably a medium-term effort for which no schedule has been announced yet, and Apple is inviting the Swift community to join the discussion to define its future.
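To make the NSKeyedArchiver-to-Codable migration mentioned above concrete, here is a minimal sketch. The Bookmark type is a made-up example; Codable, JSONEncoder, and JSONDecoder are the standard Swift APIs the article refers to.

```swift
import Foundation

// A value type that, pre-Swift, might have been an NSObject adopting NSCoding
// and archived through NSKeyedArchiver. With Codable the compiler synthesizes
// the encoding and decoding logic automatically.
struct Bookmark: Codable {
    let title: String
    let url: URL
}

let bookmark = Bookmark(title: "Swift Forums",
                        url: URL(string: "https://forums.swift.org")!)

do {
    // Encode to Data (shown here as JSON; a property-list encoder works the same way).
    let data = try JSONEncoder().encode(bookmark)
    // Decode back into a strongly typed value.
    let restored = try JSONDecoder().decode(Bookmark.self, from: data)
    print(restored.title)
} catch {
    print("Round-trip failed: \(error)")
}
```

Because Codable lives in the Swift standard library and the new Swift-native Foundation, the same code runs unchanged on Linux and other non-Darwin platforms, which is one of the motivations the article describes.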
1,087
BAD
Apple announces upgrade to App Store pricing adding 700 new price points (9to5mac.com) Apple is making good on one of its recent promises to developers. Starting today, App Store developers now have over 700 different new price points from which to choose for their apps. This brings the total number of price point options to 900, with options ranging from as low as $0.29 all the way up to $10,000.

Apple first said it would add new App Store price point options last year as part of its settlement of a class-action lawsuit from US developers. This is the same settlement that allowed developers to communicate with users about alternative payment solutions and included the Small Developer Assistance Fund.

With today's update, all developers have the ability to select from 900 different price points, which is nearly 10 times the number of price points previously available for most apps. There are 600 new price points available to choose from, while an additional 100 higher price points are available upon request. The new pricing options increase incrementally across price ranges. For instance, prices increase by $0.10 up to $10, then every $0.50 between $10 and $50, and so on. The price points start as low as $0.29 and range all the way up to $10,000. The 100 highest price points, however, are only available upon request as part of Apple's efforts to reduce the potential for scams.
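The increment scheme is easy to sketch. The code below generates the two ranges the article spells out ($0.10 steps up to $10, $0.50 steps from $10 to $50); the exact endpoints, the $0.29 starting floor for the first range, and everything above $50 are assumptions, since the article only says the steps continue "and so on".

```swift
// Rough sketch of tiered price-point generation per the described increments.
// Work in integer cents to avoid floating-point drift.
var pricePointsInCents: [Int] = []

// $0.10 steps up to $10 (starting from the $0.29 floor; an assumption).
pricePointsInCents.append(contentsOf: stride(from: 29, through: 999, by: 10))

// $0.50 steps between $10 and $50.
pricePointsInCents.append(contentsOf: stride(from: 1_049, through: 4_999, by: 50))

let pricePoints = pricePointsInCents.map { Double($0) / 100.0 }
// pricePoints.first == 0.29, and the tier granularity coarsens past $10,
// mirroring how the article says increments grow across price ranges.
```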
1,088
BAD
Apple asks suppliers to shift AirPods, Beats production to India (nikkei.com) iPhone maker taps South Asian nation again in push to diversify from China. TAIPEI -- Apple is asking suppliers to move some AirPods and Beats headphone production to India for the first time, a win for the South Asian nation as it attempts to rise in the global supply chain. The move is part of Apple's gradual diversification from China as it looks to lower the risk of supply chain disruptions stemming from the country's strict zero-COVID policy and tensions with the U.S.
1,089
BAD
Apple avoids job cuts because it didn't overhire like Google and Amazon (bloomberg.com)
1,091
BAD
Apple becomes first tech giant to explicitly ban caste discrimination (indiatoday.in) By Abhik Sengupta: Most tech giants have elaborate anti-discriminatory work policies, but unlike some topics -- say, race -- they do not explicitly call out caste-based discrimination. Now Apple has become one of the first tech giants to explicitly call out and ban caste discrimination in the company. Given that the idea of caste, which is well-known to people in India, might not be familiar to managers and employees in the US, Apple has also started training on the topic so that its workers understand the new policies better. According to Reuters, Apple updated its general employee conduct policy to prohibit caste-based discrimination nearly two years ago, but it was unreported so far. The new policy sits alongside existing codes that strictly prohibit discrimination based on race, religion, gender, age, and ancestry. The report notes that the update came in June 2020, when California's employment regulator sued Cisco Systems on behalf of a low-caste engineer who accused two higher-caste seniors of blocking his career. The incident was viewed as the first US employment lawsuit about alleged casteism, and it forced major tech players to confront a reality that is seemingly spreading outside Indian borders. Apple reportedly confirmed updating the policy to the publication. It said the company updated language a couple of years ago to reinforce that it prohibits discrimination or harassment based on caste, adding that training provided to staff also explicitly mentions caste. The Cupertino-based tech giant also said, "We have a diverse and global team and are proud that our policies and actions reflect that." Apart from Apple, IBM also seemingly updated its policy to include anti-caste rules. Reuters reports that, for now, IBM is only training its managers on the topic of caste. Other big tech players, such as Amazon, Dell, Facebook owner Meta, Microsoft, and Google, do not specifically and explicitly reference caste in their main global policy. In recent years, the topic of caste and alleged caste discrimination among tech workers who have Indian heritage has led to much talk in Silicon Valley. Just a few months ago, in June, Google, whose CEO Sundar Pichai has Indian roots, cancelled a talk on caste discrimination where Dalit rights activist Thenmozhi Soundararajan was supposed to give a presentation. The talk was organised by Google employee Tanuja Gupta. After the news of the cancellation came out, a report noted that Google cancelled the talk because it was leading to rancour inside the company. In the following days, Tanuja was put under investigation, allegedly for violating the company's code of conduct. Soon after, she left Google. Recently, talking about the incident to the New Yorker, Tanuja said, "A number of e-mails got sent to my VP, to the head of HR, to our chief diversity officer, to our CEO directly, claiming that the talk was creating a hostile workplace, that people felt unsafe, that the speaker was not qualified to speak on the topic, and several other allegations." She also claimed that in tech companies, including Google, discrimination based on caste was rife, and that tech companies needed to talk about it.
1,092
GOOD
Apple blocks Coinbase Wallet (twitter.com/coinbasewallet)
1,094
BAD
Apple bought out all of Newsweek's 1984 Election issue ad space for Mac (aresluna.org) Apple bought all of the advertising space in the November/December 1984 special election issue of Newsweek and devoted it all to Macintosh: 39 pages of advertising in total, shown on the original page alongside Newsweek's front cover.
1,096
BAD
Apple brings Final Cut Pro and Logic Pro to iPad (apple.com) May 9, 2023 PRESS RELEASE Apple brings Final Cut Pro and Logic Pro to iPad. Pro apps introduce powerful features designed for iPad to deliver the ultimate mobile studio for video and music creators. CUPERTINO, CALIFORNIA Apple today unveiled Final Cut Pro and Logic Pro for iPad. Video and music creators can now unleash their creativity in new ways that are only possible on iPad. Final Cut Pro and Logic Pro for iPad bring all-new touch interfaces that allow users to enhance their workflows with the immediacy and intuitiveness of Multi-Touch. Final Cut Pro for iPad introduces a powerful set of tools for video creators to record, edit, finish, and share, all from one portable device. Logic Pro for iPad puts the power of professional music creation in the hands of the creator, no matter where they are, with a complete collection of sophisticated tools for songwriting, beat making, recording, editing, and mixing. Final Cut Pro and Logic Pro for iPad will be available on the App Store as subscriptions starting Tuesday, May 23. "We're excited to introduce Final Cut Pro and Logic Pro for iPad, allowing creators to unleash their creativity in new ways and in even more places," said Bob Borchers, Apple's vice president of Worldwide Product Marketing. "With a powerful set of intuitive tools designed for the portability, performance, and touch-first interface of iPad, Final Cut Pro and Logic Pro deliver the ultimate mobile studio." Final Cut Pro for iPad: Final Cut Pro for iPad introduces an all-new touch interface and intuitive tools, unlocking new workflows for video creators. A new jog wheel makes the editing process easier than ever and enables users to interact with content in completely new ways. They can navigate the Magnetic Timeline, move clips, and make fast, frame-accurate edits with just the tap of a finger, and with the immediacy and intuitiveness of Multi-Touch gestures, push their creativity to new heights. With Live Drawing, users can draw and write directly on top of video content using Apple Pencil. On iPad Pro with M2, Apple Pencil hover unlocks the ability for users to quickly skim and preview footage without ever touching the screen. 1 They can also speed up their workflows by adding a Magic Keyboard or Smart Keyboard Folio to utilize key commands. 2 Creators can view and edit stunning HDR video, taking advantage of the Liquid Retina XDR display on the 12.9-inch iPad Pro, and apply color grades with accuracy using Reference Mode. Pro Camera Mode and Multicam Video Editing: Pro camera mode brings even greater control to the creation process on iPad. Video creators can shoot high-quality video in portrait or landscape orientation, monitor audio and available recording time, and manually control settings like focus, exposure, and white balance. Creators can capture, edit, and publish from a single device while in the field, and on iPad Pro with M2, users can even record in ProRes. 3 With multicam video editing, clips can automatically be synchronized and edited together, and users can even switch angles in a multicam clip with just the touch of a finger. Fast Cut Features with Machine Learning: Final Cut Pro for iPad harnesses the power of Apple silicon and machine learning to speed up time-consuming editing tasks. With Scene Removal Mask, creators can quickly remove or replace the background behind a subject in a clip without using a green screen. Auto Crop adjusts footage for vertical, square, and other aspect ratios, and with Voice Isolation, background noise can easily be removed from audio captured in the field. Pro Graphics, Effects, and Audio: Video creators can choose from a vast library of professional graphics, effects, and audio to enhance their storytelling. These include stunning HDR backgrounds, customizable animated patterns, and professional soundtracks that automatically adjust to the length of a video. Import and Export: Editors can import supported media from Files or Photos and save it directly inside a Final Cut Pro project. Final Cut Pro for iPad also supports the ability to import projects created in iMovie for iOS, 4 and iPad users can export their Final Cut Pro projects to Mac. 5 Logic Pro for iPad: Logic Pro for iPad combines the power of Logic Pro with the portability of iPad to unlock an all-in-one professional music creation app. With Multi-Touch gestures, music creators can play software instruments and interact naturally with controls, as well as navigate complex projects with pinch-to-zoom and swipe-to-scroll. Plug-in Tiles put the most useful controls at the creator's fingertips, making it easy to quickly shape sounds. With the built-in mics on iPad, users can capture voice or instrument recordings, and with five studio-quality mics on iPad Pro, users can turn virtually any space into a recording studio. Creators can also make precision edits and draw detailed track automation with Apple Pencil, and connect a Smart Keyboard Folio or Magic Keyboard to utilize key commands that speed up production. All-New Sound Browser: An all-new sound browser uses dynamic filtering to help music creators discover the perfect sound whenever inspiration strikes. The sound browser displays all available instrument patches, audio patches, plug-in presets, samples, and loops in a single location, and users can tap to audition any sound before loading it into a project, to save time and stay in their creative flow. Professional Instruments and Effects Plug-Ins: Creators can shape the sonic qualities of their music with over 100 powerful instruments and effects plug-ins in Logic Pro. Effects like vintage EQs, compressors, and reverbs let users tweak and fine-tune their tracks. With Multi-Touch, creators can play instruments using a variety of play surfaces. Logic Pro on iPad comes with a massive collection of realistic-sounding instruments and powerful synths, including Sample Alchemy, a new sample manipulation instrument that can transform any audio sample with the tap of a finger. Beat Making and Production Tools: Producers can chop and flip samples, program beats and bass lines, and craft custom drum kits with a set of beat making and production tools. Beat Breaker, a new time- and pitch-morphing plug-in, lets music creators swipe and pinch to radically reshape and shuffle sounds. Quick Sampler can chop and transform audio samples into entirely new playable instruments, and Step Sequencer lets users program drum patterns, bass lines, and melodies, and even automate plug-ins, with just a few taps. Drum Machine Designer enables the creation of custom drum kits by applying samples and unique plug-ins to any drum pad. With Live Loops, users can capture inspiration and quickly build arrangements by mixing and matching musical loops. Pro Mixer: A full-featured mixer, complete with channel strips, volume faders, pan controls, plug-ins, sends, and precise automation, gives users everything they need to create a professional mix entirely on iPad. Multi-Touch enables creators to mix intuitively and move multiple faders at once, and the mixer meter bridge lets them quickly navigate an overview of track levels, all from iPad. Import and Export: Logic Pro for iPad supports roundtrip capabilities, making it easy to move projects between Logic Pro for Mac and iPad. 6 iPad users can export their finished songs in a variety of compressed and lossless audio formats, as well as individual audio track stems. Music creators can make a soundtrack in Logic Pro for iPad and export it into Final Cut Pro for iPad, providing incredible flexibility for working across music and video. Logic Pro for iPad also supports the ability to open projects created in GarageBand for iOS, so users can take their music to the next level with pro features and workflows. 7 Pricing and Availability: Starting Tuesday, May 23, Final Cut Pro and Logic Pro for iPad will each be available on the App Store for $4.99 (US) per month or $49 (US) per year, with a one-month free trial. 8 Final Cut Pro is compatible with M1 chip iPad models or later, and Logic Pro will be available on A12 Bionic chip iPad models or later. Final Cut Pro for iPad and Logic Pro for iPad require iPadOS 16.4. For more information, visit apple.com/final-cut-pro-for-ipad and apple.com/logic-pro-for-ipad.
1,097
BAD
Apple changed how reading books works in iOS 16 (theverge.com) By Mitchell Clark. Apple Books has been my main reading app for years for one very specific reason: its page-turning animation is far and away the best in the business. Unfortunately, that went away with iOS 16, replaced by a new animation that makes it feel like you're moving cards through a deck instead of leafing through a digitized version of paper. And despite the fact that I've been trying to get used to the change since I got onto the beta in July, I still feel like Apple's destroyed one of the last ways that my phone brought joy into my life. For those unfamiliar with Apple's Books app (formerly known as iBooks), I'll try to explain the hole that's suddenly been punched into my reading life. Before iOS 16, the app would play a page-turning animation whenever you tapped or swiped on the left or right edge of your device. It wasn't just a cheap pre-baked animation, though; it was one of the pinnacles of the skeuomorphic aesthetic that used to rule Apple's mobile OS. The animation is different based on whether you swipe from the top, middle, or bottom of the screen, and it tracks your finger; if you swipe from the bottom and then move up, the page will curl upwards instead of flipping to the side. If you start to swipe, then change your mind and move your finger back to the edge, the page falls back down unturned. As far as I can tell, that experience is completely gone in iOS 16, replaced by an animation that wouldn't feel out of place in a Tinder rip-off or a PDF-viewing utility app. I've searched through every screen in Books and Settings that I can think of and haven't found any way to get the old flipping animation back. The only option I've found to change the page-turning experience is the one that eliminates it completely by turning the book into a single vertically scrolling page, which I somehow find even more offensive than the new animation (though, to be clear, that was also in the old version of the app). Now, I'm not going to sit here and critique why all the other reading apps I've used fall short of Apple's version in its glory days, how they either don't bother including a page flip animation or don't capture the nuances of shadowing and how a real page reacts to your touch. Not because I don't want to, of course, but because I feel like it's better to just show you a series of GIFs so you can see for yourself. [Gallery: three comparison GIFs of page-turning animations] E-reader fans might say that I should be doing my reading on a dedicated device that's not as subject to ever-changing software, and I admittedly have found that a physical page turn button scratches the same itch that Apple's animation used to, even if the transitions on e-ink displays are usually pretty eh. But even if I were to buy a Kobo or Boox or something, that wouldn't help me with the dozens of books I've already purchased on Apple's platform. I'm sure there are plenty of people thinking that this is an absolutely ridiculous thing to complain about; I can just feel people typing out "wow, slow news day?" comments. And yeah, I'll fully admit that this may be a slightly petty article about a very small thing that probably won't matter to very many other people. But it genuinely was a feature that made me choose to buy e-books on Apple's platform instead of anyone else's, and given how same-y most book stores and reading apps are in the broad strokes, it really is the details that get you locked into an ecosystem. The iOS 16 version of the Books app does have some genuinely good upgrades, but it's no longer a joy to use; and to me, that's a real shame.
1,099
BAD
Apple considering dropping requirement for iPhone web browsers to use WebKit (macrumors.com) As part of a larger story about Apple's plans to allow third-party app stores on the iPhone and iPad in EU countries, Bloomberg's Mark Gurman claimed that Apple is also considering removing its requirement for iPhone and iPad web browsers to use WebKit, the open source browser engine that powers Safari. Gurman said this potential change comes in response to the EU's Digital Markets Act. It's unclear if Apple would drop the requirement in other regions. "Currently, third-party web browsers, including ones like Chrome from Alphabet Inc.'s Google, are required to use WebKit, Apple's Safari browsing engine," wrote Gurman. "Under the plan to meet the new law, Apple is considering removing that mandate." Gurman did not provide any additional details or a potential timeframe for Apple dropping this requirement, but the move would allow web browsers and in-app browsers on the iPhone and iPad to use alternative browser engines, like Google's Blink (used by Chrome and Microsoft Edge) and Mozilla's Quantum (used by Firefox). As a result, these browsers would no longer be tied to WebKit's features and development schedule. Chrome, Edge, Firefox, and other popular web browsers like Opera and DuckDuckGo are all currently forced to use WebKit on iOS, resulting in limited differentiation from Safari. Apple has previously stated that the WebKit requirement is enforced primarily due to security and privacy considerations, but some find the policy anti-competitive. Earlier this year, a group of software engineers from the UK and other countries said Apple's banning of non-WebKit browsers on iOS is "deeply anti-competitive" and encouraged the company to remove this limitation. The group also complained that Safari has several features unavailable to third-party web browsers on iOS that do use WebKit. Gurman's report also said that Apple is not currently considering adopting the RCS standard alongside iMessage, and said that the company is working to provide third-party apps with limited access to the iPhone's NFC chip, but he did not elaborate.
1,100
BAD
Apple dives into display-making to cut reliance on Samsung (nikkei.com) iPhone maker makes billion-dollar bet on micro-LED screens. TAIPEI -- Apple is getting involved in the mass production of next-generation displays to lessen its reliance on rival Samsung and increase its own control over the supply of a key component, Nikkei Asia has learned. Taking a hands-on approach to production is a stark contrast to the iPhone maker's usual approach of providing display makers with screen specifications and leaving the actual production to them.
1,104
BAD
Apple doesn't want you developing hobby apps (bennettnotes.com) Dave Bennett, February 2, 2023. Apple still charges a $99 yearly developer fee even if you don't want to publish your app on the App Store! One of my hobbies consists of building apps. Oftentimes these apps are focused on a task that's relevant only to me. For example, two years ago I built an Android app that connected to my custom-built soil sensor to send myself a notification when my plant was dry. This year I started learning about BLE using the Nordic NRF52 microcontroller. I wanted to make a companion app on my iPhone to provide me with some sensor readings. This is all for education and personal use, and not meant to be used by anyone else but myself. To my disappointment, one week after uploading the app to my iPhone, it no longer opened! Apparently the provisioning certificate that Xcode provides only lasts one week! I would need to rebuild the app through Xcode and upload it to my iPhone every week to continue to use it. Or I could pay the $99 yearly Apple Developer Fee, which will provide me the ability to use my app for up to one year on my device. Wait, why am I paying for an app that's not being distributed? Don't get me wrong, I have no problem with this idea when it comes to apps that I plan to distribute. I'm using their servers and their infrastructure to handle updates, reviews, payments, etc. But for apps that just live on my device and nowhere else? This seems pretty greedy. After all, I already paid $2.5K for the MacBook and $1.1K for the iPhone. What more do they want? Now, some comments on Reddit try to justify this by saying: "You not only get access to Xcode, you also get access to UIKit and SwiftUI. In addition you get access to all of Apple's other Frameworks/SDKs." So basically I pay a yearly fee to get access to their SDK? Most manufacturers provide that to you for free because they really want you to build apps for their platform. App Store guidelines are confusing, too. Don't even try publishing your niche hobby application to the App Store, as it might get rejected due to rule 4.2: "If your app is not particularly useful, unique, or app-like, it doesn't belong on the App Store." Useful to whom? Just myself, or useful to a general audience? So if the app is only useful to myself, then let me run it on my own device indefinitely without paying a fee to do it! All of this just smells of greed. But then again, I guess this is why Apple is a trillion dollar company. I should be thankful they don't make me pay a yearly developer fee for making Python apps on my MacBook.
1,105
BAD
Apple expands Self Service Repair to Mac notebooks (apple.com) August 22, 2022 UPDATE Apple expands Self Service Repair to Mac notebooks. Genuine Apple parts and tools can be purchased by US customers on August 23. Apple announced Self Service Repair will be available tomorrow for MacBook Air and MacBook Pro notebooks with the M1 family of chips, providing repair manuals and genuine Apple parts and tools through the Apple Self Service Repair Store. Self Service Repair for iPhone launched earlier this year, and the program will expand to additional countries, beginning in Europe, as well as additional Mac models, later this year. Self Service Repair for MacBook Air and MacBook Pro offers more than a dozen different repair types for each model, including the display, top case with battery, and trackpad, with more to come. Customers who are experienced with the complexities of repairing electronic devices will be able to complete repairs on these Mac notebooks, with access to many of the same parts and tools available to Apple Store locations and Apple Authorized Service Providers. To start the Self Service Repair process, a customer will first review the repair manual for the product they want to repair by visiting support.apple.com/self-service-repair. Then they can visit the Apple Self Service Repair Store and order the necessary parts and tools. Every genuine Apple part is designed and engineered for each product and goes through extensive testing to ensure the highest quality, safety, and reliability. Customers can send replaced parts back to Apple for refurbishment and recycling, and in many cases receive credit toward their purchase by doing so. The Apple tools are custom designed to help provide the best repairs for Apple products and are engineered to withstand the rigors of high-volume, professional repair operations, where safety and reliability are the utmost priority. Apple will offer rental kits for $49, so that customers who do not want to purchase tools for a single repair still have access to these professional repair tools. Customers will have access to the tool kit for one week, and it will be shipped free of charge. Self Service Repair is part of Apple's efforts to further expand access to repairs. For the vast majority of customers who do not have experience repairing electronic devices, visiting a professional repair provider with certified technicians who use genuine Apple parts is the safest and most reliable way to get a repair. Over the past three years, Apple has nearly doubled the number of service locations with access to genuine Apple parts, tools, and training, including more than 3,500 Independent Repair Providers. A global network of more than 5,000 Apple Authorized Service Providers supports more than 100,000 active technicians. As a result, in the US, eight out of 10 Apple customers are located within 20 minutes of an authorized service provider.
1,107
BAD
Apple gets a cut of search revenue from Chrome as part of secret Google deal (9to5mac.com) A new report from The Register today claims that Google is paying Apple a portion of search revenue that comes from Google Chrome on iOS. "This perhaps explains why Apple has not launched a rival search engine or invested in the development of its Safari browser to the point that it could become a credible challenger to Chrome," the report speculates. Update: This arrangement was also reported by Bloomberg back in 2020: "Apple also gets a slice of revenue from searches made through some of Google's own apps, such as Chrome, installed on iPhones, iPads and Macs, according to a person familiar with the arrangement who asked not to be named discussing private business agreements." One of the largest components of Apple's Services revenue is the multibillion-dollar payment it receives from Google every year in exchange for Google being the default search engine on iPhone, iPad, and Mac. While this deal by itself has received regulatory scrutiny, today's report from The Register claims that the financial relationship between Google and Apple includes another key component. The report cites a source familiar with the matter and says that Google is paying Apple a portion of search revenue generated by people using Google Chrome on iOS. It's not explicitly clear how much Google is paying Apple as part of this arrangement. The Register says that this revenue-sharing arrangement is part of what's being investigated by the Competition and Markets Authority in the UK. As part of its evidence, the report points to some curious redactions in the 365-page report published by the CMA last year (emphasis ours): "Google pays Apple a share of the search revenue it earns from browser traffic on iOS in the following contexts: in return for being the default search provider on Safari, Google pays Apple a share of revenue derived from Safari search traffic; and pursuant to various commercial arrangements, Google pays Apple a share of revenue derived from (X) search traffic. Under these agreements, Apple receives a significant share of revenue from Google Search traffic on Safari and (X) on iOS devices. Given this revenue share, when (X) or Safari is successful in competing for an iOS user, rather than winning a full share of the search traffic revenue, it only wins a partial share (ie, the revenue to which it was not previously entitled). These revenue sharing arrangements therefore dampen incentives for competition between browsers on iOS." The report also points to a December 2021 antitrust lawsuit filed in California and updated in March 2022, which alleges that Google "paid billions of dollars to Apple and agreed to share its profits with Apple to eliminate the threat and fear of Apple as a competitor." When asked about these redactions, the CMA reportedly told The Register that it cannot comment on or disclose any confidential information. As for Google and Apple, the report says neither corporation would talk when asked to confirm or deny the report. If true, this would be a major revelation regarding the relationship between Apple and Google, and something that would almost undoubtedly face immediate antitrust pressure, even beyond the CMA investigation in the UK. If the arrangement in which Google pays Apple to be the default search engine is controversial, this revenue-sharing agreement would be even more controversial.
1,108
BAD
Apple halted M2 chip production in January amid 'plummeting' Mac sales (macrumors.com) Apple halted production of its M2 series chips at the start of 2023 following a pronounced slump in global demand for MacBooks, claims a new report out of Korea. Facing plummeting Mac sales amid a severe PC market downturn, Apple in January completely suspended production of its custom-designed M2 series processors that power the new MacBook Pro and Mac mini models and the latest MacBook Air, according to The Elec: "Taiwan's TSMC did not send 5 nano-process M2 chip wafer workpieces to the outsourced semiconductor package test (OSAT) in January and February, according to the OSAT industry on the 3rd. It is believed that it was because Apple requested to stop production as demand for MacBooks dwindled." As the report notes, TSMC finalizes its fabrication process by sending the wafer-processed products to Amkor Technology's Korean factory for packaging work. However, in January and February there were reportedly no products forthcoming, which also forced the package material parts company for M2 chips to stop supplying the usual materials. The suspension is said to have continued through February, after which production of M2 series chips resumed, but at only half the level of the previous year, said the Korean-language report, quoting a person familiar with the matter. Apple's revenue was down approximately 5% year-over-year in the first quarter, a steeper decline than had been expected by analysts, as Apple grappled with iPhone supply issues in particular but also saw declines in Mac and wearables. iPad revenue was up 30% thanks to the launch of new M2 iPad models, but Mac revenue saw a notable drop because there were no new Macs released in the final months of 2022. Mac revenue was $7.7 billion, down from $10.9 billion in the year-ago quarter. During his Q1 2023 earnings call, CEO Tim Cook admitted that Apple faced a challenging situation in the PC market. "The industry is contracting," said Cook. "We have a low share, but we have a competitive advantage with Apple silicon, so strategically we are well positioned within the market. But I think it will be a little rough in the short term," added the Apple chief. Apple unveiled the M2 chip in June 2022, when it appeared in an updated 13-inch MacBook Pro and a new MacBook Air. In January 2023, Apple added M2 Pro and M2 Max variants to its 14-inch and 16-inch MacBook Pro models, and also added M2 and M2 Pro chips to its Mac mini configuration options. Apple's first custom processor, the M1, was announced in late 2020 and still powers Apple's 24-inch iMac, which was launched in April 2021. Almost a year later, the Mac Studio was unveiled with Max and Ultra versions of the same chip. An updated version of the iMac with the M2 chip is seemingly missing from Apple's plans, according to Bloomberg's Mark Gurman, and there is no word on if or when a refreshed version of the Mac Studio with M2 Max and M2 Ultra chips will launch.
1,112
BAD
Apple has a bug in its disk I/O throttling code that affects raw disk copies (mjacobson.net) Summary: Apple has a bug in its disk I/O throttling code that affects raw disk copies. I walk through my diagnosis. This week I was restoring a slow-as-molasses 2.5-inch HDD to an SSD as part of an upgrade for my mom's work machine. For whatever reason, the USB SATA controller I was using seems to limit me to 15 MB/s and 4 kB transfers. Not ideal when I'm transferring 500 gigs, but oh well. I can throw it in the background and come back to it later. So, after triple-checking for any particularly day-ruining typos, I fired off the dd copy and went off to do other things on my machine. Over the next few minutes, I started noticing my machine behaving strangely. Apps became sluggish; eventually some practically ground to a halt. Some apps showed the spinning pinwheel wait cursor, while others stopped updating while remaining responsive to clicks. Some apps displayed nonsensical error alerts. Other apps were ostensibly unaffected and continued working fine. At one point (and very much to my frustration) even Terminal stopped responding. I started by checking the obvious stuff. Memory usage was normal, and dd was correctly only using a one-meg buffer. CPU utilization was high, but not outrageously so. dd was reading from one external disk and writing to another, so there wasn't any contention for the main disk. It was time to break out spindump to see where things were wedged. spindump is a super-versatile whole-system callstack profiling tool that comes with macOS; it's what macOS uses to generate those "You forced Safari to quit" diagnostic reports. Unlike sample, it can profile more than one process, and it captures the kernel backtrace too. I profiled various sluggish-feeling operations. It was easy to spot a pattern: lots of threads blocked in throttle_lowpri_io, specifically threads whose progress was required (directly or indirectly) for the app to make progress. Here's one particularly reproducible example: FaceTime hangs on launch as its main thread tries to call an Objective-C +initialize method; at the same time, a secondary thread holding the +initialize lock is stuck in throttle_lowpri_io. In some cases, the trail of thread dependencies crosses process boundaries; spindump usually does a good job of telling you what thread to look at next. Messages hanging on launch shows the same mess, with the chain of dependencies reaching into other processes. On Darwin, threads block inside throttle_lowpri_io when they're being artificially delayed to slow down their I/O operations, with the ultimate goal of optimizing the performance of higher-priority I/O. And indeed, in both of these cases (and in the other similar problems I saw) the chain of blockage ultimately leads to a thread with less-than-highest I/O priority. Once you start looking, it's not hard to find dozens of examples of this just through normal usage of different apps. [1] Why so many UI commands on macOS are blocking on low-priority threads is less a technical question than one of Apple's institutional priorities. (A question that I have lots to say about, but not here.) But I was interested in a different question. Ordinarily, I/O operations on one disk device don't throttle those on another device. This makes sense, because multiple devices can operate independently without contention. Yet in this case, all of the throttled I/O operations were to the main disk (containing the root and user data volumes), while my dd was copying from one external disk to another. There was no possibility of contention.
So why was throttling still happening? To keep track of which I/Os should be throttled, the Darwin kernel maintains what I'll call throttling domains (the source calls them struct _throttle_io_info_t). In rough terms, each throttling domain is meant to correspond one-to-one to a disk device. When an I/O is issued through the spec_strategy routine, the kernel has to determine which throttling domain the operation lives in, so that the operation may either be throttled or cause throttling of lower-priority operations. The throttling domain is determined first by taking the vnode (i.e. file) the I/O is being done to and walking up to its enclosing mount_t. From there, the code looks at the mount's mnt_devbsdunit property. The mnt_devbsdunit describes the disk number of the device the filesystem lives on. If a filesystem is mounted from /dev/disk3, then the mount's mnt_devbsdunit is 3. If the backing disk is actually a partition of a disk, then the number comes from the whole disk, not the partition; e.g. /dev/disk3s2 results in 3. [2] The mnt_devbsdunit, which can range from 0 to 63, [3] determines which throttling domain is in play. I find it useful to back up a theory with an example. One good way here is to instrument the kernel with (the recently neglected) dtrace. The following dtrace script triggers on spec_strategy calls and logs the vnode, the mount point, and the computed mnt_devbsdunit.
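The original script isn't preserved in this capture; a sketch of the approach, assuming the private XNU struct layouts (b_vp on struct buf, v_name and v_mount on struct vnode, mnt_vfsstat and mnt_devbsdunit on struct mount) are visible to dtrace and that System Integrity Protection is relaxed enough to allow fbt probes, could look like this:

    #!/usr/sbin/dtrace -s
    /* Sketch only: field names follow XNU's private headers and may
     * vary by release; fbt probes are normally blocked by SIP. */
    #pragma D option quiet

    fbt::spec_strategy:entry
    {
        this->bp = ((struct vnop_strategy_args *)arg0)->a_bp;
        this->vp = this->bp->b_vp;
        this->mp = this->vp->v_mount;

        /* Log the file, its mountpoint, and the throttling-domain ordinal. */
        printf("file=%s mount=%s unit=%d\n",
            this->vp->v_name != NULL ? stringof(this->vp->v_name) : "?",
            stringof(this->mp->mnt_vfsstat.f_mntonname),
            this->mp->mnt_devbsdunit);
    }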
The script's output showed, for example, a process doing I/O to read or write a file named mom.img, which lives in the mountpoint /Volumes/ExternalHDD. mount(8) reports that mountpoint as mounted from /dev/disk3s2, and the 3 in disk3s2 matches the unit=3 in the dtrace output, as we expect. But a second example is more puzzling: for /System/Volumes/Data, mounted from /dev/disk1s1, the 1 in disk1s1 does not match the unit=0 from dtrace. Why? Apple added a logical volume manager called CoreStorage to Mac OS X Lion. In contrast to traditional disk partitions, in which a contiguous range of a disk device is used as a volume, CoreStorage allows a looser relationship between volumes and backing storage. For instance, a volume might use storage from multiple different disk devices; witness Fusion Drive, for example. This complicates the mnt_devbsdunit situation. Suppose a filesystem is mounted from volume disk2. According to the previous rules, mnt_devbsdunit is 2. However, disk2 might be a CoreStorage logical volume backed by the real disk devices disk0 and disk1. Moreover, CoreStorage might not be the only user of disk0 and disk1. Suppose further a second, non-CoreStorage volume on disk0, called disk0s3. I/Os to disk2 and disk0s3 may contend with each other. But the mnt_devbsdunit of disk0s3 is 0, so the two mounts will be in different throttling domains. To solve this, enter a second mount_t field: mnt_throttle_mask. mnt_throttle_mask is a 64-bit bit array. A bit is set only when I/Os to the mount may involve the correspondingly numbered disk device. For our CoreStorage logical volume disk2, since disk0 and disk1 are included, bits 0 and 1 are set. Bit 2 is also set for the logical volume itself, so the overall mask is 0x7. In theory, you might imagine a system wherein a mount could reside in multiple throttling domains. Or perhaps the throttling domain decision could be pushed down, so that CoreStorage could help make smart decisions about which domain to use for a particular I/O operation. The implemented reality is much more mundane: mnt_devbsdunit is set to the index of the lowest bit set in mnt_throttle_mask. For disk2, since bit 0 is set, mnt_devbsdunit is 0. So disk2 and disk0s3 live in the same throttling domain (though, notably, a theoretical disk1s3 would not). This explains what's happening with /System/Volumes/Data above: disk1s1 is a logical volume presented by a volume manager, [4] and its backing storage is on disk0. Tweaking the dtrace script confirms that its mnt_throttle_mask is 0x3. Popping the stack back to the original problem, it's now interesting to look at the I/Os done by dd through the lens of the dtrace script. Notice first that spec_strategy is being asked to do an I/O on the file /dev/disk2, not a regular file as before. Though unusual, this makes sense: no filesystem is actually mounted from the disk2 device, and dd is explicitly attempting to read the disk2 special file itself. As a side effect, the mount point is deduced to be /dev/. Again, this is unusual, but it seems to be the least unreasonable option: there is nothing mounted from disk2 itself, and /dev/ is the mount point that holds the disk2 special file. This is where things go weird, though. dtrace reports the mnt_throttle_mask of /dev/ to be 0x3f. In other words, /dev/ claims to be similar to a logical volume made up of exactly these disks: disk0, disk1, disk2, disk3, disk4, and disk5. Never mind that there are no disk4 or disk5 attached to my system; what on earth would this even mean? /dev/ is the mount point of a synthetic filesystem (devfs) of special files. Sure, some of those special files do indeed correspond to disk devices. But others correspond to non-disk devices. Still others, like /dev/null, are completely fabricated by software. And perhaps most curiously, why does the list stop at disk5? This question is perhaps best answered by imagining what a reasonable value of mnt_devbsdunit would be for devfs. In an ideal world, perhaps each vnode in devfs might be assigned a throttling domain independently, such that /dev/disk0 lived in the 0 domain, /dev/disk1 in the 1 domain, etc. Unfortunately, the reality of the design allows us to assign only a single mnt_devbsdunit for all of devfs. So a reasonable, if far from ideal, solution is to assign 63, a value that will put devfs in its own throttling domain as long as fewer than 64 disk devices are attached. Assigning 63 is in fact what the code did prior to Lion: mounts created backed by a device vnode are assigned the BSD unit value of that vnode, and all others, like devfs, are assigned a backstop value of 63. Unfortunately, when mnt_throttle_mask was introduced in Lion, the backstop value was changed. This seems wrong! First, remember that mnt_throttle_mask is a 64-bit bit array, whereas mnt_devbsdunit is an ordinal. There are various backstop values that might make sense for mnt_throttle_mask: 0 (all bits cleared) or ~0 (all bits set) are two obvious candidates. But LOWPRI_MAX_NUM_DEV - 1, in other words 63 or 0x3f, is not one of them, and it's pretty clear that the value was incorrectly copied from the old fallback initialization of mnt_devbsdunit. Second, and more importantly, the backstop value of mnt_devbsdunit switched from 63 to 0. 0 is indeed the correct value corresponding to a mask of 0x3f, but such a change puts all of devfs in the same throttling domain as disk0. To preserve the old behavior, I'd suggest these backstop values instead: 1ULL << 63 for the mask and, correspondingly, the old value of 63 for mnt_devbsdunit.
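To make the bit arithmetic concrete, here is a small user-space C program (my own illustration, not kernel code) that applies the lowest-set-bit rule to the masks discussed above:

    #include <stdio.h>
    #include <stdint.h>
    #include <strings.h> /* ffsll() */

    #define LOWPRI_MAX_NUM_DEV 64

    /* mnt_devbsdunit is the index of the lowest set bit of mnt_throttle_mask. */
    static int devbsdunit_from_mask(uint64_t mask)
    {
        return mask != 0 ? (int)ffsll((long long)mask) - 1 : 0;
    }

    int main(void)
    {
        uint64_t cs_mask     = 0x7;                    /* disk2 backed by disk0+disk1 */
        uint64_t devfs_buggy = LOWPRI_MAX_NUM_DEV - 1; /* 0x3f, the Lion-era backstop */
        uint64_t devfs_fixed = 1ULL << 63;             /* the suggested fix */

        printf("disk2: mask 0x%llx -> unit %d\n",
               (unsigned long long)cs_mask, devbsdunit_from_mask(cs_mask));
        printf("devfs (buggy): mask 0x%llx -> unit %d (disk0's domain)\n",
               (unsigned long long)devfs_buggy, devbsdunit_from_mask(devfs_buggy));
        printf("devfs (fixed): mask 0x%llx -> unit %d (its own domain)\n",
               (unsigned long long)devfs_fixed, devbsdunit_from_mask(devfs_fixed));
        return 0;
    }

Running it prints units 0, 0, and 63, matching the behaviors described above.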
Each disk device is represented with two special files: a block device (e.g. /dev/disk0, like the ones I've been using above) and a character device (e.g. /dev/rdisk0). I/O operations to the block device go through the buffer cache, and therefore through spec_strategy as outlined above. I/O operations on the character device, however, bypass the buffer cache and spec_strategy entirely. Notably, though, I/Os on the character device don't bypass throttling. For I/O on a character device, there's special code to determine the correct throttling domain. But since no mount points are involved here, the throttling domain is determined based on the vnode alone. This is pretty much exactly what we want (and what we couldn't do when we had to assign a single throttling domain to all of devfs). For some use cases, the fact that the character device bypasses the buffer cache could be a problem; otherwise, this seems like the optimal solution. [5] In addition to assigning a priority tier to its I/O operations, a process may mark its I/O as passive; passive I/O may be throttled but doesn't cause throttling of other I/Os. Recompiling dd to call setiopolicy_np(3) would be a hassle. An easier way is to use the taskpolicy(8) modifier utility that comes with recent versions of macOS. Though not documented in the manpage, the -d option can take the argument passive (see the command sketch after the notes below). There are also a bunch of sysctls available to tune the behavior of the I/O throttling system, including one to shut it off entirely. [1] Amusingly, even spindump's symbolication step (which relies on an external daemon) suffers from this kind of problem, one which I diagnosed with, of course, another well-timed spindump. [2] To be more concrete, the number comes from the DKIOCGETBSDUNIT ioctl implemented for disk devices by IOMediaBSDClient; the value it returns comes from IOMediaBSDClient::createNodes(). [3] Technically, from 0 to LOWPRI_MAX_NUM_DEV. Devices with BSD unit numbers greater than LOWPRI_MAX_NUM_DEV get mapped down using the mod operator, so /dev/disk0 and /dev/disk64 share a throttling domain. This probably doesn't come up in practice, even if it is a little funky. [4] The APFS volume manager, which has supplanted CoreStorage, in this case. [5] Incidentally, this appears to be how the Disk Utility app avoids the problem too.
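For reference, the workaround could look roughly like this (a sketch: the dd arguments are hypothetical, and the sysctl name is my assumption from public XNU sources, so it may differ across releases):

    # Run dd with passive I/O: it may still be throttled itself, but it
    # won't cause other processes' I/O to be throttled.
    sudo taskpolicy -d passive dd if=/dev/disk2 of=/dev/disk3 bs=1m

    # Assumed sysctl name (from public XNU sources): disables low-priority
    # I/O throttling system-wide until changed back or rebooted.
    sudo sysctl debug.lowpri_throttle_enabled=0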
1,113
BAD
Apple introduces end-to-end encryption for backups (support.apple.com) iCloud uses strong security methods employs strict policies to protect your information and leads the industry in using privacy-preserving security technologies like end-to-end encryption for your data. The security of your data in iCloud starts with the security of your Apple ID. All new Apple IDs require two-factor authentication to help protect you from fraudulent attempts to gain access to your account. Two-factor authentication is also required for many features across Apples ecosystem including end-to-end encryption. Apple offers two options to encrypt and protect the data you store in iCloud: Standard data protection is the default setting for your account. Your iCloud data is encrypted the encryption keys are secured in Apple data centersso we can help you with data recovery and only certain data is end-to-end encrypted. Advanced Data Protection for iCloud is an optional setting that offers our highest level of cloud data security. If you choose to enable Advanced Data Protection your trusted devices retain sole access to the encryption keys for the majority of your iCloud data thereby protecting it using end-to-end encryption. Additional data protected includes iCloud Backup Photos Notes and more. About end-to-end encrypted data End-to-end encrypted data can be decrypted only on your trusted devices where youre signed in with your Apple ID. No one else can access your end-to-end encrypted data not even Apple and this data remains secure even in the case of a data breach in the cloud. If you lose access to your account only you can recoverthisdata using your device passcode or password recovery contact or recovery key . About end-to-end encrypted data End-to-end encrypted data can be decrypted only on your trusted devices where youre signed in with your Apple ID. No one else can access your end-to-end encrypted data not even Apple and this data remains secure even in the case of a data breach in the cloud. If you lose access to your account only you can recoverthisdata using your device passcode or password recovery contact or recovery key . Standard data protection is the default setting for your account.Your iCloud data is encrypted in transit and stored in an encrypted format at rest. The encryption keys from your trusted devices are secured in Apple data centers so Apple can decrypt your data on your behalf whenever you need it such as when you sign in on a new device restore from a backup or recover your data after youve forgotten your password. As long as you can successfully sign in with your Apple ID you can access your backups photos documents notes and more. For additional privacy and security 14 data categories including Health and passwords in iCloud Keychain are end-to-end encrypted. Apple doesn't have the encryption keys for these categories and we can't help you recover this data if you lose access to your account. The table below includes a list of data categories that are always protected by end-to-end encryption. Starting with iOS 16.2 iPadOS 16.2 and macOS 13.1 you can choose to enable Advanced Data Protection to protect the vast majority of your iCloud data even in the case of a data breach in the cloud. With Advanced Data Protection the number of data categories that use end-to-end encryption rises to 23 and includes your iCloud Backup Photos Notes and more. 
The table below lists the additional data categories that are protected by end-to-end encryption when you enable Advanced Data Protection. If you enable Advanced Data Protection and then lose access to your account Apple will not have the encryption keys to help you recover it youll need to use your device passcode or password a recovery contact or a personal recovery key . Because the majority of your iCloud data will be protected by end-to-end encryption youll be guided to set up at least one recovery contact or recovery key before you turn on Advanced Data Protection. You must also update all your Apple devices to a software version that supports this feature. You can turn off Advanced Data Protection at any time. Your device will securely upload the required encryption keys to Apple servers and your account will once again use standard data protection. Learn how to turn on Advanced Data Protection for iCloud . The table below provides more detail on how iCloud protects your data when using standard data protection or Advanced Data Protection. Some metadata and usage information stored in iCloud remains under standard data protection even when Advanced Data Protection is enabled. For example dates and times when a file or object was modified are used to sort your information and checksums of file and photo data are used to help Apple de-duplicate and optimize your iCloud and device storage all without having access to the files and photos themselves. Representative examples are provided in the table below. This metadata is always encrypted but the encryption keys are still stored by Apple. As we continue to strengthen security protections for all users Apple is committed to ensuring more data including this kind of metadata is end-to-end encrypted when Advanced Data Protection is enabled. With standard data protection iCloud content that you share with other people is not end-to-end encrypted. Advanced Data Protection is designed to maintain end-to-end encryption for shared content as long as all participants have Advanced Data Protection enabled. This level of protection is supported in most iCloud sharing features including iCloud Shared Photo Library iCloud Drive shared folders and shared Notes. iWork collaboration the Shared Albums feature in Photos and sharing content with anyone with the link do not support Advanced Data Protection. When you use these features the encryption keys for the shared content are securely uploaded to Apple data centers so that iCloud can facilitate real-time collaboration or web sharing. This means the shared content is not end-to-end encrypted even when Advanced Data Protection is enabled. To initiate sharing or collaboration the names and Apple IDs of participants are sent to Apple servers and a title and representative thumbnail of the shared item may be used to show a preview to the participants. iCloud.com provides access to your iCloud data via any web browser. All sessions at iCloud.com are encrypted in transit between Apple's servers and the browser on your device. When Advanced Data Protection is enabled access to your data via iCloud.com is disabled by default. You have the option to turn on data access on iCloud.com which allows the web browser that you're using and Apple to have temporary access to data-specific encryption keys provided by your device to decrypt and view your information. Learn more about iCloud.com web access . Third-party app data stored in iCloud is always encrypted in transit and on server. 
When you turn on Advanced Data Protection, third-party app data stored in iCloud Backup and CloudKit encrypted fields and assets are end-to-end encrypted. Both Apple and third-party data centers may be used to store and process your data. When processing data stored in a third-party data center, encryption keys are accessed only by Apple software running on secure servers, and only while conducting the necessary processing. The keys are always stored and secured in Apple data centers. Apple doesn't access or store keys for any end-to-end encrypted data.
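For third-party developers, CloudKit exposes this through encrypted record fields. A minimal sketch, assuming a hypothetical "Journal" record type and field names; the encryptedValues accessor is CloudKit's surface for fields whose contents are end-to-end encrypted when the user's protection level allows it:

```swift
import CloudKit

// Sketch: storing one regular and one encrypted field in CloudKit.
// "Journal", "title", and "body" are hypothetical names for illustration.
let record = CKRecord(recordType: "Journal")
record["title"] = "Trip notes"                  // regular field
record.encryptedValues["body"] = "Flight at 9"  // encrypted field

let database = CKContainer.default().privateCloudDatabase
database.save(record) { saved, error in
    if let error = error {
        print("Save failed: \(error)")
    } else {
        print("Saved record \(saved?.recordID.recordName ?? "?")")
    }
}
```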
1,117
BAD
Apple introduces new version of iMovie featuring Storyboards and Magic Movie (apple.com) April 12, 2022 UPDATE Apple introduces new version of iMovie featuring Storyboards and Magic Movie: iMovie 3.0 makes it easier than ever for anyone to share their stories with video. Apple today introduced a new version of iMovie with features that make it easier than ever to create beautiful edited videos on iPhone and iPad. Storyboards helps aspiring content creators and moviemakers learn to edit and improve their video storytelling skills with pre-made templates for popular types of videos shared on social, with colleagues, or with classmates: videos like DIYs, cooking tutorials, product reviews, science experiments, and more. Storyboards makes it easy to get started with flexible shot lists and step-by-step guidance on which clips to capture for each video type. For those who want to create a video even faster, Magic Movie instantly creates a polished video from the clips and photos a user selects, automatically adding transitions, effects, and music to the edit. Both new features include a range of styles to help personalize the final look and feel of a video, including titles, filters, transitions, color palettes, and music, all designed to work together. "iMovie has empowered millions of people around the world to create and share their stories through video," said Bob Borchers, Apple's vice president of Worldwide Product Marketing. "This latest version of iMovie featuring Storyboards and Magic Movie simplifies video creation even further, and we think it will continue to inspire the next wave of video creators to dive in and get started." Learn and Improve Video Storytelling with Storyboards: Storyboards gives users a major head start in the video creation process, helping beginners learn to edit and improve their storytelling skills. Instead of starting with a blank timeline, creators can choose from 20 different storyboards featuring a variety of popular video types, including cooking tutorials, Q&As, product reviews, news reports, and more. Each storyboard has a shot list that organizes clips to tell a specific narrative. Each placeholder also includes an illustrative thumbnail along with a tip that asks a question or makes a suggestion for how to shoot that clip to make it more interesting. For added customization, shots can be added, reordered, and deleted from a storyboard as needed. Creators can experiment by applying different video styles that include titles and transitions, as well as options for title layouts, fonts, filters, and color palettes. Music tracks dynamically adjust to fit the length of a project, and users can refine each clip with intuitive editing tools like trim, split, clip speed, and clip volume. Completed Storyboard videos are easily shared from iMovie through Messages, Mail, and across social media platforms. Create Videos Instantly with Magic Movie: Magic Movie helps create beautiful custom videos, complete with titles, transitions, and music, in just a few taps. To create a Magic Movie, a user simply selects an album or any group of photos or images from their library, and Magic Movie instantly identifies the best parts of the footage and creates the project. Creators can easily put their own spin on a Magic Movie by rearranging or deleting clips in the simplified Magic Movie shot list or by editing the project further. Styles can be added to change up the look and feel of the entire video. Like Storyboard videos, Magic Movies can be shared from iMovie via Messages, Mail, and on social media.
Availability: iMovie 3.0, including the new Storyboards and Magic Movie features, is available today as a free update on the App Store for devices running iOS 15.2 or later and iPadOS 15.2 or later.
1,118
BAD
Apple is building a demand-side platform (digiday.com) Apple may have blown up the digital ads business, but it left enough fertile ground to build its own more focused play for media dollars. The company is building a demand-side platform, if recent job listings are to be believed. Specifically, it is looking for a senior manager for a DSP in its ads platforms business. Whoever gets the job will be asked to "drive the design of the most privacy-forward, sophisticated demand-side platform possible," per the post. Moreover, the ideal candidate would have experience building a mobile-centric DSP and know-how when it comes to optimizing mobile campaigns using measurement and attribution. Experienced heads need only apply, with the job ad requesting 8+ years of technical experience plus product management, along with a proven track record of launching advertising-related products for audiences in the hundreds of millions. A DSP is a statement of intent for any ads business, let alone one like Apple, which has grown exponentially on the back of its decision to make it harder for companies to grow their own within its ecosystem. Indeed, a DSP is a core part of an ad tech stack for any company with designs on winning more media dollars. It's technology, or more specifically software, that lets a marketer advertise with the help of automation. The automation of the process is important because it means marketers can set up campaigns and manage them with relative ease. In turn, they're likely to spend more. "Our platform runs and delivers advertising auctions to match supply (customers) with demand (advertisers), focusing on technical components including Campaign Management, Bidding, Incrementality, Dynamic Creative Optimization, Matching, Auctions and Experimentation, while empowering Customer Privacy throughout," according to Apple's post. It remains unclear if the intended DSP is geared toward serving ads solely on Apple's owned and operated properties, such as the App Store itself, or on the millions of iOS apps, or even on third-party properties such as the mobile web. Apple would not confirm its plans to build a DSP or provide additional context as to how this would fit into the company's overall strategy for its ad business. But the job posting is a clear signal of intent for a business that has left the impression (at least publicly) that building ad tech is the last thing it wanted to do. This is a company, after all, with a business model geared toward kneecapping advertising rather than embracing it. Times change, however. For some time it's been clear that the idea of Apple being opposed to online advertising might need revising. A DSP settles it. No company builds ad tech like this unless they're serious about making a bigger move on media dollars. It costs too much time and money to do it on a whim. "Over the years Apple has built a vast walled garden of connected products and Apple services," said Paulina Klimenko, chief growth officer at PubMatic. "What connects all these products and services, creating seamless consumer experiences, is user data. Apple building their own DSP is the next logical step in this evolution. They have been building their advertising business behind the scenes, leveraging their scale and ecosystem assets. With its own DSP, Apple will also have full control of how and where the data is used, preventing leakage of this very valuable data outside of their walled garden."
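The job posting's own summary, matching supply (customers) with demand (advertisers) through auctions, describes the core loop of any DSP. A minimal sketch of that matching step, assuming a simple sealed-bid, second-price auction; this is the standard textbook mechanism, not anything known about Apple's actual design:

```swift
// Sketch of the auction at the heart of a DSP: advertisers submit bids
// for an impression; the highest bid wins but pays the second-highest
// price (a "second-price" design that rewards truthful bidding).
struct Bid {
    let advertiser: String
    let amount: Double // dollars per thousand impressions (CPM)
}

func runSecondPriceAuction(bids: [Bid], floorPrice: Double) -> (winner: Bid, clearingPrice: Double)? {
    let eligible = bids.filter { $0.amount >= floorPrice }
                       .sorted { $0.amount > $1.amount }
    guard let winner = eligible.first else { return nil } // no bid met the floor
    // Winner pays the runner-up's bid, or the floor if there is no runner-up.
    let clearingPrice = eligible.dropFirst().first?.amount ?? floorPrice
    return (winner, clearingPrice)
}

let bids = [
    Bid(advertiser: "A", amount: 4.20),
    Bid(advertiser: "B", amount: 6.10),
    Bid(advertiser: "C", amount: 5.75),
]
if let result = runSecondPriceAuction(bids: bids, floorPrice: 2.00) {
    print("\(result.winner.advertiser) wins at \(result.clearingPrice) CPM") // B wins at 5.75
}
```

The automation the article describes sits in the layers around this loop: campaign management decides which auctions to enter, and measurement and attribution feed back into how much to bid next time.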
If there was still any doubt about Apple's long-term plan for advertising, then recent reports should put that to bed once and for all. Sources within Apple, a company notoriously shy of making public statements, have briefed media outlets with news of more advertising opportunities for those eager to promote their wares in the App Store. The planned ad placements include two additional slots in the App Store, with a promotional placement on its Today tab, where the paid-for slots will feature alongside editorialized content. The other planned ad placement will feature on app product pages, where ads will be served under a tab that reads "You Might Also Like." A statement attributed to the proprietor of the App Store reads, "Apple Search Ads provides opportunities for developers of all sizes to grow their business." It goes on to state, "Like our other advertising offerings, these new ad placements are built upon the same foundation; they will only contain content from apps' approved App Store product pages and will adhere to the same rigorous privacy standards." In its public posturing, Apple has long been an advocate of user privacy, to the extent that it has embarked on a years-long project to dismantle the ease with which the marketing industry, anonymous middlemen in particular, can target those using its wares. This began with the introduction of Intelligent Tracking Prevention, which effectively disabled the use of third-party cookies in the Safari web browser, and then the rollback of its mobile advertising identifier, a.k.a. MAIDs or IDFA, in the iOS ecosystem. More recently, Apple began to usher in a series of tools such as its App Tracking Transparency tools, which it claimed placed the user in control of any ad experience by opting in or out of any tracking, etcetera. For some sources, all of whom requested anonymity given Apple's reluctance to make public its plans before it is ready, there has been a notable uptick in the App Store provider's efforts to recruit talent to its Ad Platform unit since the beginning of 2022. Revenues generated by this unit are usually included in the services revenue of its earnings disclosures, alongside Apple TV+, Music, or games, with the division generating $19.6 billion during the June quarter of this year. The debut of the new advertising opportunities in the App Store and attempts to recruit a team to build a DSP emerged within a month of the Cannes Lions International Festival of Creativity. Here Apple was said to have had an unprecedented presence, including discreet facilities to host media executives, according to several sources.
1,119
BAD
Apple is discontinuing the iPod (apple.com) May 10, 2022 UPDATE The music lives on: iPod touch will be available while supplies last. Since its introduction over 20 years ago, iPod has captivated users all over the world who love the ability to take their music with them on the go. Today, the experience of taking one's music library out into the world has been integrated across Apple's product line, from iPhone and Apple Watch to iPad and Mac, along with access to more than 90 million songs and over 30,000 playlists available via Apple Music. "Music has always been part of our core at Apple, and bringing it to hundreds of millions of users in the way iPod did impacted more than just the music industry; it also redefined how music is discovered, listened to, and shared," said Greg Joswiak, Apple's senior vice president of Worldwide Marketing. "Today, the spirit of iPod lives on. We've integrated an incredible music experience across all of our products, from the iPhone to the Apple Watch to HomePod mini, and across Mac, iPad, and Apple TV. And Apple Music delivers industry-leading sound quality with support for spatial audio; there's no better way to enjoy, discover, and experience music." Among the incredible ways to enjoy music across a range of devices, including a wide variety of models from the new iPhone SE to the latest iPhone 13 Pro Max, iPhone is the best device for streaming Apple Music or storing an entire music library on the go. Apple Watch and AirPods are the perfect companion, allowing users to access over 90 million songs right from their wrist, starting at just $279 with Apple Watch SE. iPad starts at just $329 and comes with a more powerful chip, larger display, and the latest iPadOS features. And for the best way to enjoy music at home, HomePod mini is just $99. Customers can purchase iPod touch through apple.com, Apple Store locations, and Apple Authorized Resellers while supplies last.
1,120
BAD
Apple is not defending browser engine choice (infrequently.org) This is part six of the seven-part series Browser Choice Must Matter. Gentle reader, I made a terrible mistake. Yes, that's right: I read the comments on a MacRumors article. At my age one knows better. And yet. As penance for this error, and for being short with Miguel, I must deconstruct the ways Apple has undermined browser engine diversity. Contrary to claims of Apple partisans, iOS engine restrictions are not preventing a takeover by Chromium; at least, that's not the primary effect. Apple uses its power over browsers to strip-mine and sabotage the web, hurting all engine projects and draining the web of future potential. As we will see, both the present and future of browser engine choice are squarely within Cupertino's control. Contents: Apple's Long-Standing Policies Are Anti-Diversity; WebKit Couldn't Compete If It Had To; Browsers Are Big Business; WebKit Is No Charity; Choices, Choices; Recent Developments; How Apple Gutted Mozilla's Chances; Back Of The Napkin; The Best Kind Of Correct; What Now? A refresher on Apple's iOS browser policies: Defenders of Apple's monopoly offer hard-to-test claims, but many boil down to the idea that Apple's product is inferior by necessity. This line is frankly insulting to the good people that work on WebKit. They're excellent engineers, some of the best, pound for pound, but there aren't enough of them. And that's a choice. Nobody frames it precisely this way; instead they'll say "if WebKit weren't mandated, Chromium would take over" or "Google would dominate the web if not for the WebKit restriction." That potential future requires mechanisms of action, something to cause Safari users to switch. What are those mechanisms? And why are some commenters so sure the end is nigh for WebKit? Past swings away from OS default browsers have hinged on new features, better performance, improved security, and superior site compatibility. Marketing and distribution play a prominent role but have been indecisive in recent browser battles. The leads of OS incumbents are not insurmountable because browsers are commodities with relatively low switching costs. Better products tend to win. Apple's prohibition on iOS browser engine competition has drained the potential of browser choice to deliver improvements. Without the ability to differentiate on features, security, performance, privacy, and compatibility, what's to sell? A slightly different UI? That's meaningful, but identically feeble web features cap the potential of every iOS browser. Nobody can pull ahead, and no product can offer future-looking capabilities that might make the web a more attractive platform. This is working as intended. Of all the reasons to switch browsers, compatibility is often the most compelling. Major sites asking users to switch is incredibly effective in aggregate. Compatibility describes both a browser's ability to display existing content and developers' ability to rely on a set of features across browsers. Standards support is a sub-point of this latter issue but acts as a trailing indicator of engine quality. [1] On OSes with browser competition, sites can recommend browsers with engines that cost less to support or unlock crucial capabilities. However, developers are loath to do this; turning away users isn't a winning growth strategy, and prompting visitors to switch is passé. Still, in extremis, missing features and the parade of showstopping bugs render some services impossible to deliver.
In these cases, suggesting an alternative beats losing users entirely. But what if there's no better alternative? This is the situation that Apple has engineered on iOS. Cui bono? Who benefits? All iOS browsers present as Safari to developers. There's no point in recommending a better browser because none is available. The combined mass of all iOS browsing pegged to the trailing edge means that folks must support WebKit or decamp for Apple's App Store, where it hands out capabilities like candy, but at a shocking price. iOS's mandated inadequacy has convinced some that when engine choice is possible, users will stampede away from Safari. This would in turn cause developers to skimp on testing for Apple's engine, making it inevitable that browsers based on WebKit and other minority engines could not compete. Or so the theory goes. But is it predestined? Perhaps some users will switch, but browser market changes take a great deal of time, and Apple enjoys numerous defences. To the extent that Apple wants to win developers and avoid losing users, it has plenty of time. It took over five years for Chrome to earn majority share on Windows with a superior product, and there's no reason to think iOS browser share will move faster. Then there's the countervailing evidence from macOS, where Safari manages to do just fine. Regulatory mandates about engine choice will also take more than a year to come into force, giving Apple plenty of time to respond and improve the competitiveness of its engine. And that's the lower bound. Apple's pattern of malicious compliance will likely postpone true choice even further. As Apple fights tooth-and-nail to prevent alternative browser engines, it will try to create ambiguity about vendors' ability to ship their best products worldwide, potentially delaying high-cost investment in ports with uncertain market reach. Cupertino may also try to create arduous processes that force vendors to individually challenge the lack of each API, one geography at a time. In the best case, time will still be lost to this sort of brinksmanship. This is time that Apple can use to improve WebKit and Safari to be properly competitive. Why would developers recommend alternatives if Safari adds features, improves security, prioritises performance, and fumigates for showstopping bugs? Remember: developers don't want to prompt users to switch; they only do it under duress. The features and quality of Safari are squarely in Apple's control. So, given that Apple has plenty of time to catch up, is it a rational business decision to invest enough to compete? Browsers are both big business and industrial-scale engineering projects. Hundreds of folks are needed to implement and maintain a competitive browser, with specialisations in nearly every area of computing. World-class experts in graphics, networking, cryptography, databases, language design, VM implementation, security, usability (particularly usable security), power management, compilers, fonts, high-performance layout, codecs, real-time media, audio and video pipelines, and per-OS specialisation are required. And then you need infrastructure; lots of it. How much does all of this cost? A reasonable floor comes from Mozilla's annual reports. The latest consolidated financials (PDF) are from 2020 and show that, without marketing expenses, Mozilla spends between $380 and $430 million US per year on software development.
Salaries are the largest category of these costs (~$180-210 million), and Mozilla economises by hiring remote employees without offering large bonuses and stock-based compensation. From this data we can assume a baseline cost to build and maintain a competitive cross-platform browser of $450 million per year. Browser vendors fund their industrial-scale software engineering projects through integrations. Search engines pay browser makers for default placement within their products. They in turn make a lot of money because browsers send them transactional and commercial intent searches as part of the query stream. Advertisers bid huge sums to place ads against keywords in these categories. This market in turn funds all the R&D and operational costs of search engines, including traffic acquisition costs like browser search default deals. [2] How much money are we talking about? Mozilla's $450 million in annual revenue comes from approximately 8% of the desktop market and negligible mobile share. Browsers are big, big business. Despite being largely open source, browsers and their engines are not loss leaders. Safari in particular is wildly profitable. The New York Times reported in late 2020 that Google now pays Apple between $8-12 billion per year to remain Safari's default search engine, up from $1 billion in 2014. Other estimates put the current payments in the $15 billion range. What does this almighty torrent of cash buy Google? Searches, preferably of the commercial intent sort. Mobile accounts for two-thirds of web traffic (or thereabouts), making outsized iOS adoption among wealthy users particularly salient to publishers and advertisers. Google's payments to Apple are largely driven by the iPhone rather than its niche desktop products, where effective browser competition has reduced the influence of Apple's defaults. Even with Apple's somewhat higher salaries per engineer, the skeleton staffing of WebKit, combined with the easier task of supporting fewer platforms, suggests that Apple is unlikely to spend considerably more than Mozilla does on browser development. In 2014, Apple would have enjoyed a profit margin of 50% if it had spent half a billion on browser engineering. Today, that margin would be 94-97%, depending on which figure you believe for Google's payments. In absolute terms, that's more profit than Apple makes selling Macs. Compare Cupertino's 3-6% search revenue reinvestment in the web with Mozilla's near 100% commitment, then recall that Mozilla has consistently delivered a superior engine to more platforms. I don't know what's more embarrassing: that some folks argue with a straight face that Apple is trying hard to build a good browser, or that it is consistently overmatched in performance, security, and compatibility by a plucky non-profit foundation that makes just ~5% of Apple's web revenue. Steve Jobs launched Safari for Windows in the same WWDC keynote that unveiled the iPhone. Commenters often fixate on the iPhone's original web-based pitch but don't give Apple stick for reducing engine diversity by abandoning Windows three versions later. Today, Apple doesn't compete outside its home turf, and when it has agency, it prevents others from doing so. These are not the actions of a firm that is consciously attempting to promote engine diversity. If Apple is an ally in that cause, it is only by accident. Theories that postulate a takeover by Chromium dismiss Apple's power over a situation it created and recommits to annually through its budgeting process.
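The margin arithmetic is simple enough to check. A quick worked version of the back-of-the-napkin figures cited above (roughly $450 million per year to run a competitive engine team, against $1 billion in search payments in 2014 and $8-15 billion today):

```swift
// Reproducing the back-of-the-napkin margins cited in the text.
func margin(revenue: Double, cost: Double) -> Double {
    (revenue - cost) / revenue
}

let engineCost = 0.45 // $ billions/year, the Mozilla-derived baseline

// 2014: ~$1B search payment against ~$0.5B of browser engineering.
print(margin(revenue: 1.0, cost: 0.5))         // 0.5   -> ~50%

// Today: $8-15B search payments against the same ~$450M baseline.
print(margin(revenue: 8.0, cost: engineCost))  // ~0.944 -> ~94%
print(margin(revenue: 15.0, cost: engineCost)) // 0.97   -> 97%
```

The 94-97% range in the text falls straight out of the two payment estimates; only the cost baseline is an assumption carried over from Mozilla's financials.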
This is not a question of resources. Recall that Apple spends $85 billion per year on stock buybacks, [3] pays $15 billion in dividends, enjoys free cash flow larger than the annual budgets of 47 nations, and retains tens of billions of dollars of cash on hand. [4] And that's to say nothing of Apple's $100+ billion in non-business-related long-term investments. Even if Safari were a loss leader, Apple would be able to avoid producing a slower, stifled, less secure, famously buggy engine without breaking the bank. Apple needs fewer staff to deliver equivalent features because Safari supports fewer OSes. The necessary investments are also R&D expenses that receive heavy tax advantages. Apple enjoys enviable discounts to produce a credible browser but refuses to do so. Unlike Microsoft's late and underpowered efforts with IE 7-11, Safari enjoys tolerable web compatibility, more than 90% share on a popular OS, and an unheard-of war chest with which to finance a defence. The postulated apocalypse seems far away and entirely within Apple's power to forestall. One way to understand the voluntary nature of Safari's poor competitiveness is to put Cupertino's recent burst of effort in context. When regulators and legislators began asking questions in 2019, a response was required. Following Congress' query about default browser choice, Apple quietly allowed it through iOS 14 (however ham-fistedly) the following year. This underscores Apple's gatekeeper status and the tiny scale of investment required to enable large changes. In the past six months, the Safari team has gone on a veritable hiring spree. This month's WWDC announcements showcased returns on that investment. By spending more in response to regulatory pressure, Apple has eviscerated notions that it could not have delivered a safer, more capable, and competitive browser many years earlier. Safari's incremental headcount allocation has been large compared to the previous size of the Safari team, but in terms of Apple's P&L it's loose change. Predictably, hiring talent to catch up has come at no appreciable loss to profitability. The competitive potential of any browser hinges on headcount, and Apple is not limited in its ability to hire engineering talent. Recent efforts demonstrate that Apple has been able to build a better browser all along, and year after year chose not to. For over a dozen years, setting any browser other than Safari as the iOS default was impossible. This spotted Safari a massive market share head-start. Meanwhile, restrictions on engine choice continue to hamstring competitors, removing arguments for why users should switch. But don't take my word for it; here's the recent UK CMA Final Report on Mobile Ecosystems summarising submissions by Mozilla and others (pages 154-155): 5.48 The WebKit restriction also means that browser vendors that want to use Blink or Gecko on other operating systems have to build their browser on two different browser engines. Several browser vendors submitted that needing to code their browser for both WebKit and the browser engine they use on Android results in higher costs and features being deployed more slowly.
5.49 Two browser vendors submitted that they do not offer a mobile browser for iOS due to the lack of differentiation and the extra costs, while Mozilla told us that the WebKit restriction delayed its entrance into iOS by around seven years. That's seven years of marketing, feature iteration, and brand loyalty that Mozilla sacrificed on the principle that if they could not bring their core differentiator, there was no point. It would have been better if Mozilla had made a ruckus rather than hoping the world would notice its stoic virtue, but thankfully the T-rex has roused from its slumber. Given the hard times the Mozilla Foundation has found itself in, it seems worth trying to quantify the costs. To start, Mozilla must fund a separate team to re-develop features atop a less-capable runtime. Every feature that interacts with web content must be rebuilt in an ad-hoc way using inferior tools. Everything from form autofill to password management to content blocking requires extra resources to build for iOS. Not only does this tax development of the iOS product, it makes coordinated feature launches more costly across all ports. Most substantially, iOS policies against default browser choice, combined with in-app-browser and search entry point shenanigans, have delayed and devalued browser choice. Until late 2020, users needed to explicitly tap the Firefox icon on the home screen to get back to their browser. Naïvely tapping links would instead load content in Safari. This split experience causes a sort of pervasive forgetfulness, making the web less useful. Continuous partial amnesia about browser-managed information is bad for users, but it hurts browser makers too. On OSes with functional competition, convincing a user to download a new browser has a chance of converting nearly all of their browsing to that product. iOS (along with Android and Facebook's mobile apps) undermines this by constantly splitting browsing, ignoring the user's default. When users don't end up in their browser, searches occur through it less often, affecting revenue. Web developers also experience this as a reduction in visible share of browsing from competing products, reducing incentives to support alternative engines. A forgetful web also hurts publishers. Ad bid rates are suppressed, and users struggle to access pay-walled content when browsing is split. The conspicuous lack of re-engagement features like Push Notifications is the rotten cherry on top, forcing sites to push users to the App Store, where Apple doesn't randomly log users out or deprive publishers of key features. Users, browser makers, web developers, and web businesses all lose. The hat-trick of value destruction. The pantomime of browser choice on iOS has created an anaemic, amnesiac web. Tapping links is more slogging than surfing when autofill fails, passwords are lost, and login state is forgotten. Browsers become less valuable as the web stops being a reliable way to complete tasks. Can we quantify these losses? Estimating lost business from user frustration and ad rate depression is challenging. But we can extrapolate what a dozen years of choice might have meant for Mozilla from what we know about how Apple monetises the web. For the purposes of argument, let's assume Mozilla would be paid for web traffic at the same rate as Apple: $8-15 billion per year for ~75% share of traffic from Apple OSes.
If the traffic numbers to US government websites are reasonable proxies for the iOS/macOS traffic mix (big ifs), then equal share for Firefox on iOS to macOS would be worth $215-400 million per year. [5] Put differently, there's reason to think that Mozilla would not have suffered layoffs if Apple were an ally of engine choice. Apple's policies have made the web a less compelling ecosystem, its anti-competitive behaviour has driven up costs for browser makers, and it has simultaneously starved them of revenue. If Apple are friends of engine diversity, who needs enemies? There is a narrow, fetid sense in which Apple's influence is nominally pro-diversity. Having anchored a significant fraction of web traffic at the trailing edge, businesses that do not decamp for the App Store may feel obliged to support WebKit. This is a malignant form of diversity, not unlike other lagging engines through the years that harmed users and web-based businesses by externalizing costs. But on OSes with true browser choice, alternatives were meaningful. Consider the loathed memory of IE 6, a browser that overstayed its welcome by nearly a decade. For as bad as it was, folks could recommend alternatives. Plugins also allowed us to transparently upgrade the platform. Before the rise of open-source engines, the end of one browser lineage may have been a deep loss to ecosystem diversity, but in the past 15 years the primary way new engines emerge has been through forks and remixing. But the fact of an engine being different does not make that difference valuable, and WebKit's differences are incremental. Sure, Blink now has a faster layout engine, better security, more features, and fewer bugs, but like WebKit it is also derived from KHTML. Both engines are forks and owe many present-day traits to their ancestors. Today's KHTML descendants are not the end of the story. Future forks are possible. New codebases can be built from parts. Indeed, there's already valuable cross-pollination in code between Gecko, WebKit, and Chromium. Unlike the '90s and early 2000s, diversity can arrive in valuable increments through forking and recombination. What's necessary for leading edge diversity, however, is funding. By simultaneously taking a massive pot of cash for browser-building off the table, returning the least it can to engine development, and preventing others from filling the gap, Apple has foundationally imperilled the web ecosystem by destroying the utility of a diverse population of browsers and engines. Apple has agency. It is not a victim, and it is not defending engine diversity. A better, brighter future for the web is possible and, thanks to belated movement by regulators, increasingly likely. The good folks over at Open Web Advocacy are leading the way, clearly explaining to anyone who will listen both what's at stake and what it will take to improve the situation. Investigations are now underway worldwide, so if you think Apple shouldn't be afraid of a bit of competition if it will help the web thrive, consider getting involved. And if you're in the UK or do business there, consider helping the CMA help the web before July 22nd, 2022. The future isn't written yet, and we can change it for the better. Many commenters come to debates about compatibility and standards compliance with a mistaken view of how standards are made. As a result, they perceive vendors with better standards conformance (rather than content compatibility) to occupy a sort of moral high ground. They do not. Instead, it usually represents a broken standards-setting process.
This can happen for several reasons. Sometimes standards bodies shutter, and the state of the art moves forward without them. This presents some risk for vendors that forge ahead without the cover of an SDO's protective IP umbrella, but that risk is often temporary and measured. SDOs aren't hard to come by; if new features are valuable, they can be standardised in a new venue. Alternatively, vendors can renovate the old one if others are interested in the work. More often, working groups move at the speed of their most obstinate participants, uncomfortably prolonging technical debates already settled in the market and preventing definitive documentation of the winning design. In other cases, a vendor may play games with intellectual property claims to delay standardisation or lure competitors into a patent minefield (as Apple did with Touch Events). At the leading edge, vendors need space to try new ideas without the need for the a priori consensus represented by a standard. However, compatibility concerns expressed by developers take on a different tinge over time. When the specific API details and capabilities of ageing features do not converge, a continual tax is placed on folks trying to build sites using features from that set. When developers stress the need for compatibility, it is often in this respect. Disingenuous actors sometimes try to misrepresent this interest and claim that all features must become standards before they are introduced in any engine. This interpretation runs against the long practice of internet standards development and almost always hides an ulterior motive. The role of standards is to consolidate gains introduced at the leading edge through responsible competition. Vendors that fail to participate constructively in this process earn scorn. They bring ignominy upon their houses by failing to bring implementations in line with the rough (documented and tested) consensus, or by playing the heel in SDOs to forestall progress they find inconvenient. Vendors like Apple. In the financial reports of internet businesses, you will see the costs to acquire business through channels reported as Traffic Acquisition Costs, or TAC. Many startups report their revenue excluding TAC, or "ex-TAC." These are all ways of saying "we paid for lead generation," and search engines are no different. This is money Apple believes it cannot figure out a way to invest in its products. That's literally what share buybacks indicate. They're an admission that a company is not smart enough to invest the money in something productive. Buybacks are attractive to managers because they create artificial scarcity for shares to drive up realised employee compensation, their own included. Employees who are cheesed off to realise that their projects are perennially short-staffed are encouraged not to make a stink through RSU appreciation. Everyone gets a cut, RSU-rich managers most of all. Different analysts use different ways of describing Apple's cash on hand. Some analysts lump in all marketable securities, current and non-current, which consistently pushes the number north of $150 billion. Others report only the literal cash value on the books ($34 billion as of May 2020). All of this means that it can require more context to compare the numbers in Apple's consolidated financial statements (PDF) with public reporting on them. The picture is also clouded by changes in the way Apple manages its cash hoard.
Over the past two years, Apple has begun to draw from this almighty pile of dollars and spend more to inflate its stock price through share buybacks and dividends. This may cast Apple as more cash-poor than it is. A better understanding of the actual situation is derived from free cash flow. Perhaps Apple will continue to draw down from its tall cash mountain to inflate its stock price via buybacks, but that's not a material change in the amount Apple can potentially spend on improving its products. Since this post first ran, several commenters have noted a point I considered while writing but omitted in order to avoid heaping scorn on a victim; namely, that Mozilla's management has been asleep at the switch regarding the business of its business. Historically, when public records were available for both Opera and Mozilla, it was easy to understand how poorly Mozilla negotiated with search partners. Under successive leaders, Mozilla negotiated deals that led to payments less than half as much per point of share. There's no reason to think MoCo's negotiating skills have improved dramatically in recent years. Apple, therefore, is likely to capture much more revenue per search than an install of Firefox. But even if Mozilla only made 1/3 of Apple's haul for equivalent use, the combined taxes of iOS feature re-development and loss of revenue would be material to the Mozilla Foundation's bottom line. Obviously, to get that share, Mozilla would need to prioritise mobile, which it has not done. This is a deep own-goal and a point of continued sadness for me. A noble house reduced to rubble is a tragedy no matter who demolishes the final wall. Management incompetence is in evidence, and Mozilla's Directors are clearly not fit for purpose. But none of that detracts from what others have done to the Foundation and the web, and it would be just as wrong to claim Mozilla should have been perfect in ways its enemies and competitors were not.
1,121
BAD
Apple is quietly pushing a TV ad product with media agencies (digiday.com) Apple's growing designs on the ad industry are already well-documented, with the iPhone maker's media unit already generating $4 billion per year on search. But now, it would appear, it is further stepping up its efforts by serving ads on even more screens. Apple, Digiday has learned, is preparing a more serious push into monetizing its original video content with an ad play, according to several media agency sources that held separate exploratory discussions with the digital giant. More are due to follow later this year, according to the media agencies' executives Digiday spoke with, all of whom requested to speak on background due to the sensitive nature of the discussions and Apple's insistence on privacy. While Apple did not respond to Digiday's request for comment on the nature of its meetings with holding company executives, separate sources noted that Apple, led by vp of ad platforms Todd Teresi, has been making the rounds with network executives. During these meetings, Teresi described what the agency sources characterized as an unusual approach to selling ad time on Apple TV. One source said that a few months ago, Apple was talking about an ad sales approach that more closely resembled selling search than the traditional audience-based pitch that most TV sellers take to market. Apple TV executives are said to have talked about building an API-based platform, in a similar way to how Google or Microsoft search is bought. "Their view, which they clearly articulated, is they don't believe in clean rooms; they don't believe in any use of data," said the source. "At that time, they were very anti ever working with a [demand-side platform]. And they also were saying they don't have the capability to build one." That approach has since changed; Digiday reported in August that Apple is building a DSP. And another holding company executive who spoke more recently with Apple confirmed that the company is planning to use a DSP for their TV inventory. "They actually have so much native inventory through maps, email, and all their apps at this point that the DSP would actually be only to decision on their owned and operated [TV content]," said the executive. Apple TV has already been carrying ads on its coverage of Major League Baseball games, but those ads have been sold by the league itself, not Apple. But there's no shortage of original programming, including the multiple-Emmy winner Ted Lasso, on which to run ads. The question is, when will it start? One agency source said Apple was not asking for that media agency network to put aside client dollars in the fourth quarter, so it seems more likely to be in early 2023. "Anything that would open up premium inventory [in connected TV or streaming] would be a positive," said the exec. "The last bastion is Apple TV. Apple is going to be a very good ad experience with probably a low ad load. They're already actually very diversified in terms of revenue streams, so there's less pressure to fit lots of ads." A third holding company executive also met in summer with Apple's Teresi, who talked about ad opportunities around the sports programming (which includes a 10-year deal with Major League Soccer), but this executive said the discussion didn't address an ad-supported tier on Apple TV. The executive did say Teresi was looking to meet again with this holding company in November, and that executive speculated an ad-supported tier could come up then.
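The agency sources' description, an API-based platform bought more like Google or Microsoft search than like audience-based TV, implies campaigns defined by bids against contextual signals rather than audience segments. A purely speculative sketch of what such a buying call might look like; every type, field, and value here is invented for illustration, and Apple has announced no such API:

```swift
import Foundation

// Invented illustration of a "search-style" TV ad buy: the buyer bids on
// contextual signals (genres, shows) instead of audience segments,
// much as a search buyer bids on keywords.
struct TVAdOrder: Codable {
    let campaignID: String
    let contextKeywords: [String] // e.g. genres, analogous to search keywords
    let maxBidCPM: Double         // ceiling price per thousand impressions
    let dailyBudget: Double
}

let order = TVAdOrder(
    campaignID: "fall-launch-01",
    contextKeywords: ["comedy", "sports"],
    maxBidCPM: 38.0,
    dailyBudget: 25_000
)

do {
    // Serialize as the body of a hypothetical REST call.
    let body = try JSONEncoder().encode(order)
    print(String(data: body, encoding: .utf8)!)
} catch {
    print("Encoding failed: \(error)")
}
```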
"Teresi did express interest in expanding their [ad] business, but he also made it very clear that Apple will do nothing unless, one, there's a significant opportunity, and two, they would never do anything to jeopardize the relationship they have with their consumers," said the executive. Apple doesn't break out the advertising revenue generated by its existing ads business, which predominantly consists of selling search ads on the App Store as well as a tie-up with NBCUniversal that lets its sales team sell inventory on apps such as Apple News. It is thought to be in the region of $4 billion per year and estimated to rise to $6 billion, according to analysts at JP Morgan, with Counterpoint Research estimating that iPhone's market share overtook Android in the U.S., where it now has a 50% share, during June. Meanwhile, in the month prior to Digiday revealing Apple's ambition to launch a DSP, a development that would require a tactful introduction to the market given its earlier proclamations around privacy, it also unveiled new search ad placements in the App Store. Veterans of the sector will recall how the latest Madison Avenue charm offensive is reminiscent of its iAd rollout more than a decade ago, a launch that largely coincided with the early rollout of the iPad. According to sources, this was a premium offering that was impressive but overpriced, and launched in a manner that was largely out of sync with market sentiment. For some, iAd was ahead of its time, but many ad buyers in particular thought that Apple didn't allow its sales staff the humility and flexibility necessary to make it as a vendor on Madison Avenue; hence it was effectively wound up in 2016. Speaking on the emergence of Apple's programmatic aspirations in August, Rob Norman, an advisor to several ad tech startups and longtime GroupM veteran, noted how iAd was a first-class mobile experience but that there was little evidence of its effectiveness. "For the clients that did go for it [iAd], it was very high-touch in terms of support with creative and everything else that goes on with a campaign," he added, "and Apple was never staffed to be a high-touch support business for advertisers." Rachel Pasqua, now CMO of Prohaska Consulting, led the specialist mobile team at iCrossing during the early rollout of iAd and noted how Apple's conduct during this time, one where Google was also promoting its "mobile first" mantra, demonstrated that it was unaware of how media is a very different beast to the tech business. "Of course it was a premium product, but I think Apple was used to being able to name their price with no questions asked, but media is not that way," she said. As questions will always be asked in media, their take was very much "This is a premium product from Apple and you will pay our price." "Obviously that didn't go over well with the media community," added Pasqua, who (like several other sources) went on to note her opinion that iAd sales executives were allowed to deviate little from their prescribed sales decks. However, as it continues to court the C-suites of Madison Avenue, the exploratory nature and collaborative tone of the discussions suggest that Apple has learned lessons from its foray into adland during the early 2010s as it seeks to further its mobile dominance and enter the realms of prime time TV ad sales.
1,122
BAD
Apple is top funder of lobby group that says it represents small developers (arstechnica.com) Jon Brodkin - Sep 19, 2022 5:52 pm UTC Apple provides over half the funding for a lobby group that says it represents small app developers, according to a Bloomberg article published today. Apple isn't a member of what's called the App Association, but "it plays a dominant behind-the-scenes role shaping the group's policy positions, according to four former App Association employees who asked not to be named discussing internal matters," Bloomberg wrote. The App Association says it gives a voice to small technology companies and that its policy priorities reflect the opportunities and challenges today's small business app developers and IoT innovators face in the app ecosystem. But its positions on major legislation have aligned with Apple's. The group's list of policy statements going back to early 2017 includes some specifically praising Apple and others opposing legislation that Apple also opposes, such as antitrust bills targeting Big Tech. One bill opposed by the App Association is the Open App Markets Act, which aimed to help app developers use alternative in-app payment systems and avoid Apple's standard cuts of 15 to 30 percent. The Apple-funded group also opposed the American Innovation and Choice Online Act, which would have prohibited app stores and other large online platforms from giving preference to their own products at the expense of competitors. Both bills have stalled in the face of Big Tech opposition. According to Bloomberg, the App Association (also known as ACT) confirmed that over half its funding comes from Apple, but former employees say the actual percentage is much higher. The group's total funding was over $9 million in 2020. "ACT representatives regularly testify in Congress, file court briefs in defense of Apple's positions, and host annual 'fly-in' meetings for developers with lawmakers," Bloomberg wrote. An App Association spokesperson told Ars that in 2020, Apple's commitment contributed more than 50 percent of the App Association's sponsorship revenue, making their support one of many contributors for the year. The App Association did not answer our question about what role Apple plays in shaping its policy. The group also told Ars it is proud to represent thousands of independent developers to foster an inclusive and secure developer ecosystem and app marketplace, and that "our members drive the organization's policy and legislative agenda." "As per our preliminary 2020 [Form] 990 filings, the organization saw an increase in sponsorship revenue to support our small business members' advocacy efforts, such as privacy, broadband, and in helping our members navigate the early stages of the pandemic, including connected health, workforce maintenance/development, and access to government funding for ongoing business operations," the App Association said. The App Association's website lists about two dozen employees and says the group represents more than 5,000 app makers and connected device companies in the mobile economy. The members are located around the world, in all 27 member countries of the European Union and in all 435 congressional districts of the United States. The App Association told Ars that its full member list is proprietary, but 38 that agreed to be named publicly are listed here.
US-based members include Concentric Sky, Startup Health, SheerID, Dogtown Media, Wellbeyond, Stroll Health, Project Hosts, Colorado Technology Consultants, MotionMobs, Rimidi, Southern DNA, Devscale, BadVR, CannedSpinach, BitSource, SentryOne, and AirStrip. We contacted Apple today and will update this article if we get a response. According to Bloomberg, ACT President Morgan Reed and other ACT executives said that they determine policy positions based on the preferences of their members and don't take direction from Apple, though they take Apple's positions into account. Reed told Bloomberg in an interview that it "doesn't pass the laugh test" to say the association is fronting for Apple. "Our job is to make sure we're paying attention to the way that government can have an impact, unintended or otherwise, on all of those small businesses making cool software products," he said. Another group, called the Coalition for App Fairness, whose members include Basecamp, Deezer, Epic Games, Match Group, Proton, Spotify, and others, has lobbied in favor of Big Tech antitrust legislation. For example, the group argued that the American Innovation and Choice Online Act would bar monopolistic platforms from discriminating among business users in a way that materially harms competition. The Coalition for App Fairness criticized the App Association in a tweet today. "An association that is funded primarily by Apple and represents Apple's interests against developers and their customers is a front group for Apple, no matter how it brands itself," the group said. Apple has also opposed antitrust legislation in more direct ways. CEO Tim Cook publicly spoke out against legislation that would require Apple to allow sideloading, and the company ramped up federal lobbying expenditures and became one of the main funders of a new group called the Chamber of Progress.
1,123
GOOD
Apple kills plans to scan for CSAM in iCloud (wired.com) Lily Hay Newman In August 2021, Apple announced a plan to scan photos that users stored in iCloud for child sexual abuse material (CSAM). The tool was meant to be privacy-preserving and allow the company to flag potentially problematic and abusive content without revealing anything else. But the initiative was controversial, and it soon drew widespread criticism from privacy and security researchers and digital rights groups who were concerned that the surveillance capability itself could be abused to undermine the privacy and security of iCloud users around the world. At the beginning of September 2021, Apple said it would pause the rollout of the feature to "collect input and make improvements before releasing these critically important child safety features." In other words, a launch was still coming. Now the company says that, in response to the feedback and guidance it received, the CSAM-detection tool for iCloud photos is dead. Instead, Apple told WIRED this week, it is focusing its anti-CSAM efforts and investments on its Communication Safety features, which the company initially announced in August 2021 and launched last December. Parents and caregivers can opt into the protections through family iCloud accounts. The features work in Siri, Apple's Spotlight search, and Safari Search to warn if someone is looking at or searching for child sexual abuse materials, and provide resources on the spot to report the content and seek help. Additionally, the core of the protection is Communication Safety for Messages, which caregivers can set up to provide a warning and resources to children if they receive or attempt to send photos that contain nudity. The goal is to stop child exploitation before it happens or becomes entrenched and to reduce the creation of new CSAM. "After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021," the company told WIRED in a statement. "We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all." Apple's CSAM update comes alongside its announcement today that the company is vastly expanding its end-to-end encryption offerings for iCloud, including adding the protection for backups and photos stored on the cloud service. Child safety experts and technologists working to combat CSAM have often opposed broader deployment of end-to-end encryption because it renders user data inaccessible to tech companies, making it more difficult for them to scan and flag CSAM. Law enforcement agencies around the world have similarly cited the dire problem of child sexual abuse in opposing the use and expansion of end-to-end encryption, though many of these agencies have historically been hostile toward end-to-end encryption in general because it can make some investigations more challenging.
Research has consistently shown, though, that end-to-end encryption is a vital safety tool for protecting human rights, and that the downsides of its implementation do not outweigh the benefits. Communication Safety for Messages is opt-in and analyzes image attachments users send and receive on their devices to determine whether a photo contains nudity. The feature is designed so Apple never gets access to the messages, the end-to-end encryption that Messages offers is never broken, and Apple doesn't even learn that a device has detected nudity.

The company told WIRED that while it is not ready to announce a specific timeline for expanding its Communication Safety features, it is working on adding the ability to detect nudity in videos sent through Messages when the protection is enabled. The company also plans to expand the offering beyond Messages to its other communication applications. Ultimately, the goal is to make it possible for third-party developers to incorporate the Communication Safety tools into their own applications. The more the features can proliferate, Apple says, the more likely it is that children will get the information and support they need before they are exploited.

"Potential child exploitation can be interrupted before it happens by providing opt-in tools for parents to help protect their children from unsafe communications," the company said in its statement. "Apple is dedicated to developing innovative privacy-preserving solutions to combat Child Sexual Abuse Material and protect children, while addressing the unique privacy needs of personal communications and data storage."

Similar to other companies that have grappled publicly with how to address CSAM, including Meta, Apple told WIRED that it also plans to continue working with child safety experts to make it as easy as possible for its users to report exploitative content and situations to advocacy organizations and law enforcement.

"Technology that detects CSAM before it is sent from a child's device can prevent that child from being a victim of sextortion or other sexual abuse, and can help identify children who are currently being exploited," says Erin Earp, interim vice president of public policy at the anti-sexual violence organization RAINN. "Additionally, because the minor is typically sending newly or recently created images, it is unlikely that such images would be detected by other technology, such as PhotoDNA. While the vast majority of online CSAM is created by someone in the victim's circle of trust, which may not be captured by the type of scanning mentioned, combatting the online sexual abuse and exploitation of children requires technology companies to innovate and create new tools. Scanning for CSAM before the material is sent by a child's device is one of these tools and can help limit the scope of the problem."

Countering CSAM is a complicated and nuanced endeavor with extremely high stakes for kids around the world, and it's still unknown how much traction Apple's bet on proactive intervention will get. But tech giants are walking a fine line as they work to balance CSAM detection and user privacy.

Updated 5:20 pm ET, Wednesday, December 7, 2022, to include commentary from RAINN.
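Apple says the goal is to let third-party developers incorporate Communication Safety into their own apps. As a hedged sketch of what such an opt-in, on-device check looks like from a developer's side, here is the pattern exposed by the SensitiveContentAnalysis framework Apple subsequently announced; analysis runs entirely on-device, only if the user has enabled the relevant protections, and the app needs a dedicated entitlement:

    import Foundation
    import SensitiveContentAnalysis

    // Hedged sketch of an opt-in, on-device nudity check of the kind the
    // article describes. Requires the
    // com.apple.developer.sensitivecontentanalysis.client entitlement;
    // SCSensitivityAnalyzer analyzes images locally and reports only a
    // sensitivity verdict.
    func imageIsSensitive(at url: URL) async throws -> Bool {
        let analyzer = SCSensitivityAnalyzer()
        // .disabled means the user has not opted in, so no analysis happens.
        guard analyzer.analysisPolicy != .disabled else { return false }
        let analysis = try await analyzer.analyzeImage(at: url)
        return analysis.isSensitive
    }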
1,124
BAD
Apple orders entire supply of TSMC's 3nm chips for iPhone 15 Pro and M3 Macs (macrumors.com) Apple has reportedly secured all available orders for N3, TSMC's first-generation 3-nanometer process, which is likely to be used in the upcoming iPhone 15 Pro lineup as well as in new MacBooks scheduled for launch in the second half of 2023.

According to a paywalled DigiTimes report, Apple has procured 100% of the initial N3 supply, which is said to have a high yield, despite the higher costs involved and the decline in the foundry's utilization rate in the first half of 2023. Mass production of TSMC's 3nm process began in late December, and the foundry has scaled up process capacity at a gradual pace, with monthly output set to reach 45,000 wafers in March, according to the report's sources.

Apple is widely expected to adopt TSMC's 3nm technology this year for the A17 Bionic chip, likely to power the iPhone 15 Pro and iPhone 15 Pro Max models. The 3nm technology is said to deliver a 35% power efficiency improvement over 4nm, which was used to make the A16 Bionic chip for the iPhone 14 Pro and Pro Max. The latter two iPhone models were the first smartphones to feature chips built on the 4nm process, and it looks like Apple is again attempting to be first to market with models based on the latest cutting-edge semiconductor technology.

Apple plans to release a new MacBook Air in the second half of 2023, and it may be equipped with a 3nm chip, according to a January report from DigiTimes. However, display industry analyst Ross Young claimed in December that a 15-inch MacBook Air would be released in the first half of 2023. If DigiTimes' outlook turns out to be accurate, then perhaps both 13-inch and 15-inch MacBook Airs with M3 chips based on 3nm technology will launch in the second half of 2023 instead.

Looking further ahead, Apple analyst Ming-Chi Kuo believes 14-inch and 16-inch MacBook Pros coming in 2024 will feature M3 Pro and M3 Max chips built on TSMC's 3nm process. MacBook Pro models with the M3 Pro and M3 Max chips will go into mass production in the first half of 2024, according to Kuo. The 3nm technology will offer improved performance and better power efficiency compared to the current chips manufactured on a 5-nanometer process, including the M2 Pro found in Apple's current high-end Mac mini and the M2 Pro and M2 Max used in its latest 14-inch and 16-inch MacBook Pro models.

TSMC is poised to move N3E, an enhanced version of its first-generation N3 3nm technology, to commercial production in the second half of this year, and Apple will be the first customer to adopt the process, according to another report this week from DigiTimes. Nikkei Asia reported in September that Apple could adopt N3E for devices launching as soon as this year, but we've not seen any other reports corroborating this roadmap.
1,135
BAD
Apple passwords deserve an app (cabel.com) A Blog by Cabel Sasser

First, let's consider three random tweets: They already have. They definitely should. They seemingly won't, but why?

We all know that Apple has nice built-in password management in macOS and iOS. But very, very few people know that Apple's passwords can also: (And it all syncs across your devices for free?!)

Very few people know these things because Apple tucks all of their important password features away in weird little Settings panels instead of in a Proper Real App. I think this is a mistake. Passwords are productivity, not preferences. (heck yeah now im really sounding like a tech blogger!! what I mean is i deal with passwords more than i write docs in Pages, you get it, right)

Every Apple user should know about these features. As much as I love 1Password (and I do! despite big fears about their venture funding! oh no here I go againnn!), there are people who can't afford to pay a subscription for a password manager, or people who don't even know password managers exist in the first place. They deserve easy-to-use security too.

Here's an example of what happens when you jam critical functionality into weird-ass places: all that text is a big red flag reading "this thing should've been a proper UI." I can't blame the designer here. It's not their fault. Since someone decided passwords shouldn't be given a full proper UI, 2FA code setup was relegated to contextual menus and control-clicks. The designer knows the user is unlikely to ever find that on their own, so they have to give them a little hint somewhere. Why not when they click the Enter Setup Key button to manually do this? The problem is: 99% of users will never find that button in the first place, let alone see this text. (Bonus Panic UI Tip: people can't/won't read more than three (3) lines of static descriptive text.)

Pretty simple. In my dumble opinion, Apple should: The idea is not to replace third-party password apps, as I do not wish a Sherlocking on anyone. Those apps should, and currently do, offer more features than Apple ever will, like cross-platform support, team/family password sharing, etc. The idea is that every Apple user deserves, as a baseline feature in 2023, obvious, clear, powerful, and easy-to-use password management. I yield my time.

PS: there is one place where Apple provides a nice standalone iCloud Passwords management app with a fully resizable window. On Windows!?!

PPS: I dream of a future where Passkeys could make the password manager extinct. But it'll take time, replacing the ancient, user-unfriendly Keychain Manager in the bargain.

While we're at it, can we please get TOTP autofill on Chrome? It looks like the closest thing is this extension, but it's blocked on Macs: https://imgur.com/a/u6mrNNU I'm likely ignorant of constraints here, but boy does it feel like this extension already does most of the heavy lifting; why can't it autofill iCloud TOTPs into Chrome? Meanwhile, the reason Chrome's password manager seems insufficient here, AFAIK, is because it doesn't support 2FA codes at all. (Presumably by design.)

Two-factor codes should not really be stored alongside the username and password. This ideal may also be why Apple are not exposing their features along these lines in a more readily available way.

I suspect the main reason they haven't made it a standalone app is because they made Apple apps deletable a few years ago, and have likely seen people who don't realize they could benefit from something having deleted an app after phone setup.
Having it part of Settings guarantees it's on every iOS device. Having said that, I completely agree it's too buried, and too many people have missed what it can do for them.

It's not convenient to find a specific password, since you have to open the Settings app, swipe to find and press Passwords, and type the keyword. Luckily, I found the URL scheme for the workflow that Apple never tells you about: app-prefs:PASSWORDS&search=[keyword]
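That undocumented deep link is easy to wrap in a tiny helper. A minimal sketch, assuming the scheme is opened from somewhere Apple permits (such as a Shortcuts "Open URL" action), since app-prefs: is private and off-limits to sandboxed App Store apps:

    import Foundation

    // Minimal sketch: builds the undocumented app-prefs: deep link described
    // above. The scheme is private and unsupported, so treat this as
    // illustrative; it is typically fed to a Shortcuts "Open URL" action
    // rather than opened from a sandboxed App Store app.
    func passwordsSearchURL(for keyword: String) -> URL? {
        let escaped = keyword.addingPercentEncoding(
            withAllowedCharacters: .urlQueryAllowed) ?? keyword
        return URL(string: "app-prefs:PASSWORDS&search=\(escaped)")
    }

    // Example: passwordsSearchURL(for: "github") yields
    // app-prefs:PASSWORDS&search=github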
I am loving the irony of the fact that there is a standalone iCloud Passwords management app for Windows.

That app already exists. It's called Keychain Access on macOS.

Someone shared this shortcut with me: https://rmondello.com/passwords-shortcut/

Apple's password manager is awful for one reason: my most valuable accounts are all unlockable with my phone passcode, which I type in public all the time. If someone has my phone and passcode, they have everything.

The only reason I have a third-party 2FA app (2Stables Authenticator) is that I tried using Apple's built-in 2FA in the preferences password manager and it just wasn't convenient to access: too many clicks. If Apple was smart, they'd create a separate password/2FA app accessible from a Mac's menu bar, with an app on iOS.

One of my favorite menu Shortcuts in Ventura is this one, which I call iCloud Passwords: https://rmondello.com/passwords-shortcut/ It provides quick access to passwords in the iCloud keychain, like an app would. It is arguably cross-platform: https://apps.microsoft.com/store/detail/icloud/9PKTQ5699M62

What Apple ultimately wants is a FIDO-based passwordless experience. Hence it is highly unlikely that they will release a dedicated app to manage passwords.

I use the Apple passwords feature as well as 1Password, and you're spot on with why they should make this an app. The main reason I stick primarily with 1Password is performance. I have 1,700+ passwords in my Apple keychain/iCloud password thing and it's quite laggy. Also, if you're attempting to manage passwords that it highlights with security issues, it's about five clicks each to resolve. Bad UI.

You spend the entire article saying people don't KNOW that it's there! Yes, I am one of those people, and I re-read the article three times wondering WHERE IS IT? Please give us at least a hint of where this password feature might be hiding! Please!

Right there with you, buddy! This was a very frustrating article.

I can't believe there is no way to edit the name or URL of a password entry. If that were possible, I could be happy with the current Passwords feature.

I'm not convinced Apple will ever get this right. It seems pretty clear their security team take "security is a compromise between ease of use and security" to the extreme edge, to the point where it's often "let's deliberately break ease of use."

Apple sucks at the details anymore. Everything's becoming buggy enterprise software without an opinion. Copy/paste on the iPhone didn't exist until it was right. Nothing is great anymore, all just fine.

I agree it should be an app, but just not by Apple.

A standalone password app would be great. But I really want Apple to make a budget app. Something simple, clean.
1,137
BAD
Apple previews Live Speech, Personal Voice, and more new accessibility features (apple.com) PRESS RELEASE | May 16, 2023

Apple introduces new features for cognitive accessibility, along with Live Speech, Personal Voice, and Point and Speak in Magnifier. New software features for cognitive, speech, and vision accessibility are coming later this year.

CUPERTINO, CALIFORNIA. Apple today previewed software features for cognitive, vision, hearing, and mobility accessibility, along with innovative tools for individuals who are nonspeaking or at risk of losing their ability to speak. These updates draw on advances in hardware and software, include on-device machine learning to ensure user privacy, and expand on Apple's long-standing commitment to making products for everyone.

Apple works in deep collaboration with community groups representing a broad spectrum of users with disabilities to develop accessibility features that make a real impact on people's lives. Coming later this year, users with cognitive disabilities can use iPhone and iPad with greater ease and independence with Assistive Access; nonspeaking individuals can type to speak during calls and conversations with Live Speech; and those at risk of losing their ability to speak can use Personal Voice to create a synthesized voice that sounds like them for connecting with family and friends. For users who are blind or have low vision, Detection Mode in Magnifier offers Point and Speak, which identifies text users point toward and reads it out loud to help them interact with physical objects such as household appliances.

"At Apple, we've always believed that the best technology is technology built for everyone," said Tim Cook, Apple's CEO. "Today, we're excited to share incredible new features that build on our long history of making technology accessible, so that everyone has the opportunity to create, communicate, and do what they love."

"Accessibility is part of everything we do at Apple," said Sarah Herrlinger, Apple's senior director of Global Accessibility Policy and Initiatives. "These groundbreaking features were designed with feedback from members of disability communities every step of the way, to support a diverse set of users and help people connect in new ways."

Assistive Access Supports Users with Cognitive Disabilities

Assistive Access uses innovations in design to distill apps and experiences to their essential features in order to lighten cognitive load. The feature reflects feedback from people with cognitive disabilities and their trusted supporters, focusing on the activities they enjoy and that are foundational to iPhone and iPad: connecting with loved ones, capturing and enjoying photos, and listening to music.

Assistive Access includes a customized experience for Phone and FaceTime, which have been combined into a single Calls app, as well as Messages, Camera, Photos, and Music. The feature offers a distinct interface with high contrast buttons and large text labels, as well as tools to help trusted supporters tailor the experience for the individual they support. For example, for users who prefer communicating visually, Messages includes an emoji-only keyboard and the option to record a video message to share with loved ones. Users and trusted supporters can also choose between a more visual, grid-based layout for their Home Screen and apps, or a row-based layout for users who prefer text.
"The intellectual and developmental disability community is bursting with creativity, but technology often poses physical, visual, or knowledge barriers for these individuals," said Katy Schmid, senior director of National Program Initiatives at The Arc of the United States. "To have a feature that provides a cognitively accessible experience on iPhone or iPad, that means more open doors to education, employment, safety, and autonomy. It means broadening worlds and expanding potential."

Live Speech and Personal Voice Advance Speech Accessibility

With Live Speech on iPhone, iPad, and Mac, users can type what they want to say to have it spoken out loud during phone and FaceTime calls, as well as in-person conversations. Users can also save commonly used phrases to chime in quickly during lively conversation with family, friends, and colleagues. Live Speech has been designed to support millions of people globally who are unable to speak or who have lost their speech over time.

For users at risk of losing their ability to speak, such as those with a recent diagnosis of ALS (amyotrophic lateral sclerosis) or other conditions that can progressively impact speaking ability, Personal Voice is a simple and secure way to create a voice that sounds like them. Users can create a Personal Voice by reading along with a randomized set of text prompts to record 15 minutes of audio on iPhone or iPad. This speech accessibility feature uses on-device machine learning to keep users' information private and secure, and integrates seamlessly with Live Speech so users can speak with their Personal Voice when connecting with loved ones.

"At the end of the day, the most important thing is being able to communicate with friends and family," said Philip Green, board member and ALS advocate at the Team Gleason nonprofit, who has experienced significant changes to his voice since receiving his ALS diagnosis in 2018. "If you can tell them you love them, in a voice that sounds like you, it makes all the difference in the world, and being able to create your synthetic voice on your iPhone in just 15 minutes is extraordinary."

Detection Mode in Magnifier Introduces Point and Speak for Users Who Are Blind or Have Low Vision

Point and Speak in Magnifier makes it easier for users with vision disabilities to interact with physical objects that have several text labels. For example, while using a household appliance such as a microwave, Point and Speak combines input from the camera, the LiDAR Scanner, and on-device machine learning to announce the text on each button as users move their finger across the keypad. Point and Speak is built into the Magnifier app on iPhone and iPad, works great with VoiceOver, and can be used with other Magnifier features such as People Detection, Door Detection, and Image Descriptions to help users navigate their physical environment.

Additional Features Celebrating Global Accessibility Awareness Day Around the World

To celebrate Global Accessibility Awareness Day this week, Apple is introducing new features, curated collections, and more.
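Live Speech itself is a system feature, but the on-device text-to-speech it builds on is available to any app through AVFoundation. A minimal sketch of that public type-to-speak pattern, not Apple's Live Speech implementation:

    import AVFoundation

    // Minimal sketch of on-device text-to-speech via the public
    // AVSpeechSynthesizer API; this illustrates the general type-to-speak
    // pattern, not Apple's Live Speech implementation.
    final class PhraseSpeaker {
        // Keep the synthesizer alive for the duration of playback.
        private let synthesizer = AVSpeechSynthesizer()

        func speak(_ phrase: String) {
            let utterance = AVSpeechUtterance(string: phrase)
            utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
            utterance.rate = AVSpeechUtteranceDefaultSpeechRate
            synthesizer.speak(utterance)
        }
    }

    // Example: a saved phrase a user might trigger during a call.
    // PhraseSpeaker().speak("I'll be there in five minutes.")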
Press Contacts: Will Butler, Apple (willbutler@apple.com); Eric Hollister Williams, Apple (e_hollisterwillia@apple.com); Apple Media Helpline (media.help@apple.com)
1,138