200
All data on Google Play Music will be deleted on February 24
Popular app Google Play Music has been shut down, and all Google Play Music data will be deleted on February 24. Customers who want to keep their data can transfer it to YouTube Music before then; there are also download and delete options. Google Play Music was discontinued in December 2020, and Google has already informed customers via e-mail that all their data will be deleted from Google Play Music on February 24, so those who want to keep it should download or transfer the data now. Essentially, everything in the Google Play Music library, including uploads and purchase information, will be deleted, and after February 24 there will be no way to recover any of it.

How do you transfer data to YouTube Music? Users can go to music.google.com or open the Android or iOS version of the app, where a Transfer to YouTube option is shown. Clicking it opens YouTube Music, and the data transfer then takes place. Playlists, songs, albums, other purchased songs, and downloaded or uploaded songs can all be transferred. There is also a Manage your music option, which allows customers to download the music library. In addition, customers have the option to delete whichever parts of their history they choose, or to delete the entire Google Play Music library. A customer who wants to download a music library needs to use Google Takeout, which keeps a copy of all the data in Google Play Music.

The process of shutting down Google Play Music began in October last year, and the app was officially shut down in December.
https://medium.com/@lotbd/all-data-on-google-play-music-will-be-deleted-on-february-24-4ab3b4c20989
['Lot Bd']
2021-02-22 10:16:40.370000+00:00
['Technical Analysis', 'Technews', 'Technology', 'Technology News', 'Tech']
201
Tools for using Kubernetes
Tools for a team of any level to realize a container architecture.

Kubernetes, the container orchestration tool originally developed by Google, has become the de facto standard for Agile and DevOps teams. With the advance of ML, Kubernetes has become even more important to organizations. Here, we have summed up a list of tools that can be used to realize a container architecture across different phases and maturity levels in enterprise organizations.

Kubectl
The most important area for DevOps is the command line. Kubectl is the command line tool for Kubernetes that controls the Kubernetes cluster manager. Under kubectl there are several subcommands for more precise cluster management control, such as converting files between different API versions or executing container commands. It is also the basis of many other tools in the ecosystem:
kuttle: kubectl wrapper — Kubernetes wrapper for sshuttle
kubectl sudo — run Kubernetes commands with the security privileges of another user
mkubectx — run a single command across all your selected Kubernetes contexts
kubectl-debug — debug a pod via a new container with troubleshooting tools pre-installed

Minikube
The next important area is development. Minikube is a great Kubernetes tool for development and testing; teams use it to get started and build POCs on Kubernetes. It can be used to run a single-node Kubernetes cluster locally for development and testing. Plenty of Kubernetes features are supported on Minikube, including DNS, NodePorts, ConfigMaps and Secrets, dashboards, container runtimes (Docker, rkt, and CRI-O), enabling CNIs, and Ingress. There is a step-by-step guide for a quick and easy installation.

KubeDirector
Once the team has built extensively, it will need to scale out its clusters. KubeDirector brings enterprise-level capabilities to Kubernetes: it uses the standard Kubernetes facilities of custom resources and API extensions to implement stateful scale-out application clusters. This approach enables transparent integration with user/resource management and with existing clients and tools.

Prometheus
Every team needs operational metrics to define operational efficiency and ROI. Prometheus can be leveraged to provide alerting and monitoring infrastructure for Kubernetes-native applications. Prometheus, a Cloud Native Computing Foundation project, is a systems and service monitoring system. It collects metrics from configured targets at given intervals, evaluates rule expressions, displays the results, and can trigger alerts if some condition is observed to be true. Prometheus provides the infrastructure, but for metric analytics, dashboards, and monitoring graphs, Grafana is used on top of it.

Skaffold
Once the team has a repeatable containerization process with metrics and alerting, CI/CD becomes the next phase of development. Skaffold is a command line tool that facilitates continuous development for Kubernetes applications. It helps the team iterate on application source code locally and then deploy to local or remote Kubernetes clusters. Skaffold handles the workflow for building, pushing, and deploying your application, and it also provides building blocks and describes customizations for a CI/CD pipeline. CI/CD will require test automation as well; the test-infra repository contains tools and configuration files for the testing and automation needs of the Kubernetes project.

KubeFlow
Once the products gather huge amounts of data, data pipelines and data products can be built for these applications. Kubeflow is a Cloud Native platform for machine learning based on Google's internal machine learning pipelines.
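As an aside, everything kubectl shows comes from the same Kubernetes API server, so it can also be queried programmatically. Below is a minimal sketch using the official JavaScript client, @kubernetes/client-node, which the article itself does not mention; the promise-style calls shown assume a pre-1.0 release of that client, so treat this as an illustration rather than a drop-in snippet.

// npm install @kubernetes/client-node
const k8s = require('@kubernetes/client-node');

const kc = new k8s.KubeConfig();
kc.loadFromDefault(); // reads the same kubeconfig file that kubectl uses

const core = kc.makeApiClient(k8s.CoreV1Api);

// Roughly equivalent to `kubectl get pods -n default`
core.listNamespacedPod('default')
  .then((res) => {
    for (const pod of res.body.items) {
      console.log(pod.metadata.name, pod.status.phase);
    }
  })
  .catch((err) => console.error('cluster request failed:', err));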
https://medium.com/acing-ai/tools-for-using-kubernetes-84d47a73ef2e
['Vimarsh Karbhari']
2020-06-11 13:29:53.152000+00:00
['Containers', 'Data Science', 'Artificial Intelligence', 'Technology', 'Data Engineering']
202
BootstrapVue — Spin Button. BootstrapVue has an easy to use number…
Photo by Mike Benna on Unsplash

To make good-looking Vue apps, we need to style our components. To make our lives easier, we can use components with styles built in. In this article, we'll look at how to add a spin button.

Spin Button
A spin button is a numerical range form control. It has been available since BootstrapVue 2.5.0. We can include it with the b-form-spinbutton component. For example, we can write: <template> <div id="app"> <b-form-spinbutton v-model="value" min="1" max="100"></b-form-spinbutton> <p>Value: {{ value }}</p> </div> </template> <script> export default { name: "App", data() { return { value: 0 }; } }; </script> We set the minimum and maximum allowed values with the min and max props. v-model binds the inputted value to value. Then we see a form control with a minus button on the left and a plus button on the right. The number entered is shown in the middle, and we also display the value of value in the p element.

Step
The step prop lets us change the increment of the minus and plus buttons. For example, we can write: <template> <div id="app"> <b-form-spinbutton v-model="value" step="0.5" min="1" max="100"></b-form-spinbutton> <p>Value: {{ value }}</p> </div> </template> <script> export default { name: "App", data() { return { value: 0 }; } }; </script> Then the minus and plus buttons decrement and increment the value by 0.5 respectively.

Number Wrapping
By default, once the plus button reaches the max value, clicking it again does nothing; the minus button behaves the same way at the minimum. Instead of doing nothing, we can add the wrap prop so that the number wraps around: <template> <div id="app"> <b-form-spinbutton wrap v-model="value" min="1" max="5"></b-form-spinbutton> <p>Value: {{ value }}</p> </div> </template> <script> export default { name: "App", data() { return { value: 0 }; } }; </script> Now when we click the plus button and the value goes beyond the maximum, it'll go back to 1. Likewise, if we decrement below the minimum value, we'll go back to 5.

Size
We can change the size of the control with the size prop. For example, we can write: <template> <div id="app"> <b-form-spinbutton size="sm" v-model="value" min="1" max="5"></b-form-spinbutton> <p>Value: {{ value }}</p> </div> </template> <script> export default { name: "App", data() { return { value: 0 }; } }; </script> Then the input will be small. We can also set size to 'lg' to make it large.

Inline
The inline prop makes the input display inline. For instance, we can write: <template> <div id="app"> <b-form-spinbutton inline v-model="value" min="1" max="5"></b-form-spinbutton> <p>Value: {{ value }}</p> </div> </template> <script> export default { name: "App", data() { return { value: 0 }; } }; </script>

Vertical
We can make the layout vertical by using the vertical prop: <template> <div id="app"> <b-form-spinbutton vertical v-model="value" min="1" max="5"></b-form-spinbutton> <p>Value: {{ value }}</p> </div> </template> <script> export default { name: "App", data() { return { value: 0 }; } }; </script>

Width
We can control the width with utility classes like w-25 or w-50 (see the sketch at the end of this article).

Number Formatting and Locale
We can format the numbers our way. To set the locale, we can use the locale prop. For example, we write: <template> <div id="app"> <b-form-spinbutton locale='fr-ca' v-model="value" step="0.1" min="1" max="5"></b-form-spinbutton> <p>Value: {{ value }}</p> </div> </template> <script> export default { name: "App", data() { return { value: 0 }; } }; </script> Then we see the French Canadian representation of decimal numbers. Also, we can use the formatter-fn prop to run a function that formats the displayed number our way. For instance, we can write: <template> <div id="app"> <b-form-spinbutton :formatter-fn="fruitFormatter" v-model="value" wrap min="0" :max="fruits.length - 1" ></b-form-spinbutton> <p>Value: {{ value }}</p> </div> </template> <script> export default { name: "App", data() { return { value: 0, fruits: ["apple", "orange", "grape"] }; }, methods: { fruitFormatter(value) { return this.fruits[value]; } } }; </script> We set max to fruits.length - 1, min to 0, and add the wrap prop so that we can cycle through the indexes of the array in fruitFormatter. value is the number that's inputted, which is the index, so we see the fruits entries displayed.

Photo by Katerina Kerdi on Unsplash

Conclusion
We can do a lot with the spin button, which lets us display a numeric input with increment and decrement buttons. The inputs can be formatted, and the locale can be set.
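Picking up the Width note above: w-25 and w-50 are plain Bootstrap width utility classes (25% and 50% of the parent's width), so they can be applied directly to the component. Here is a minimal sketch in the same pattern as the examples above; it is my own illustration, not code from the original post:

<template>
  <div id="app">
    <!-- w-50 is a standard Bootstrap width utility class -->
    <b-form-spinbutton class="w-50" v-model="value" min="1" max="5"></b-form-spinbutton>
    <p>Value: {{ value }}</p>
  </div>
</template>
<script>
export default {
  name: "App",
  data() {
    return { value: 0 };
  }
};
</script>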
https://medium.com/javascript-in-plain-english/bootstrapvue-spin-button-ddada4204b6f
['John Au-Yeung']
2020-06-30 15:43:44.100000+00:00
['Programming', 'Web Development', 'JavaScript', 'Software Development', 'Technology']
203
Jasmine — Test Array, Strings, and Time-Dependent Code
Photo by Elena Mozhvilo on Unsplash

Testing is an important part of JavaScript. In this article, we'll look at how to create more complex tests with Jasmine.

Array Checks
We can use the jasmine.arrayContaining method to check the contents of an array. For example, we can write: describe("jasmine.arrayContaining", function () { let foo; beforeEach(function () { foo = [1, 2, 3]; }); it("matches arrays with some of the values", function () { expect(foo).toEqual(jasmine.arrayContaining([3, 1])); expect(foo).not.toEqual(jasmine.arrayContaining([6])); }); describe("when used with a spy", function () { it("is useful when comparing arguments", function () { const callback = jasmine.createSpy('callback'); callback([1, 2, 3, 4]); expect(callback) .toHaveBeenCalledWith( jasmine.arrayContaining([4, 2, 3]) ); expect(callback) .not .toHaveBeenCalledWith( jasmine.arrayContaining([5, 2]) ); }); }); }); We use the jasmine.arrayContaining method to check for the numbers in the array. It returns true if the array being checked has all the items in the array we passed into arrayContaining. We can do the same with function arguments.

String Matches
We can use jasmine.stringMatching to check whether a string has a given substring or pattern. For instance, we can write: describe('jasmine.stringMatching', function () { it("matches as a regex", function () { expect({ foo: 'baz' }) .toEqual({ foo: jasmine.stringMatching(/^baz$/) }); expect({ foo: 'foobarbaz' }) .toEqual({ foo: jasmine.stringMatching('bar') }); }); describe("when used with a spy", function () { it("is useful for comparing arguments", function () { const callback = jasmine.createSpy('callback'); callback('foobarbaz'); expect(callback) .toHaveBeenCalledWith( jasmine.stringMatching('baz') ); expect(callback) .not .toHaveBeenCalledWith( jasmine.stringMatching(/^baz$/) ); }); }); }); We call the jasmine.stringMatching method with a regex and with a string, which lets us check for a regex pattern and for a substring.

Asymmetric Equality Tester
We can create our own equality tester to do our checks. For instance, we can write: describe("custom asymmetry", function () { const tester = { asymmetricMatch(actual) { return actual.includes('bar'); } }; it("dives in deep", function () { expect("foo,bar,baz,quux").toEqual(tester); }); describe("when used with a spy", function () { it("is useful for comparing arguments", function () { const callback = jasmine.createSpy('callback'); callback('foo,bar,baz'); expect(callback).toHaveBeenCalledWith(tester); }); }); }); We created our own tester object with an asymmetricMatch method to check for what we want. We just return a boolean expression with the condition we want to check. Then we can use it with toHaveBeenCalledWith or toEqual. Therefore, we can check both values and arguments.

Jasmine Clock
We can test time-dependent code with the jasmine.clock method. For example, we can write: describe("jasmine clock", function () { let timerCallback; beforeEach(function () { timerCallback = jasmine.createSpy("timerCallback"); jasmine.clock().install(); }); afterEach(function () { jasmine.clock().uninstall(); }); it("causes a timeout to be called synchronously", function () { setTimeout(function () { timerCallback(); }, 100); expect(timerCallback).not.toHaveBeenCalled(); jasmine.clock().tick(101); expect(timerCallback).toHaveBeenCalled(); }); }); We call jasmine.clock().install() to make Jasmine mock the time, and jasmine.clock().uninstall() to remove the Jasmine clock afterwards. In our test, we can advance the time to whatever we want and then do our checks after some amount of time has elapsed.

Photo by Simon Rae on Unsplash

Conclusion
We can test time-dependent code with Jasmine's clock methods. We can also check array and string contents with built-in functions, and we can make our own matchers to test whatever we want.
https://medium.com/dev-genius/jasmine-test-array-strings-and-time-dependent-code-29397e7e04fd
['John Au-Yeung']
2020-11-08 18:24:56.039000+00:00
['Programming', 'Technology', 'JavaScript', 'Software Development', 'Web Development']
204
Genaro Network (GNX) Bi-Weekly Tech Report
[Highlights]
1. The storage settlement extension of the alliance chain is established;
2. The new bridge design is completed; in addition to the original encrypted file upload, a public file upload option has been added;
3. The mining pool is functionally tested.

Work completed in this period:
Tested the shadow account and the mining bonus account after the update of the public chain program, confirming that the load is stable and operation is smooth;
Ensured that the shadow account has the same mining rights as the mining account, that block synchronization is stable, and that node assets are secure;
Fabric browser testing and use: block query function, data pipeline function, data contract function;
Checked account balance, public key, synchronized transaction data, transaction time, assets, transferor/receiver, handling fee, etc.;
Quotum cost minimization and the building and user testing of the dual-node binding mode, 40% complete;
Universal service transfer of the SDK chain code and high load balancing, completed.

Users are able to upload encrypted files, upload unencrypted files, and resume interrupted transfers.

Upload files:
1. Not encrypted. This option is used to upload non-sensitive data, or data that the user has already encrypted, so as to avoid unnecessary waste of resources and also enable fast uploading;
2. Encrypted, using symmetric encryption. The system automatically generates a key, which is encrypted with the user's asymmetric key and saved to the server. Symmetric encryption does not affect the file size.

Authorized upload
Authorization via the bridge determines whether data is accepted into the network. Proactive authorization is used, in consideration of user-trustworthiness issues and limited resources, to protect the network.
1. Authorization/pre-allocation
A. If the size of the uploaded file is less than or equal to the user's remaining (bucket) space, the bridge temporarily locks space of the corresponding size and makes final confirmation after the upload is completed.
B. Otherwise, pre-allocation fails, prompting that the user has run out of space or should compress the file and try again.
2. Hash check
If the user does not choose to encrypt, the hash is calculated first (after compression), and the network is checked by hash to see whether the file already exists. If it exists, the file is registered as uploaded directly; that is, a quick upload.

Upload
Client file uploading:
1. Request a farmer list. The bridge returns the farmer list based on file size, the fixed sharding strategy, and other conditions. A farmer can appear in the list more than once, indicating the recipient of each fragment.
2. Request upload to a farmer. The farmer requests confirmation from the bridge.
3. Start uploading, with breakpoint resume supported.
4. After the upload is completed, the farmer returns the hash of the fragment. After the client confirms, it notifies the bridge.

Upload success
When all the shards have been uploaded successfully, the system automatically considers the file uploaded.

Mining pool updates:
Modified the mining pool's margin-adding mechanism, which had caused transactions to be blocked;
Fixed the bug where, after mining finishes, the mining pool ends the synchronization block according to the default trading block, so as to avoid multiple mining caused by synchronization information not arriving in time;
Designed the mining pool reward model and tested it internally;
Linked the mining pool to the test chain and synchronized test transaction data to ensure smooth transactions.
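The hash check above is essentially content-addressed deduplication. Here is a minimal Node.js sketch of the client side of that step; it is my own illustration, and the bridge call and response shape are placeholders, not Genaro's actual API (even the hash algorithm is an assumption):

const crypto = require('crypto');
const fs = require('fs');

// Stream the (compressed) file through a hash, as the flow above computes it.
function fileHash(path) {
  return new Promise((resolve, reject) => {
    const hash = crypto.createHash('sha256'); // algorithm is an assumption
    fs.createReadStream(path)
      .on('data', (chunk) => hash.update(chunk))
      .on('end', () => resolve(hash.digest('hex')))
      .on('error', reject);
  });
}

// Ask the bridge whether the hash already exists; if so, it is a "quick upload".
async function uploadOrQuickUpload(path, bridge) {
  const digest = await fileHash(path);
  if (await bridge.hasFile(digest)) { // hypothetical bridge endpoint
    return { quickUpload: true, digest }; // file already stored in the network
  }
  // Otherwise: request a farmer list and upload the shards as described above.
  return { quickUpload: false, digest };
}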
Breaking News: the first edition of the GSIOP protocol is officially released
On February 20, 2019, Singapore time, Genaro Network, the world's first smart data ecosystem with a Dual-Strata Architecture integrating a public blockchain with decentralized storage, officially released the first version of the GSIOP protocol. This is not only the product of nearly a year of hard work by Genaro's entire team of engineers; it also marks a new milestone for Genaro in the practical application of cross-chain technology.

Breaking News: G.A.O. (Genaro Alpha One) is officially launched
Genaro Network, the future smart data ecosystem for DApps, invites you to witness the new era of smart data, empowered by the revolutionary serverless interactive system!

Recommended reading: Genaro public network mainnet officially launched | Community Guide. Download the Technical Yellow Paper.

Genaro's latest releases, Genaro Eden and Genaro Eden Sharer, allow you to store your files in a more secure way and share your unused storage to earn GNX. Get Genaro Eden/Sharer for Linux, Windows, and macOS right now from the official website; the Git source repository is on GitHub.

Important: a warm reminder to our community members, please download Genaro Eden ONLY from our official website/GitHub and DO NOT trust any referral links or reposts from anyone; otherwise, we won't be able to guarantee the privacy and security of your data or protect you from scammers.

Genaro Eden, the first decentralized application on the Genaro Network, provides everyone with a trustworthy Internet and a sharing community.

Related publications: Genaro's Core Product Concept | Genaro Eden: Five Core Features | How Does Genaro's Technology Stand Out? | Genaro Eden Application Scenarios and User Experience | The Genaro Ecosystem | Matthew Roszak Comments on Release of Genaro Eden

About Genaro Network
The Genaro Network is the first smart data ecosystem with a Dual-Strata Architecture, integrating a public blockchain with decentralized storage. Genaro pioneered the combination of SPoR (Sentinel Proof of Retrievability) with PoS (Proof of Stake) to form a new consensus mechanism, ensuring stronger performance, better security, and a more sustainable blockchain infrastructure. Genaro provides developers with a one-stop platform to deploy smart contracts and simultaneously store the data needed by DApps. Genaro Network's mission is to ensure the secure migration of the core Internet infrastructure to the blockchain.

Official Telegram Community: https://t.me/GenaroNetworkOfficial
Telegram Community (Russian): https://t.me/GenaroNetworkOfficial_Rus
https://medium.com/genaro-network/genaro-network-gnx-bi-weekly-tech-report-13041fa33376
['Genaro Network', 'Gnx']
2019-07-21 19:39:23.292000+00:00
['Technology', 'Storage', 'Bitcoin', 'Dapp', 'Blockchain']
205
V8 engine: Why do we love it so much?
With every new vehicle release, it's almost a given that at least a certain percentage of people will comment, "They should've put an 8 cylinder engine in it." We are in a new era of ICE (internal combustion engine) vehicles, where a 6 cylinder engine will put out just as much power as an 8 cylinder, and a modern 6 cylinder will most definitely put out more power than a much older V8 engine. So why are people so hung up on the 8 cylinder engine? On paper, the bigger engine is much less efficient than the smaller one: the V8 is heavier, uses more fuel, takes up more space, and is more expensive to maintain. Well, if you haven't driven an 8 cylinder sports car, it will be really difficult to understand. The feeling the driver gets when pressing the pedal of a V8 is not something you can compare to a 6 cylinder. The moment you press the pedal, you feel the power the engine produces almost running through your veins. Suddenly everything around you slows down, and the only two things working together at that moment are you and that car. The pedal opens the throttle, the throttle lets air into the manifold, and the air travels into the cylinders through the valves, where the pistons move in perfect harmony to create optimal power. The cylinders open and close their valves, creating a fusion of air, fuel, and spark; this makes an explosion that produces the power needed to turn the drivetrain, and all that power is eventually transferred to the wheels. As the driver, you can almost feel all of this through the body of the car when you press that foot down, and you hear the engine roar in the cabin and out the back. This is why people love 8 cylinder engines so much. Companies are not letting us experience the same feeling with a 6 cylinder; there is a lag from the moment you press the pedal to when the car responds. We know that 6 cylinder engines are capable of so much more. The cars in the fastest motorsport in the world, F1 racing, are equipped with V6 engines: currently, the engine used in an F1 car is a turbocharged 1.6 liter V6. Also, if you didn't know, the engine Ford used at the 24 Hours of Le Mans in 2016 was a 3.5 liter turbocharged V6. So yes, these V6s have the power necessary to defeat a V8 engine; we've seen it time and time again with imports as well. Unfortunately, I believe the future of the V8 is a sad one. Just like the headphone jack or the house phone, we must evolve and let go of inefficient things. I can probably bet that the V8 will be phased out before the end of this decade. Manufacturers will move to highly powerful V6 engines or fully electric vehicles, the latter being more likely. At this moment, the only company we can count on to keep giving us that adrenaline is Dodge Chrysler Jeep; they are the only company that continues to develop more powerful V8 engines every couple of years. If there's any hope left for a V8-powered future, then FCA is that silver lining. Thank you, stay buzzed.
https://medium.com/@bernardo.escobedo/v8-engine-why-do-we-love-it-so-much-c79985d47136
['Bernardo Escobedo']
2020-07-25 22:11:56.751000+00:00
['Automotive', 'Broncos', 'Tradition', 'Cars', 'Technology']
206
The Augmented Reality (almost) monthly — December 2020 edition
Siemens Energy selects Librestream's Onsight Augmented Reality platform for its Connected Worker solution

I'm very late publishing this article, so you will find interesting news here going back to October. I have chosen to keep the French-language news from my original selection (available on augmented-reality.fr) because it lets us discover many very clever "local uses". So, what will you find below? First, many announcements of AR glasses. It may seem strange, but it is a very clear sign of the maturity of B2B uses, especially in industry for control and remote expertise. If you remember, we saw the same thing at the 2018 edition of CES, just after the "Peak of Inflated Expectations" that followed the ARKit and ARCore announcements. Is it a better time now? Yes, because after a year of difficult meetings and travel, many industrial executives know the efficiency of AR for remote expertise and maintenance; you will find such examples below. Another interesting fact: we have seen many retail uses of AR over the last six months. Again, the pandemic is the obvious cause. It's difficult to convince people to go into shops (when they even can), and if you're a retailer it's also difficult to fight against "pure internet players". That's why big brands and small shops are exploring more interactive experiences for customers, including AR. It's a pity that other sectors, like culture, tourism, or even food, are globally very shy today. Some big players have made experiments and could be the big winners of 2021. Anyway, let's go to 2021, it's almost time! I wish you the best possible end-of-year festivities. Take a flute of Champagne and see you next year. À la vôtre (cheers)!

Interesting uses
Augmented Reality paper cubes to teach computational thinking
Verizon brings Smithsonian artifacts into AR, invites teams to redefine museums | VentureBeat
Google showcases miniature Indian artworks using AI and AR | blooloop
Tear Down The Berlin Wall Wherever You Are With Augmented Reality
Mixed Reality Revolution: Out-of-Home! — VRFocus
Des livres qui prennent vie grâce à la réalité augmentée
Inès Alpha : « L'étrange est une forme de beauté qui me touche beaucoup »
BLAM app lets users erect augmented reality statues of historical black figures

More retail
See Coca-Cola's Iconic Christmas Truck In Augmented Reality — ar.rocks
19 Crimes Uses Augmented Reality to Differentiate Their Brand Through Innovation | PTC
Des filtres Snapchat pour Lacoste avec National Geographic — Image — CB News
Wonder Partner's installe le Père Noël en réalité augmentée dans les centres commerciaux — Le Journal des Entreprises — Loire-Atlantique — Vendée
Web AR Scares Up an Augmented Halloween — AR Insider
Augmented reality boosts conversion for Home Depot
Louis Vuitton Unveils Men's Temporary Residency in Miami | Complex
L'Oréal Paris propose des filtres de maquillage virtuel pour vos visioconférences — BDM

More industry
NASA's Using Augmented Reality to Transform Air Traffic Management | NASA
'The future is now': Kiremko Remote Service starts up new Moroccan french fry line remotely — Potato News Today
Siemens Energy selects Librestream's Onsight Augmented Reality platform for its Connected Worker solution | Auganix.org
SpringCity create #augmentedreality workflow systems (LinkedIn)
Tomra releases Visual Assist to boost remote assistance | The Packer
TechSee raises $30 million to streamline field service work with AR and computer vision | VentureBeat
Potential of Augmented Reality Platforms to Improve Individual Hearing Aids and to Support More Ecologically Valid Research : Ear and Hearing
Holopatient Remote Uses AR Holograms For Hands-On Medical Training — VRScout
Audi is using augmented reality to increase efficiency in logistics planning | Audi MediaCenter

News from providers (hardware and software)

More hardware
Mojo Vision teams up with optics leader Menicon to develop AR contact lenses | VentureBeat
Microsoft launches HoloLens 2 Development Edition in the U.S. | Windows Central
Introducing the new Google Meet experience for Glass Enterprise Edition 2 | Google Cloud Blog
PhotonLens Unveils Snapdragon XR2-Powered Mixed Reality Glasses — AR Insider
ARQUUS renews its connected glasses offer — Armada International
Vodafone Partners with Nreal to Bring 5G Mixed Reality Glasses to Germany
Epson announces a new generation of Moverio smart glass technology — Epson
Oppo Unveils Smartphone-tethered AR Glasses, Push for Content Coming in 2021 — Road to VR
Lidar on the iPhone 12 Pro: What it can do now, and why it matters for the future of AR, 3D scanning and photos — CNET
Le LiDAR des iPhone 12 Pro d'Apple veut séduire les entreprises — Le Monde Informatique
Charting a Path to Viable Consumer AR Glasses, Part I
Tilt Five raises $7.5M in new funding to change augmented reality gaming — SiliconANGLE

More software
TeamViewer integrates its 'Pilot' Augmented Reality solution into latest update of its 'Tensor' enterprise platform | Auganix.org
Korean startup Letsee to launch a global version of Web-AR solution
Flipkart Group acquires augmented reality startup Scapic
Unity Announces Unity Reflect Now Supports Autodesk BIM 360 for seamless AR/VR experiences | Business Wire
Snap uses iPhone 12 Pro's lidar sensor for AR imagery | VentureBeat
Augmented Reality on the Web with Model Viewer | by Nicolò Carpignoli | Chialab Open Source | Oct, 2020 | Medium
To own an AR future, Niantic wants to build a smarter map of the world | TechCrunch
How Magic Leap is trying to reinvent itself (and AR) for 2021 | WIRED UK

Vision and evolution studies
Augmented Reality for Ecommerce: Is It Useful Yet?
Augmented Reality Must Have Augmented Privacy | Electronic Frontier Foundation
Mobile AR Users Approach 600 Million — AR Insider
Quelle sera notre identité dans les mondes virtuels ? — Laval Virtual

Projects (you have to know)
Darlene : DEEP AR LAW ENFORCEMENT ECOSYSTEM
Purdue Exploring Augmented Reality for Workforce Training — Inside INdiana Business
https://medium.com/@gmaubon/the-augmented-reality-almost-monthly-december-2020-edition-b1ad2c33e7a2
['Grégory Maubon']
2020-12-24 11:43:31.842000+00:00
['Industry', 'Remote Working', 'AR', 'Retail Technology', 'Augmented Reality']
207
Design systems: History to present and future potential
Unless you live on the moon, you have probably heard the term "design system"! This revolutionary concept has created a new balance in the design world. Years ago we couldn't imagine anything more helpful than pattern libraries, until atomic design became a thing and then matured into the concept of design systems we know today, bringing new core values: consistency, efficiency, and scale, and nothing less. So how did it all start, and where might it be going in the future?

Design systems history: from early art forms to actual systems
Generally speaking, the roots of the design field's history date from the first human interest in logo branding and typography. The history of design systems is long and full of household names. Without digging deep into details, I would like to sum up design systems history in phases.

1/ The humble beginnings of the design concept
It began humbly with the invention of movable type in the early 1400s, with the rise of neutral typography, rare headings, and iconic pictures. The term "graphic design" was invented by William Addison Dwiggins in 1922 to describe his process of designing books as a combination of typesetting, illustration, and design, and the field never stopped growing with the rise of the artistic movements of the 20th century.

2/ The industrial revolution and the attempts to mix art and technology
Some Italian artists thought it would be cool to merge the technology of the industrial revolution with the art of the future, so they started experimenting with typography, geometric forms, and color, which slowly evolved into defining a design process.

3/ The start of computed design and the challenges designers faced
With the emergence of computers in human life and the first steps toward digitalizing processes in administrations and government work, design took on another definition: a system of relationships between all aspects of a problem to solve, from shapes, sizes, and proportions to tools. Good appeal was hard to achieve, and mistakes and the lack of consistency became an insoluble problem.

4/ The invention of design guidelines and the appearance of the functional design approach
Guidelines and general aesthetics started being defined by the Swiss school of European modernism (from the 1950s onward) in an attempt to create a rational method and a codified approach that helps create better designs without over-relying on an individual designer's natural artistic talent. They categorized design problems and turned them into a series of simple constraints and rules. The move was followed by the use of the grid system to order page elements and the formulation of "functional design", and design programs emerged that functioned on strict modular principles.

5/ The evolution of UI/UX design
With the establishment of the age of software, the first graphical user interfaces grew out of the work of Douglas Engelbart, Alan Kay, and others, reaching commercial products by 1981, and they soon evolved with the natural evolution of human interaction with text and digital screens to become a very important and trendy field. This was soon followed, around 1990, by the development of UX design, centered on the user experience and the necessity of usability and user satisfaction. Both concepts were challenging to maintain successfully and required stricter design principles.

6/ Pattern libraries
Before design systems existed in UI/UX design, designers used pattern libraries to make their work more repeatable. Pattern libraries are collections of design elements that appear repeatedly on a site. The library helps define what elements look like and how they are coded, or provides UI/UX kits; yet it was still challenging to maintain consistency, and mistakes were often noticeable. Then came the framework boom of 2010, when a major shift happened in digital products: businesses started requiring early mobile apps, which ended up with huge problems of non-responsive UI and inconsistent UX.

7/ Atomic design
A question started rising: why bother visually designing every state of every screen when you can think more efficiently and operate by assembling UX/UI components? And so began the adoption of a new technical approach that makes design components operate as an atomic system. The concept is simple: you operate on atoms that form molecules that build organisms that can flexibly adapt to required change!

8/ The first design system
Google's Material Design was the first considerable design system. It debuted in 2014, leveraging the best possible design practices, based on the atomic approach while keeping the utility of a pattern library. The new method worked on providing better material design, unifying the visual language, and helping keep brand consistency: a concept that would mark the present.

Where are we today?
Over the last 20 years, design systems haven't stopped evolving, to the point where they have become a strategic move that every business aims to make. As a general definition, a design system is a collection of detailed design and front-end coding standards, made to be easy to use and reuse with complete efficiency, consistency, and scalability, allowing teams to build high-value digital products. The biggest leading firms invested in design systems simply because they ensure efficiency for everyone, operating like a bond of atoms creating molecules that combine into an organism and guaranteeing consistency, optimization, efficiency, and scale. Many think that the standardization and optimization design systems insist on rob designers of the ability to explore and experiment, yet they surely give teams the power to scale work and bundle hundreds of projects and users in a short time.

What's next?
The future of design systems is surely bright. They are expected to evolve into more powerful design systems that tackle different challenges and give more space for designer creativity. The number of users is rising day by day, so in the future we might see everyone using a design system, and those who are not will lose all competitive advantage. Many new features will be added: custom plugins are soon expected to keep themselves up to date automatically, so that teams can access elements or components quickly without having to worry about which element to use or where to pull it from. Design systems might become customizable or even personal, and why not payable through mobile apps or even smartwatches! Truly, we cannot really limit expectations!

Related Articles:
Basics you need to know about Atomic Design System
Design systems to change designer vision!
Reasons that make the Design Systems inevitable
The impact of design systems on adopter organizations & users
Design System: beyond just UI kits
Common Myths Surrounding the Design Systems in 2020
Design system team Management
Advantages of Having a Design System in Your Plans
Tips to create the Perfect Dark Themes in Software Design
Tips to Create Blurred Backgrounds in Software UI Designs
https://medium.com/cohort-work/design-systems-history-to-present-and-future-potential-9a2529805afd
['Rania Mdimagh']
2020-12-07 06:44:50.642000+00:00
['Design Thinking', 'Design', 'History', 'Technology', 'Design Systems']
208
6 Costly Numpy Mistakes to Avoid in Python
Let's learn from each other's mistakes
Photo by Timothy Dykes on Unsplash

Numpy is one of the most central libraries in Python, but we all make simple mistakes with it; we even make the mistakes we know we shouldn't, and still haven't really figured out how to broach the subject. I'm a pretty average programmer, and even now I still fumble around with many problems that I face in Python. So with that in mind, I decided to write down the common problems we face and how to go about them. It's embarrassing to have lived through these, but here we go!
https://towardsdatascience.com/6-costly-numpy-mistakes-to-avoid-in-python-570ad90b9982
['Mohammad Ahmad']
2020-11-27 18:30:50.966000+00:00
['Tech', 'Python3', 'Python', 'Technology', 'Data Science']
209
Why Privacy Might Be The Hottest Product of 2021
It's time to value your privacy, or someone else will.

"That is too much privacy for me, I don't like having control over my privacy," said no one ever. So why do we still grant companies our data so easily, often without real consent? The answer is simple: because they don't want you to think about it. The world's most valuable resource is no longer oil, but data. You don't need to own oil or gold today to be powerful; all that's necessary are data and the knowledge of how to control and distribute them.

Why Does Privacy Matter So Much?
Well, some might argue about the value of privacy. There are the privacy worshippers, who value seclusion, never post on social media, and don't seek many acquaintances, and there are their relative opposites, called "influencers", who like the exposure, share all of their moments with followers, and yes, the more publicity they get, the better. Companies don't work any differently: they turn your data into their money.

Bella Hadid via Shutterstock

Privacy and data breaches like the TikTok or Facebook & Cambridge Analytica scandals have become an opportunity to raise awareness about the importance of privacy for people and businesses. All of our activities leave a trail of data. That means that if you like to wander through the internet at 3 AM, BigTech platforms like Google or Instagram don't sleep; they track and store everything you went through, to further analyze the behavior of their product for future deals and customers. BigTechs don't want you to know that their main product is… you. They sell your data to third-party companies without your consent (it's a lot scarier than that; we will tackle this topic in the next article). For now just remember: if you're not paying for the product, you are the product.

Cybergem's Opinion On Data Privacy
(Gen Y is represented by a Data Strategist; Gen Z is represented by a student and tech enthusiast.)

Y: As a data executive, I see the possibilities of using data to improve customer convenience and customer experience. But as the saying goes, with great power comes great responsibility, and often a lot of companies do not behave responsibly in how they use data about customers, or data in general. As I pointed out in previous articles, data literacy is in decline, and customers are literally clueless about the possibilities of using data for or against them. This might change in the upcoming years as we see more and more data scandals and leaks that get pretty detailed media coverage. Therefore I believe that privacy will be a hot product in the upcoming years, both from the customer perspective and for companies that want to be responsible. We see indications of this movement from BigTechs like Apple, which might sound like hypocrisy, but their foresight is spot on. Think about it: would you be willing to pay a premium, e.g. for a bank account, knowing that you will not receive any "personalized" campaign, or that your data are not used at all? It is a no-brainer yes for me.

Z: More than 40% of people worldwide feel they lack control over their personal data, according to a survey by McAfee, and a third of parents have no clue how to explain online security risks to their children. That means the whole of Gen Z is at risk of having very little knowledge of data literacy. As a fresh high school graduate, the tension between privacy and publicity is all around me and my peers, all the time. I have been struggling to find a balance between the two for as long as I can remember. What I've realized is that publicity is way cheaper, in user-experience terms, than heightened privacy options. The same applies to products and services for me: if a company can't give me some level of privacy freedom, I won't use it. As privacy was a hot topic at CES 2020, I am highly positive the world will hear more about the privacy issues BigTechs are experiencing. There might be some big changes coming as people and customers get more literate in the data privacy sphere. The best defense you can possibly mount to stay "safe" from hackers and unnecessary data breaches is not to post sensitive data about yourself unless needed.

How Companies Turn Your Data Into Big Money
Companies monetize your data and are making money like bandits. Google generated $41.2 billion in revenue in Q1 2020 alone. You are the product that Google and Facebook monetize; their entire revenue model falls apart if they are not able to sell your data to advertisers.

via Shutterstock

Not only is your data taken from you, it is also used to influence your behavior. That's what ads are for. They slowly but surely customize you, not the other way around as it is broadcast to the public. "It's a gradual, slight, imperceptible change in your own behavior and perception that is the company's product," said Jaron Lanier.

How Do I Protect My Privacy?
CyberGem's advice: there are some effective ways to protect your privacy:
Check twice what you're posting or sharing with others (so it doesn't backfire on you in the future)
Encrypt your emails and messages (there's a tutorial on all of them, just click on the marked words)
Browse in incognito or private mode (DuckDuckGo's search engine is a great way to surf the internet without hidden trackers scooping up your personal information)
Read the terms of service before using a product (at least read the dumb cookie notice when you enter a website, and disable as many cookies as possible)
Ask why others need your information (last but definitely not least)

Conclusion
The world's most valuable resource is no longer oil, but data. Companies work with your data more than you might think; at the end of the day, it's your data they turn into money. Our goal is to enhance the importance of data literacy, and this is one of the times we say: treat your data as delicately as you would treat yourself, just as you wouldn't tell your blind date your home address and credit card information just because you find them appealing. Risky business is a great strategy sometimes, but this ain't it; don't overshare your data, don't give it away so easily.

via Shutterstock

Do the "boring" things sometimes: read cookie notices and terms of service, turn off the personalized ad experience, encrypt your messages with apps like Signal or Threema, make sure you turn off Web & App Activity in your apps, and for God's sake, don't use the same password everywhere! Everyone should understand the basics of data privacy and learn how to protect themselves. Don't fool yourself into thinking that data protection laws will keep you safe.
https://medium.com/carre4/why-privacy-might-be-the-hottest-product-of-2021-c2192c8011a8
[]
2020-09-17 15:00:17.912000+00:00
['Privacy', 'Data', 'Technology', 'Education', 'Social Media']
210
SpaceChain Introduces Programmable Hardware Board for Developing Blockchain Applications that can be Deployed in Space
The SpaceChain Callisto development kit enables developer communities worldwide to participate in advancing next-generation decentralized infrastructures for blockchain and fintech applications using space technologies.

SINGAPORE, April 14, 2021. SpaceChain today introduced SpaceChain Callisto, its first open-source demo hardware board designed for developing blockchain applications that can be used and deployed in space. SpaceChain Callisto comes pre-installed with, and runs on, Linux and the SpaceChain Operating System (SPC OS) to help accelerate space technology development, and it serves as the backbone for SpaceChain's payload launches and missions.

SpaceChain Callisto enables developer communities worldwide to participate in building out next-generation decentralized infrastructures for blockchain and fintech applications using space technologies. More importantly, it signifies a strategic step toward democratizing the space industry and fostering the commercialization of outer space. Callisto is configured similarly to the on-orbit payload currently installed on the International Space Station. Developers will have the chance to create game-changing applications that can leverage blockchain-related functions, including running smart contracts and performing multi-signature transactions, and to run tests through their computers to determine whether the applications they created would ultimately work in space.

"The world is entering an age of software-defined satellites that can be configured to perform different tasks by simply uploading an application or program, much akin to a mobile phone," said Zee Zheng, SpaceChain co-founder and CEO. "By opening up access to space and satellite technologies through increasing the number of players and contributors, more opportunities for collaborative work can be created and lead to new businesses and socio-economic models that were once impossible."

SpaceChain's co-founder and CTO Jeff Garzik executed the first multisignature blockchain transaction in space in August 2020. Less than a year later, SpaceChain is yet another step closer to removing barriers and allowing a global community to access and collaborate in space, all while remaining secure and immutable through proven blockchain cryptography.

"We named our product, Callisto, after the second-largest moon, or natural satellite, of Jupiter, to represent how a mass that is millions of kilometers from Earth can be held within our grasp," said Jeff Garzik, SpaceChain co-founder and CTO. "SpaceChain's open-source demo hardware board is proof of our efforts to bring our disruptive technology to the masses as we continue to discover more commercial use cases for blockchain-based satellite networks in space. We are one step closer to making the SpaceChain OS available to anyone, anywhere in the world."

More information about getting started with SpaceChain Callisto and how to use the board can be found at https://spacechain.com/callisto/ and https://github.com/spacechain/SpaceChain-Board.
https://medium.com/blogspacechain/spacechain-introduces-programmable-hardware-board-for-developing-blockchain-applications-that-can-f92c3d96ba4c
['Spacechain Foundation']
2021-04-13 02:36:00.890000+00:00
['Space', 'Articles', 'Hardware', 'Cryptocurrency', 'Technology']
211
Technology: Apple Products
Apple Products: What Makes Them Special?

This year, Apple launched its newest iPhone SE model, which has a resolution of 750 by 1334 pixels and a 4.70-inch touch screen display. There are people who will run into queues to get themselves this new phone, but the reason I write this is that I kept seeing friends put down their old phones to buy this new product from Apple. It may sound ridiculous that someone would give away a phone just to be among the first to get a new one. As a technology researcher, Apple products are something I admire and respect among other brands. Numerous fans wait to promote and buy any Apple product until nothing is left; these fans will buy everything, no matter the price tag or cost involved. Apple is known and accepted worldwide and still tops several other brands around the world. Wow!

How are they achieving this, year after year?
They know the market's preferences
They have a consistent plan
They understand their purpose
They innovate their products
They are flexible
They lift their customers' status
They invent new ideas

I did a quick data poll to find out why Apple has more customers, and the outcome includes: they provide much higher-quality devices, and their products simply deliver perfect results. In terms of functionality, Apple products are very attractive and user-friendly. The latest from Apple is the design of 5G phones, using the highest network speed and performance. Apple is widely known for producing easy-to-use products as technology gradually advances; Apple is focused on creating user-friendly products and gadgets for its users. Apple is a company that always puts its workers' needs first in order to design the preferred product.

One thing that Apple does consistently is keep a consistent brand. This can change as time moves on, but I can tell their changes won't differ from their trends. Apple doesn't spend much money advertising for new customers in many countries, yet it sells more than you can imagine. This means it has the trust of the public: you could be watching an Apple advert and foretell what it is before they show the product. All this shows how customers follow Apple; you'll know it's Apple without asking, whenever you set your eyes on any of their products. These significant trends run throughout every channel they have. You can expect the same customer experience when you visit their website as you would if you were visiting one of their stores. Their clean, modern intentions are easily recognized and seen across everything they do; this really helps people become familiar with them, no matter which channel you choose to connect with them.

Apple is doing very well, and several other phone companies are learning from and copying their brand style. A company has to keep evolving; this encourages success in various competitions. Apple will always be flexible and grow with time, but will not become difficult to understand and use. They will always bring something new and special. They choose to make and design products for the market, ensure the product is simply good, and satisfy all users. They do not stick to old ways and fashions but evolve with time and flexibility in product development. There are several researchers and strategic workers who continuously plan to develop new things for the market and users.

Apple is famous for:
Computer products
Phones
Television
Music
Designs, among others

Apple enhances user status and prestige; people feel big and proud whenever they own an Apple product, and it makes users feel like they are better off and successful in life. The descriptions and information they provide are very powerful and enticing to the human way of life; they drive growth, improvement, passion, style, and success. Followers look up to them for exclusive marketing strategies, and businesses draw motivation from them. Whenever you are seen using Apple products, you indirectly share and represent their success and ideals.

When you see the comments and feedback from Apple users, one single point stands out: powerful design and products. The features work so well, and their performance exceeds imagination. Every image and aired advert shows the sleek design of their products. It's all about telling a powerful story that will make your audience want to be a part of everything you do. Whether you think you can or think the opposite, you will decide on your option. In the world of business there are several controversies, but Apple does not rely on individual guesses and sayings; it focuses on doing enough research and testing before creating any new product. It has a group of researchers dedicated to understanding innovations and future plans. Its designers and engineers work as a team, with the self-satisfying intention of producing goods they would want themselves. The product is made in a way that users cannot live without, and the producers also have much interest in using the product often.

Behavior of Apple
There are many channels and directions through which Apple hears from customers and communes with them about upcoming innovations. They are designing plenty of hardware and business opportunities for people around the world. They recruit people from different countries and come up with fast and slow approaches to meet a target. Some plans take years to finish while others take days or months; their behavior is directed at making technology free and better for all. Difficult tasks that may seem impossible to some technology companies are being worked on by Apple innovators and researchers. They do not rush to develop new products; they spend time trying to perfect their products to make sure they hit the mark. Apple scraps and remodels many products; not every phone from Apple comes out, which shows the years they spend on new, ongoing products before they are certain enough to release them to the public. To deliver very good work as a producer and business organization, you need to make time to do so.
https://medium.com/@jonesahiati/technology-apple-products-fddb1e5c6f3a
[]
2020-12-01 14:20:26.502000+00:00
['Academia', 'Technology', 'Apple', 'International', 'Fashion']
212
Tech Must ‘Get Uncomfortable’ With Its Impact on Society: An Interview With Swati Mylavarapu
OneZero is partnering with the Big Technology Podcast from Alex Kantrowitz to bring readers exclusive access to interview transcripts with notable figures in and around the tech industry. This week, we’re joined by Swati Mylavarapu, a founder at Incite. This interview has been edited for length and clarity. To subscribe to the podcast and hear the interview for yourself, you can check it out on Apple Podcasts, Spotify, and Overcast. Swati Mylavarapu is a tech investor and activist who spent $2 million in the 2020 election cycle on Democratic causes, in partnership with her husband, Nest co-founder Matt Rogers. Mylavarapu isn’t your typical Silicon Valley investor. She’ll explicitly admit that the tech industry has some culpability in the hollowing out of the middle of our economy, delivering wealth to the few while leaving the rest in a tough spot. She also served as Pete Buttigieg's national finance chair in the 2020 Democratic primaries, playing a key role in his surprising upstart campaign. Mylavarapu joined the Big Technology Podcast fresh off a bout with Covid-19 to discuss the tech industry’s role in our society, and how it can be a force for good moving forward. Kantrowitz: We were actually scheduled to record last week. Do you want to fill us in on what your past couple of weeks have been like? Mylavarapu: Yeah, it’s actually why I am literally excited to be here. My family and I contracted Covid, so we were grappling with the realities of that for most of the last two weeks, including over Election Day. But thankfully, relatively mild symptoms in our house, and we are on the mend and very lucky to have had access to great medical care, which is something that I wish every family had access to. And it was also such a reminder because we’ve been extremely careful, maintaining a strict quarantine, lockdown, not traveling, and it still managed to come into our house. It’s a reminder that this thing is very real and we need a better coordinative response out there than what we’ve got at the moment. You’re the first guest that we’ve had on that’s had it. I’m glad that you’ve recovered and are doing well. Thanks. We’re also really lucky to be in San Francisco because the city’s gone above and beyond to make testing super accessible, so that helped us. So, to start, there are people that call you a Silicon Valley power broker. I know you don’t like the term, but why do you think people say that? Oh, I think that’s an awful term. I suppose people say that because I do a fair amount of work really at the intersection of supporting breakout leaders, and increasingly that’s been in the political realm, not just in the startup and venture realm. And I think that I am too rare a bird in Silicon Valley. But part of what I’m excited to talk to you about is why I think it’s important that more of us do this kind of work like we’re doing at Incite to invest in early stage political leaders as well as early stage startup and technology builders. We spoke for a story a few weeks ago, and you seemed willing to say the way that tech accumulates wealth is part of the reason why have such a disconnect in the country right now. We’re living in this moment in America where the idea of market fundamentalism and unfettered capitalism is increasingly in question, and that is not just led from one political party. You see people on the conservative side, like Marco Rubio and Josh Hawley, starting to call for more conscious capitalism, to folks on the super progressive side as well. 
Tech is very much caught in the crosshairs of all of these conversations. And I think the best way for us to grapple with that is to start grappling with these hard questions about what we’re building and why. And [asking] who it benefits; where it concentrates power, and who loses out as a result of it. Because if we’re not having those conversations, the world outside of our industry is having them for us. Silicon Valley seems to have built a lot of platforms that end up accruing wealth to the few and taking opportunity from the many. One example is TurboTax. There used to be this whole class of accountants who would make a nice living doing families’ taxes, but then TurboTax came around; it was more efficient, but it harmed this important middle of the economy. Do you see this as a problem? It’s one of the biggest problems of our time. And it’s not something that sits squarely on the shoulders of technology companies in our industry, but it’s something that we very actively play a role in. Can you elaborate on that a little bit? Sure. Well, you’ve given a great example of what this might look like if you are, say, building an enterprise software company. It’s equally true if you are building a delivery service company, and you are starting to make decisions about who is a stakeholder in your business versus not and what is the relative value that a driver or delivery person participating on your platform has or not. It’s also something even more fundamental. Mary Gray is an anthropologist who was recently given the MacArthur Genius prize, and her research over the last couple of years has done groundbreaking work to talk about what she calls “ghost work.” And the idea that any major technology platform in the modern age—whether you’re Amazon and what they’ve done with Mechanical Turk to a gig-economy company or any large company like Facebook that’s using a huge distributed global workforce—what they have done to create an underclass, a global underclass, around the world. And some of these might seem like fringe ideas, but I think we’re starting to see the credence that this notion of class and caste is starting to take on in public conversations about what’s happening in America, and it’s extremely relevant to what we build in the tech industry and these forces of power and earnings asymmetry and the divide between capital and labor and how tech feeds into it. Going back to the TurboTax example, are we going to end up in a society where we do have a small percentage of people who have the wealth and everybody else struggling to get by? We have a lot of political unrest in this country, in large part because people feel that the system has left them maybe one expense away from economic catastrophe. Populism is in some ways a really powerful lens through which to view what’s happening in the United States at this moment, and if you think about technology as a way of supercharging some of the forces that are giving rise to populism, it can be really telling. I spent a lot of time last year in places like Iowa, where Donald Trump is about as popular as Bernie Sanders, and that should tell us something. It’s not this red versus blue, good versus evil; the predominant cleavage for a growing number of Americans is around who gets access and who doesn’t. So it’s interesting, you’re posing this as a question of technology building software that’s kind of an inverse Robin Hood, taking from the folks that can least afford it and accumulating wealth. 
I think it’s a little bit more complicated than that; it’s not just that certain stakeholders are getting to accumulate disproportionate amounts of money. It’s also what we’re taking away from so many people that provided for their families, [who] were able to put bread on the table and buy into the American dream of their children and their children’s children having a better life than their own. We are stripping that away, and we’re not offering a viable alternative. So I guess what you are pointing at is this is something that the political system needs to address? No, I don’t think this is something that sits just on the political system. I think that this is also a reality that the tech industry has to grapple with and ask questions around. At some point, these companies that we’re building, what is the point of them if they are not meaningfully improving the quality of life for people? Do you think people actually ask that question? I think we’re starting to more and more. And I think in our earliest days, those were the questions that gave rise to the tech industry as we currently know it. Wherever those intentions have led us, I believe at our core we are good and we do hard things not just to make a lot of money but because they make the world better. And so what I’d love to see is a more widespread, deeper reckoning with some of those core, values-based questions. These days, I think our industry would be better if we talked as much about our leaders’ values and the real-world value that our companies are creating as we do about our valuations. So often when I speak with tech leaders, they talk about people who are against technological changes being similar to the people who were against the horse and buggy moving to the car. Should we start discussing — and why have so few people started to ask — how tech products are driving a division between the economic haves and have-nots? Yes, we should be having those conversations. Those conversations are happening, they are happening today right now, whether or not we acknowledge that they are. So the question for my peers and my colleagues is: Do they want to be part of those conversations? Obviously there’s somebody in the White House who calls into question foundational science and the advance of modern thinking when it benefits him. And he is just the tip of the iceberg. There is a growing movement of conservatives and elected officials in this country that do the same. And so too, on the other side of the political spectrum, there’s a growing conversation around [the question]: What is the purpose of capitalism? What is the purpose of technology and innovation if it is not fundamentally improving the lives of everyday people? This movement’s getting a term, “conscious capitalism,” and I think there’s a lot of credence to it. And the more those of us that are here investing in future technologies, building the companies of the future, choose to participate in it, I think the better served we are in terms of what we find is investment-worthy. But also in the kinds of problems that we decide to take on and solve with the companies that we build. For example, should more venture capital go into technologies and companies that are addressing our climate crisis? We make that choice in the allocation of capital, in the founders and companies that we back, alongside wanting to generate a top-notch return. 
It’s not clear that we have to choose between building great businesses and making a ton of money and solving big hard problems that are worth solving. What is going on behind closed doors when you speak with people in the tech industry about the impacts of this wealth consolidation? The tech industry is not a monolith. You know that better than most—it’s showcased in the diversity of viewpoints you bring on your podcasts and you cover in your stories, but I think that there is a growing concern. This year saw two concurrent things that are really interesting to me. One, there’s a growing popular tech-lash, if you will, across most of America outside of Silicon Valley that calls into question the motivations, goals, and unquestioned utility of what’s happening in our industry. But we also saw an unprecedented number of people from inside the tech industry become politically and civically engaged. How many of our colleagues and friends were motivated to do something around this year’s election? And so what that tells us is we’re an industry full of individuals that care deeply about the future direction of America, about doing good in the world, about solving hard problems that matter. But somehow that has come a little bit unplugged from the ramifications that our platforms and our businesses have had. There’s reason for optimism in that. I think we can get out there and rectify things and take more ownership and have these harder conversations. It’s something that I push the folks in our Incite portfolio to do all the time. We start at the very beginning, by looking for founders that want to take on big, hard problems and build values-driven businesses, and we drive them and help create platforms for them to have conversations about some of these hard questions around what they’re building. I’d love to see us create more space for those kinds of conversations from more leaders across our industry. You studied with Pete Buttigieg and ended up being his lead fundraiser here in Silicon Valley. I’m going to say this with a caveat: I’m not a Bernie Sanders voter, but looking at the message it seems to me that you might have gravitated toward him. So why Pete Buttigieg? The way that you phrased your question, Alex, is in part my answer. So much of Pete’s magic is his ability to have very direct conversations with voters and talk to them openly about ideas that they might have pre-assumed to be too extreme or too radical, and to make them palatable. I’ll give you an example: At the beginning of the primary last year, Pete was the candidate out there in the Democratic presidential primary calling into question why the number of justices that we have on the Supreme Court was a fixed number and, in fact, opening up the possibility of future court reform that calls into question the size of the court. Now wherever you stand on that as a potential solution, it became a mainstream idea in the run-up to the election two weeks ago, something that everybody across the spectrum was talking about. It was there on Fox News as much as it was discussed on CNN or C-Span. And that is so much of Pete’s magic and his talent—to position ideas that might seem too extreme or too futuristic and to make them seem approachable and palatable and to make them seem relevant in the current political moment. It’s a skill that he has in spades, and it’s also something that I think is true of new-generation political leaders. 
So while you’re right, I’ve known Pete for half of our lives, and he asked me, he gave me the opportunity to work on his presidential campaign. It’s also something that I’ve seen in the deep political work that I’ve done for the last four years. Right after the 2016 election, I helped build a program called the Arena, which at this point has trained a few thousand young Americans around the country who are incredibly diverse to be first-time candidates and staffers. And this attribute, this ability to talk past partisanship and to speak to core values and to connect with voters on both sides of the political spectrum, is a skill that we see in all of these graduates coming out of the Arena too. I think it’s an indicator of the direction our politics are headed in. Court-packing was something the more progressive wing of the Democratic Party embraced, but it wasn’t something really geared toward fixing the economic system. And when people thought of Buttigieg, I don’t think that’s really what they thought of. In fact, the tech industry’s support of him was premised on the idea that he was kind of a safe candidate who would ensure the economic systems that have made tech folks wealthy would stay in place. What do you think about that? I don’t know, he came out here and he protested with Uber drivers; he was in support of AB5. He was out there very openly saying that social media platforms had gotten away from our democracy and the institutions that we respected and that those were all things that needed to be open to review. So, were the positions as extreme maybe as Bernie Sanders’ economic positions? Not necessarily. But I think that they had a degree of pragmatism to them that also made them approachable, and there was a willingness to put everything out there. And I’ll say, I think part of the reason why folks on the West Coast gravitated to Pete, especially in the tech industry, is because we have this thing, we understand what it is when young people step up to the plate and try to do big, hard things. And I think we have a unique appreciation for that. So that’s part of what we saw reflected in the support that came out of Silicon Valley for Pete. I don’t want to re-litigate the whole 2020 Democratic primary, but I think that the folks who listen to this might be skeptical that Silicon Valley actually wants real change in the political system. This is not an anti-tech industry podcast, but there’s definitely a feeling that there’s a reticence to tackle some of the really troubling aspects of our society head-on. So where do you stand on that? For sure. Yeah, and so let me be very clear, this is not a normative position. I want to be very descriptive of what I see happening as a student of the social sciences and history; change is coming for the technology industry. The question is how actively our industry is going to participate in that conversation and in what ways. Say more about that. I just think you can see it in the tea leaves of what is happening in different corners of our country and the way that is starting to gain steam. The fact that populism is becoming a major theme in our politics, that there are more and more Americans beginning to question the technology industry, who are beginning to sort of wake up and realize that actually, democracy is more fundamentally American than capitalism, and that the notion of unfettered market fundamentalism, of businesses seeking profits for profit’s sake, is open to question. 
You can either be a turtle and stick your head back in the shell and hope that it just passes by, or we can get out here and actually listen, learn, participate, and demonstrate that there are better ways to build a more conscious kind of business that solves hard problems, satiates shareholders, and brings all of the stakeholders like our employees, our labor pools, the consumers that we rely on, along with us. Could you actually see people in this world gravitating toward that message? For sure I can. It’s part of the reason why we built Incite and why we look to invest in founders that lead with their values as well as their desire to create value. We’ve got a portfolio now. In the last four years, we’ve made 60 investments in companies that are building important climate technologies, cancer therapeutics companies that have pivoted into COVID treatment development firms. We’ve got companies that are led by incredibly diverse teams that are making maternal health care more accessible to more people in America, that are building really amazing, high-return-potential businesses but are also doing it in a conscious way. And if their businesses succeed, their stakeholders will succeed, as will their investors. So I’m in this business because I believe, I know it to be possible. It seems like, at Incite, you’re looking for mission-driven founders, or founders that want to heal the world, make things better. I don’t know what we call it these days, but we invest in good people that want to solve big problems and make some money along the way. Doesn’t everybody in the tech world feel like they’re a good person who wants to solve a problem in a good way and make some money along the way? I hope so. I want to believe that most of us in tech are good people. But again, you’ve got to be willing to ask the hard questions about what you’re building and why and in what ways it actually benefits people. I think for a long time, the first few years after I started Incite, I could tell some people got it and some people really didn’t because you’d start to talk about values and doing good and it would make some folks really uncomfortable. Like “Oh there is no data around that.” Or “Well what’s good or what’s bad?” This notion of false neutrality. No, you’ve got to have the conversations. Right, once you bring that up then you can start to see who’s actually in it and who actually just had a slide in their PowerPoint deck saying we’re going to improve the world by doing X. Yep. So you’re trying to help with the fund and with the political activism. What would you say is more impactful? They’re both really impactful, and at their core, they are very similar. It’s the willingness to take early bets on good people that are getting out there to solve really hard problems, but we do it with an awareness that in 2020, big, hard problems aren’t just things that startup companies can fix. Sometimes they require really talented leaders in our politics, sometimes they require new nonprofits to advocate for whole new areas of investment. So we try to be flexible and nimble in the form that a solution can take. Okay, let’s end with this: What are some of the key policies that you would like to see implemented over the next four years? Now, I know we’re going to have divided government most likely, so the chance of anything getting done—if the last four years are any indication—is slim. But if you had a dream set of policies that we would implement, what would they be? 
Well for one, I’d like to see us get this pandemic under control because I think there is no economic recovery otherwise. We can’t even really start a conversation about economic recovery until we figure out how we’re going to get this virus under control. And there are too many Americans whose health and livelihoods depend on better leadership from the top. So that’s the first thing that I hope we get there on, and I think that’s such a huge opportunity for the tech and innovation community to step up to the plate and play a part in it. Because we know how vital scientific and tech breakthroughs are going to be in the development of treatments and their widespread accessibility. So that’s a big one, but the other thing is I want us to have a set of forward-looking policies that really look at getting the economy working for more Americans. It’s a lot of the themes that you and I have spoken about today, Alex, but I think we’ve got the opportunity with the new administration to focus on some of these nonpartisan conversations around how we get the American economy working for more Americans. And maybe that starts with things like student loan forgiveness and an economic stimulus and relief for families and small business owners across the country. But forward-looking, I think it’s got to be a lot more than that and really look at the breakdown of capital versus labor in this country and who has access to those two things. And just looking at the composition of the transition team for the Biden-Harris administration, and how incredibly diverse it is in gender, in race, but also in thinking [and] socioeconomic background, I’m really hopeful. I think this could be one of the boldest periods of leadership on economic issues that America’s seen in a long time. Okay, so help end the coronavirus and then think about student loan forgiveness—is there anything else that you’d be interested in pursuing? Student loan forgiveness would be part of just a much broader-based economic reimagining. So I think that could include a host of things, including reinvestment in underprivileged communities, focusing on advancing home ownership for more parts of the country, focusing on minimum wage and what that looks like. We saw some important advances at the ballot box around those questions two weeks ago, but I think there’s more to be done under federal leadership there. And then for tech companies out there, if there are a few things that they could do differently, what would you recommend? Focus on their core products and platforms and ask those hard questions about how their products and platforms are supporting or undermining democracy, and who they’re working for and why, and how we ensure that they work for more people. That’s a big part of it. And then the second is to look internally because it’s not just what our companies build, it’s also how we build them. So these conversations that we’ve been having for a while around the diversity, or apparent lack thereof, in our boardrooms, in our management teams, and our employee bases, it’s a really important moment to be asking those questions. These are the things that we have control over—what we build and how we build it—so I’d love to see more of us asking these kinds of hard questions and pushing ourselves to choose between right and wrong. Get in there, and get uncomfortable.
https://onezero.medium.com/tech-must-get-uncomfortable-with-its-impact-on-society-an-interview-with-swati-mylavarapu-21e27d541296
['Alex Kantrowitz']
2020-11-18 19:06:05.383000+00:00
['Big Technology', 'Society', 'Activism', 'Technology', 'Silicon Valley']
213
Node Event Emitters — For Beginners and Experts
For now, we’ll only pay attention to these two member functions: on(eventName, …) and emit(eventName, …). To publish an event, we use the emit() function, and to listen to an event, we use the on() function. In EventEmitters , we publish and listen to events by name. In the last code snippet, where we created an EventEmitter object, we use the following code to raise an event and listen to it. Running that code snippet prints the following output in the console: > Data Received In that snippet, we raised an event with the name myEvent on the last line, and we had a listener registered to the event just above the line publishing it. At the time an event is published, a listener must already be registered on the EventEmitter to receive it. For example, if we change the above code snippet to … … the output will be: > Listener 1
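The snippets this passage refers to did not survive in this copy of the article, so here is a minimal sketch of what they plausibly looked like, assuming the built-in Node.js events module, the event name myEvent from the text, and listener bodies chosen to match the quoted console output. Treat it as an illustration of the described behavior rather than the author’s original code.

```javascript
// Minimal sketch (assumed reconstruction): publish and listen by name.
const EventEmitter = require('events'); // built-in Node.js module

const eventEmitter = new EventEmitter();

// Register a listener for 'myEvent' before the event is published.
eventEmitter.on('myEvent', () => {
  console.log('Data Received');
});

// Publish the event; the listener above runs synchronously.
eventEmitter.emit('myEvent'); // console: Data Received
```

And a plausible version of the modified snippet: emit() only notifies the listeners registered at the moment of the call, so a listener added after the emit() line never hears that emission, and only “Listener 1” is printed.

```javascript
const EventEmitter = require('events');

const eventEmitter = new EventEmitter();

// Listener 1 is registered before the event is published.
eventEmitter.on('myEvent', () => {
  console.log('Listener 1');
});

eventEmitter.emit('myEvent'); // console: Listener 1

// Listener 2 is registered after the emit() call, so it never
// receives that earlier emission.
eventEmitter.on('myEvent', () => {
  console.log('Listener 2');
});
```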
https://medium.com/developers-arena/nodejs-event-emitters-for-beginners-and-for-experts-591e3368fdd2
['Kunal Tandon']
2020-02-16 17:28:11.395000+00:00
['Programming', 'Technology', 'JavaScript', 'Software Development', 'Nodejs']
214
The Movers & Shakers of InsurTech:
Emerging Trends in Data and Insurance Underwriting Since the advent of the modern industrial era, the most successful enterprises have leveraged spatial analytics, or data on the space and workstreams of our physical world, to upend industries. In the early 20th century it was Ford using data points to find a better way to manufacture high-quality, low-cost cars — this was only refined by Toyota’s TPS system later in the 20th century. Now well into the 21st century, just what vertical applications are best suited to take up the baton and execute on the opportunity? FinTech immediately comes to mind, based on the number of platforms utilizing data sets of all types to underwrite risk. Whether it be commercial or consumer loans or insurance policies, the more data a business has to assess risk, the more efficiently the platform can price its products, attract customers, and reduce write-offs. Within FinTech, insurance as a $5T global category has already seen traction in applying spatial data to practice. While data geeks have dominated the insurance industry for decades, the diversity of data sets used in incumbent models is surprisingly Neolithic. For instance, traditional underwriting algorithms only take into account a few dozen data points relative to the treasure trove of real-time sets of spatial data around us. That said, new models are emerging to augment underwriting with data from the physical world, a notable example of which is Root Insurance, a full-stack carrier employing dynamic driving-behavior data to underwrite policies. Though founded just five years ago, the company recently IPO’d at a multi-billion-dollar valuation, largely attributable to growth associated with its novel underwriting capabilities. Root is not alone in this practice either, as incumbents and emerging InsurTech platforms across lines are now unlocking billions in value via enhanced underwriting tools augmented with data sets from the physical world. The value of this type of information is even more topical when considering the substantive blind spots of legacy insurer underwriting models. Consider that Consumer Reports findings suggest certain insurers charge minorities premiums as much as 30% higher than their white counterparts in areas with similar accident costs. Despite the historical correlation of zip codes to loss ratios, this underscores the importance of leveraging additional data points for a more holistic understanding of an individual’s and/or property’s risk profile. All that said, how is the category evolving to address these blind spots? For the purposes of this discussion, the application of spatial data, be that via wearables or satellite imagery, can largely be segmented under two major umbrellas within the broader insurance sector — Property & Casualty and Health & Life lines. Beyond how to extract spatial data, though, is the question of how to best apply it in dynamic underwriting. Health & Life Insurance (“H&L”) Relative to other insurance lines, Health & Life lines are further along in their integration of spatial data into underwriting models. Historically, H&L insurance policies have been largely underwritten by examining the medical histories of applicants. While this information is an important part of the picture, it does not accurately capture a full risk profile and it certainly lacks any real-time data points, considering that the average American sees a doctor of any kind just four times a year. 
To close this gap, wearables have emerged as a natural source of real-time updates for dynamic underwriting, most recently on the heels of the boom in fitness tracking devices such as Fitbit and Apple watches. Incumbent insurance players are moving fast to adapt too; in 2018, John Hancock announced it would stop selling traditional life insurance and instead only market interactive policies that record the exercise and health data of its customers through wearables. United Healthcare followed suit by adding the Apple Watch to its United Healthcare Motion program. Munich Re even recently released a study on the effectiveness of physical activity as measured by wearable sensors in profiling the mortality risk of a U.S. population-based dataset. This is most important in the context of cost-containing, preventative treatment where an estimated 60% of high-spend members under coverage weren’t high cost the prior year. As a part of this study, Munich Re also uncovered that, in addition to real-time tracking and optimized insurance pricing, it benefited from continuous engagement via wearable devices as well as expanded insurability for underrepresented groups who otherwise would have been declined. All of this underpins the rapid growth of real-time data sets with IDC estimating there will be 41.6 billion total IoT devices worldwide by 2025, enabling new channels of communication, engagement, and information sharing in the process. With the game-changing health insights coming from spatial data, the value attributed to the global wearable healthcare device market, which is expected to surpass $29 billion by 2026, should not be surprising. Property & Casualty Insurance (“P&C”) Within the broader $1.5T Property & Casualty market, Auto lines have already ramped up use of spatial data via telematics and usage-based insurance for the purposes of risk modeling. In addition to underwriting, though, dynamic data also enhances other areas of the insurance value chain. Take for instance first notice of loss (“FNOL”) — while historically FNOL has required a slew of forms, phone calls, and from personal experience, tears, the filing process is being transformed with real-time information extracted from IoT sensors, drones, and satellites. Case in point is CSAA Insurance’s partnership with Owlcam to send videos to a driver’s mobile phone when a car crashes or is broken into. Because of the availability of real-time information facilitated via satellite, all parties now benefit from both a living communication channel and a reliable source of truth in determining loss outcomes. Beyond Auto though, P&C also encompasses the enormous opportunity in Property lines, which count the industrial, commercial, and residential asset classes within their ranks. Like Health lines, adoption of spatial data for Property is quite verticalized, with many technologies built with specific applications in mind. For instance, Cape Analytics, a software platform using aerial analysis via satellite imagery, focuses on evaluating roofing conditions such as size and age for home risk modeling. Along these lines, startup True Flood provides insurers with property-level information on homes throughout a flood plain to evaluate the structural durability of, say, a home’s foundation. Looking forward, many startups are now emerging to address global warming with preemptive, risk-assessing technologies to manage the growing unpredictability of property damages. 
These technologies are relatively nascent, particularly when considering the long sales cycles of frontier technologies into incumbent insurers and reinsurers. Now to break this all down. With a massive market opportunity at hand, which innovative solutions are capitalizing on spatial data applications in insurance underwriting? To help navigate, the below graphic maps out select companies providing both the infrastructure needed to extract, organize, and analyze spatial data, including wearables and satellite-based aggregators, and the InsurTechs utilizing this data for underwriting. Note, the chart is segmented such that applicable lines of insurance are distributed horizontally. A takeaway here is that while the most familiar names in InsurTech, such as Lemonade, Root, and Oscar, function as full-stack carriers, a sizable opportunity also exists in the infrastructure supporting them. As full-stack and MGA models utilizing these data sets have scaled, so has the visibility of spatial data used in underwriting, and incumbents have moved to quickly catch up via investments, partnerships, and acquisitions. Over the last few years alone, American Family’s investment in Avinew and Teraki and State Farm’s investment in Cape Analytics speak to the emerging opportunities across lines on the P&C side. Meanwhile for H&L lines, investment has largely been focused on technologies targeting specific ailments or demographics, such as New York Life’s investment in Carrot or TransAmerica’s investment in 100Plus. The largest outcomes, though, will go beyond specific applications and provide the most versatile infrastructure to augment risk modeling across lines. With increasing availability of spatial data through IoT technologies, there will emerge winners and losers across industries. However, in InsurTech, the challenge lies not in the collecting of the data, or how this data can be useful, but rather in which sets of spatial data will be most effective in proving out risk. In a multivariate equation, only time and investment will help discern where the highest-impact data points by line lie. That said, to ignore the opportunity in spatial data would be to start a lap behind competitors in a land grab for market share. And with customer acquisition costs only increasing across the insurance landscape, the ROI of data that provides customization, pricing efficiency, and increased engagement with customers will only become more important in driving winning outcomes. Disclaimer: Views are my own and may not reflect those of my employer. Sources: McKinsey’s “State of property & casualty insurance 2020” Consumer Report’s “Minority Neighborhoods Pay Higher Car Insurance Premiums Than White Areas With the Same Risk” Institute and Faculty of Actuaries’ “Wearables and the internet of things: considerations for the life and health insurance industry” by A. Spender*, C. Bullen*, L. Altmann-Richer, J. Cripps, R. Duffy C. Falkous, M. Farrell, T. Horn, J. Wigzell and W. Yeap S&P Global Market Intelligence Munich Re’s The Future Is Now: Wearables for Insurance Risk Assessment authored by June Quah Transparency Market Research (TMR)
https://medium.com/@davidmullen88/the-movers-shakers-of-insurtech-769d750e0f46
['Dave Mullen']
2020-12-17 19:05:58.639000+00:00
['Big Data', 'GIS', 'Insurance', 'Venture Capital', 'Technology']
215
How NIST Helped Hero Pilot Jimmy Doolittle Fly
Jimmy Doolittle in the aircraft used for the first blind landing in 1929. Credit: National Air and Space Museum, Smithsonian Institution (reprinted with permission) Laura Ost, Public Affairs Specialist, National Institute of Standards and Technology (NIST) As portrayed by movie star Spencer Tracy in Thirty Seconds Over Tokyo, pilot James (Jimmy) Doolittle led the 1942 air raid on Japan that lifted American spirits in early World War II, winning fame and the Medal of Honor. He eventually became a four-star general. Long before Doolittle became a legend, the National Bureau of Standards, now the National Institute of Standards and Technology (NIST), helped the young lieutenant achieve another feat of aircraft derring-do. In the early years of aviation, flying was experimental and tricky, and often dangerous. Many flight instruments and navigational aids were developed, but in the late 1920s pilots still could not know their exact position in the air. They had to guess by looking outside the plane at landmarks and the ground, a potentially life-threatening challenge when landing. Pilots simply could not fly in fog. Pilots somehow needed to get accurate 3D guidance to know their aircraft’s height and distance from the airfield, line up with the runway, and descend gradually for a safe landing. NIST’s contributions to the development of radio led to research on radio aids to air navigation. Among other innovations, NIST researchers designed runway beacons to transmit radio signals that could be picked up by a plane’s radiotelephone receiver. The system mapped out routes that a pilot could follow by looking at visual indicators in the cockpit. NIST contributed this system to 1929 fog landing experiments arranged by the Guggenheim Fund for the Promotion of Aeronautics. NIST’s radio range localizing beacon, to indicate runway direction, and marker beacon, to indicate longitudinal position along the runway approach, were installed at Mitchel Field in Garden City, Long Island, New York. The cockpit instrument display was a set of two metal reeds. When the plane was on course, the two reeds appeared to be the same length. When the plane was too far right or left, the reed on that side was longer, and the pilot would turn the plane toward the shorter reed. Aircraft course indicators using two metal reeds, 1928. Inscription (left) reads: Longest reed shows side off course. Inscription (right) reads: Aircraft visual course indicator for directive radio beacon. Credit: NIST Digital Archives On September 24, 1929, Lt. Doolittle of the U.S. Army Air Corps, sitting under a cockpit hood to prevent him from seeing outside the NY-2 Husky aircraft, performed a series of flights and landings, including several in heavy fog. A safety officer, hoodless in the plane’s forward cockpit, provided backup control. This first “blind” takeoff, 15-minute flight around the airfield, and landing — all by instruments alone — is commemorated by an Institute of Electrical and Electronics Engineers — American Institute of Aeronautics and Astronautics Milestone plaque at the Cradle of Aviation Museum in Garden City. “Doolittle’s successful blind flight and landing demonstrated that having and being able to use accurate and reliable instruments was the key to safe flying, under near-zero visibility conditions … contrary to the belief of many pilots at the time that being able to fly, ‘by the seat of my pants,’ was the more important skill,” the online citation states. 
“Crucial to the success of the flight … was the radio range and marker beacon developed by the Bureau of Standards,” the citation notes. Additional technical details can be found in the milestone proposal. Doolittle used then-standard cockpit instruments and several other, newer ones, including an artificial horizon display, a directional gyroscope, and an altimeter that could be corrected for changes in barometric pressure based on two-way radio communication with the ground. The altimeter’s measurements of height above the ground were off by a probable 9 to 12 meters (30 to 40 feet), resulting in hard landings, according to a 1930 NIST paper. “The lessons learned from these demonstrations were many,” NIST researchers wrote. “Perhaps the most important was that the problem of securing suitable indications of the true height above ground still required attention. The need for two-way communication with the ground in order to correct the altimeter to the proper barometric pressure seemed excessive. In addition, the absolute height could not be determined to the proper accuracy.” Physicist Francis Dunmore operates a model illustrating the radio beacon system used in guiding aircraft, 1926. Credit: NIST Digital Archives Doolittle’s demonstrations thus pointed to the need for another landing aid. In separate experiments, NIST researchers were working on a solution: a sloping radio beam, a sort of invisible ramp, to provide a more exact landing path. This transmitter radiated a directional signal pattern with an elongated horizontal lobe. The signals received in the aircraft were converted to a reading on an electric current indicator, which the pilot would try to keep at a fixed value while flying along the axis of the lobe, the line of maximum signal intensity. Audible signals indicated the plane’s approximate distance from the airfield. There you have it: NIST’s radio beacon and receiving system to enable the blind landing of aircraft under conditions of no visibility. The system had three elements: a directive beacon, a marker beacon, and, finally, a radio landing beam. More details are available in various books and other publications. After a series of test flights, on September 5, 1931, Marshall Boggs, a U.S. Department of Commerce pilot, made the first “completely blind” landing in the history of aviation using only radio signals for guidance. This historic flight at the College Park, Maryland, airfield launched a new era in aviation. Later known as the instrument landing system, NIST’s radio beacon scheme was the forerunner of today’s air traffic control.
https://nist.medium.com/how-nist-helped-hero-pilot-jimmy-doolittle-fly-5ef630581f2f
['National Institute Of Standards']
2020-08-19 20:38:44.983000+00:00
['History Of Aviation', 'Aviation', 'History Of Technology']
216
Top MedTech Startups:
Medical Tech Outlook The role and value of medical technology is increasing by leaps and bounds as the healthcare industry rapidly moves toward digital transformation. The healthcare sector is taking advantage of advanced technologies to improve and broaden healthcare services. With the help of AI, 3D printing, and automation, medical researchers and entrepreneurs are making major breakthroughs that allow for easy and more accurate diagnoses, improved treatments, drug discovery, customized prosthetics, and more. Over the course of 2019, medical experts have largely been inclining toward data science to gain better insights from existing health information. Blockchain’s ability to provide data security is expected to drive a push toward more data-driven technologies, which could bring about a real revolution in digital healthcare. As an outlook on what is to come in the future of digital health, the use of AI to improve healthcare robotics is another promising field. Understanding the changing times, MedTech Outlook has compiled a list of the Top 10 MedTech Startups to guide organizations associated with the medical sector in harnessing the power of technology to tackle today’s medical challenges, while simultaneously adapting to improve and broaden healthcare services. With several innovative technological capabilities and success stories up their sleeves, startups like Innovative Sterilization Technologies, Biologica Technologies, OrthoGrid Systems, and more are constantly proving their mettle in the field of medical technology. We present to you MedTech Outlook’s “Top 10 MedTech Startups” companies. Abilitech Medical is one of the medical device companies launching a product that, at first blush, appears straight out of science fiction, for patients with very restricted mobility, such as those suffering from MS, muscular dystrophy, ALS, and spinal cord injury. Their first device increases patient function by helping individuals feed themselves, undertake self-care duties, and literally open doors to fresh possibilities. Significantly less time is spent fitting the device to the patient abilitechmedical.com Founded in 2015, Biologica works to find new and better ways to increase efficacy in orthobiologics by leveraging its proprietary technology. The company’s proprietary technology captures naturally occurring growth factors found within allograft tissue, providing patients and surgeons with novel biologic solutions. The company has been developing a number of orthopaedic initiatives involving its proprietary processing methodologies. The company’s primary product, ProteiOS growth factor, possesses an array of osteoinductive, chemotactic, angiogenic, and mitogenic growth factors that can be added to enhance a surgeon’s scaffold of choice www.biologicatechnologies.com Innovative Sterilization Technologies is at the nexus of disrupting the market with its highly efficient filtered-vent sterilization containers — ONE TRAY®. The ultra-efficient technology underpinning ONE TRAY® Sealed Containers has been cleared by the FDA with a 4-minute, 270-degree sterilization cycle with no required dry time and a defined shelf life. IST has partnered with K1 Medical and EZ-TRAX™ to organize OEM knee and hip instrumentation into ONE TRAY®/EZ-TRAX™ universal sets that consist of just three individual levels of instrumentation, allowing a facility to process two full EZ-TRAX™ sets in one washer cycle. 
This brings over 80 percent efficiency to the entire process www.iststerilization.com OrthoGrid Systems is a global MedTech leader in the development, innovation, and commercialization of alignment technologies and platforms for orthopedic surgery in North America, Asia, and Europe. Our intelligence-guided systems are designed to work within the surgical theater and interface with existing hospital equipment, revealing fluoroscopic distortion and enhancing surgical outcomes by providing greater accuracy and proficiency. Our technology platforms work for all orthopedic implants in the market, and ultimately prevent re-admissions, reduce hospital costs, and increase positive patient outcomes orthogrid.com According to medical studies, one in every three women experiences stress urinary incontinence (SUI) — the leakage of urine due to pressure on the bladder or urethra — at some point in their lives. Minneapolis-based pelvic health company Pelvital hopes to help change this. The company is working to commercialize Flyte™, a simple in-home treatment for SUI, designed to treat weakened pelvic floor muscles and provide a good outcome at a fraction of the cost and risk of surgery. It is the only product to use mechanotherapy to treat pelvic floor muscle (PFM) disorders, the primary underlying cause of female SUI www.pelvital.com ProSomnus enables dentists to create better treatment experiences for people suffering from Obstructive Sleep Apnea and Snoring. Its next-generation sleep apnea devices utilize patented, FDA-cleared, advanced technologies, making them effective, comfortable, safe, and easy to use. ProSomnus is focused on commercializing device designs that are clinically relevant, and creating treatment experiences that exceed the needs of the practicing sleep dentist and his or her patients. The company is dedicated to further advancing the treatment of Obstructive Sleep Apnea through ongoing research, product development, and process enhancement for improved effectiveness, efficiency, and convenience for patients and doctors alike. prosomnus.com ReGelTec, a pre-clinical medical device company, is developing the next generation of spinal implants for lower back pain and degenerative disc disease. The company’s HYDRAFIL™ solution is heated and injected into the nucleus of an intervertebral disc. As the HYDRAFIL™ cools, it solidifies into a homogeneous solid mass between the vertebrae to eliminate pain. Since the implant is a hydrogel with mechanical properties similar to normal disc tissue, it helps restore the natural biomechanics of the disc and improve spinal alignment. With this commitment to developing innovative healthcare products, ReGelTec will start its clinical trial in Colombia by early 2020 www.regeltec.com A clinical-stage medical technology company, Smartlens has developed a first-of-its-kind disposable, electronics-free, and ultra-sensitive soft contact lens that measures eye pressure and its fluctuations throughout the day, giving doctors and patients a better tool for glaucoma diagnosis and management. Smartlens’ non-invasive soft contact lens has a biocompatible sensor that constantly monitors and responds to changes in the eye pressure of the person wearing it. Patients simply have to take a selfie of their eye using their smartphone while wearing the lens. 
While Smartlens’ revolutionary product is yet to be commercialized, it is already making great strides in tests, providing accurate results www.smartlens.health Alphatec Spine helps improve lives by providing innovative spine surgery solutions through a relentless pursuit of superior outcomes. BrainCo strives to apply brain-machine interface (BMI) and neurofeedback training to optimize the potential of the human brain. BrainCo was founded in 2015, transforming the most advanced technologies from the Center for Brain Science at Harvard and the McGovern Institute for Brain Research at MIT into the research and development of wearable wireless EEG brainwave detectors. The company specializes in brain-machine interface technology, focus-level enhancement, brainwave detection, analog-digital systems, and brain science. The products offered by the company are FocusFit, Focus EDU, FocusNow, Focus Developer Kit, and Stem Kit.
https://medium.com/@techmag1/top-medtech-startups-6af2b310d528
['Technology Magazine']
2020-08-20 05:35:50.158000+00:00
['Medtech', 'Technology', 'Medical', 'News', 'Medical Technology']
217
100 Words On….. Waves
Photo by Jeremy Bishop on Unsplash Like ocean waves, cybersecurity threats are unrelenting. Calm seas produce complacency and stormy seas propagate vigilance. How do we handle tsunamis? Recent incidents like Petya and WannaCry are examples of just such events. While there were warnings, many were caught unprepared and despite the rude awakening, the next waves caught even more off guard. Similar threats exploited similar vulnerabilities. Prevention cannot be our only strategy; we must be able to react and respond to incidents when they occur and recover when the waters have receded. This planning may be your only life raft.
https://medium.com/the-100-words-project/100-words-on-waves-e677f7bad958
['Logan Daley']
2020-12-17 04:49:01.786000+00:00
['Data Security', 'Information Technology', 'Waves', '100 Words Project', 'Cybersecurity']
218
The Augmented Reality Industry is Shooting Itself in the Foot.
That’s a bold statement from a small AR firm in rural Canada — but from our perspective, 2019 and 2020 have been a race to consistently name-drop, one-up, and discredit our collective competition with very little evidence to back up the bold claims made by individuals and companies vying for control and clout within the augmented reality space. What’s worse is the position this places the consumer in; there’s widespread discussion and repetition from industry leaders that says the goal is to ‘democratize the tech’ and make it available for all — yet, very few companies out there seem to make any effort whatsoever to do just that. At this rate, the AR landscape will be reduced to chasing its tail, endlessly circling around the possibility of mass user adoption. Here’s the rub: isn’t that what democratization stands for? We want the global population to accept and integrate AR tech into their daily lives, yet we keep changing course and confusing the masses to the point where they want little to do with mainstream AR experiences. So who’s responsible? In short, we are. We’re taking a deep dive into our personal organizational experiences to present what we feel is nothing short of a massive play to undermine what’s possible, what’s ethical, and what’s around the bend. The kicker is, we believe the industry doesn’t stand to gain anything from this play. This should be an era of collaboration, support, experimentation, and sharing. Instead, we’re collectively racing to a finish line that the AR community can’t seem to define for itself. The Big Buy-In For many tech startups, knowledge and innovation are rooted in a strong desire to disrupt and make positive change — sometimes without a clear road map. “Making something a reality sometimes is challenging when you don’t have the blueprint to go off of, so I feel that these companies were equally inspired by movies and pop culture and had a passion to build that technology,” says KP9 Interactive CEO, Wil McReynolds. Wil has been innovating in the AR space since 2011. That’s long enough to see countless ideas, companies, and personalities come and go, each with their own individual merits and promising ideas. As 2020 approached, a shift happened in the AR sphere. “A lot of them were acquired by bigger companies. At the other end of the spectrum, some of these companies have become too big for their own good. They’ve gotten to that point where there’s a lot of internal bureaucracy and too many managers. Innovation takes a bit longer because it takes a lot longer to work its way up from R&D. Whereas small, agile companies can really pivot in real-time and make very disruptive things that sometimes end up being acquired by big companies. Sometimes when they’re acquired you don’t see that tech emerge for another couple of years. That sucks too.” Pump the brakes for a second. An acquisition can be phenomenal news for a small company making big innovations. Being acquired by the right group can mean an injection of funds that saves the company, allows it to expand and grow its team or stack, and gives it time to flesh out the ideas that helped it secure that trust from a big company or VC. An acquisition can also mean support; it can mean access to consultants, experts, marketing help, behind-the-scenes tech that unlocks new doorways, and even the simple psychological boost of being validated by the industry. Make no mistake, acquisition — by the right company — can be a lifesaver. More often than not, it can also have the opposite effect. 
Take, for example, the acquisition of Metaio by Apple in 2015. Metaio was a growing team of bright engineers and developers that laid the groundwork of what would eventually become ARKit. Full disclosure, Wil was at one time a Preferred Developer with Metaio. At the time of the acquisition, Metaio was creating virtual showrooms for the likes of IKEA and was being adopted by the automotive and industrial sectors in the creation of complex visual repair manuals. Apple sat on that tech for approximately two years and issued a very dismissive statement deflecting questions, noting “Apple buys smaller technology companies from time to time, and we generally do not comment on our purpose or plans.” Hey, that’s show-biz, baby. But it sure doesn’t advance the technology, or aid in its democratization — Apple took what Metaio had built, a cross-platform solution, and reduced it to iOS functionality, staunchly limiting what was possible. Our colleagues shot us all in the foot. “Metaio was huge, man,” says McReynolds. “They would let you compile to Android and iOS. Apple bought it and they garden-walled it. They made it proprietary on their phone, period… Apple goes on a buying blitz and then brings out its vision of what it should be on top of what the innovators were doing. They’re trying to bring their tried-and-true tropes that they believe are the standard.” With Apple’s big, burly tease of a release for an AR headset slated for sometime in 2021 or 2022, Patently Apple, a website dedicated to closely monitoring Apple’s published patents via the US Patent & Trademark Office, suggests we’re being pushed, yet again, towards a tech medium that’s not anywhere near ready for mainstream adoption. Dice Insights writes in a March 2019 article discussing the future of augmented reality and Apple’s pending headset release: “This is not what we want from augmented reality. In fact, this will earn AR a ‘hard pass’ from many consumers… and might hint that the AR future we’ve all been expecting is nowhere close to fruition.” When we take a step back and look at where the industry came from, there were a handful of SDKs available for developing augmented reality apps. Other pioneering companies, rightfully so, had their own visions of how best to do it. Acquisition accomplished two key things: one, it largely validated the idea that AR tech was a powerful force that held serious potential. Two, it created an arms race that’s as much about marketing and selling the next big thing as it is about pioneering a transformative technology that has the ability to change the world. Industry Thought Leaders & Influencers We believe a key pillar of AR’s failure to build mainstream adoption lies in the way our industry actually talks about and promotes augmented reality. “A lot of futurists and influencers are people who have ins in the industry, so they get to see things maybe a year or two years out from the general public even knowing about it in many ways. There are two scenarios: either they’re so hyped on what they see to a point where everything else pales in comparison so they can only go on about that thing — and that’s why they’re always on about the next shiny thing. Two is the fact that many of them don’t have the vision to see all of the pieces and they’re just promoting whatever’s around them to stay relevant without actually understanding the underlying structure of the system. 
Many of these influencers talking about technology [can’t] talk about the tech — it’s all bling and no delivery.” AR is a form of media; media is not an authentic representation of reality, so all media is a carefully crafted construction. To a degree, when we see the media jump on the bandwagon and preach the gospel of the next shiny thing, what we’re seeing may not be an accurate representation of AR technology. It may very well be a very doctored representation of who’s got the most friends in high places. The marketing sells. The idea is there. The words are punchy, but — and we’ve witnessed this time and time again — the building blocks and underlying foundation of the tech itself are often inflated or misrepresented, and the user experience is easily neglected. Having connections in high places does not make an expert. Having a large audience does not make a thought leader. Having a revered platform publish an opinion does not make an influencer. “Unfortunately, it is who you know in a lot of ways, and some of these people and companies have these brands on board because of historic work — but then again, some of these people at the top probably shouldn’t be at the top anymore,” says McReynolds. “There’s a lot of people that hinder the growth of the technology in many ways because they listen to their friend’s friend who works with an AR company and that person jumps on the bandwagon and writes something because they see an opportunity. That creates its own problems… They’re misleading in their marketing. They’re misleading in what they’re actually selling, and they mislead the user.” Of course, we’re guilty of making blingy tech demonstrations as well. Our beginnings aren’t unlike many other companies in the AR sphere in that we sought to inspire creativity in our audience, our clients, and prospective investors and customers. We stopped doing flashy demos and fixating on the future of AR when we noticed the competition using advanced showings as a tool to sell clients. At the end of the day, our initial demos and the demos of others were the possibilities of tomorrow. Nothing more. Now, to the credit of advanced tech demos: we do see them work, albeit in controlled environments constructed exclusively by the companies promoting them. Yes — they’re awe-inspiring and impressive, but they do not — and cannot — scale. Everyone in this industry is seemingly trying to step on everyone else in an effort to get their product out the door — that’s business, after all. But when good tech and practical ideas are drowned in an ocean of contradictory and confusing content, we all lose. We lose potential sales, we lose audiences, and we lose the ability to hook new users interested in exploring the possibilities of AR. Consider a popular WebAR demo released by L’Oreal that was supposed to allow people to try out “countless” hairstyles and colours by applying a type of filter-based AR overlay. “Countless” was a perplexing choice of wording because the experience only delivered 50 some-odd options. Strategy & Business writer Linda Rodriguez McRobbie says in an article discussing the “long nose” of AR, “I mostly buy into the bright future predicted by experts — but this isn’t the only unsatisfying experience I’ve had… In its current incarnation, AR is, frankly, disappointing. It’s at best a solution to problems that aren’t really problems, and at worst, insufficient in meeting what could be actual needs (like my fringe question). So why am I sure that AR is still going to be the next big thing? 
Because it will be a really useful idea — when we get it right." We need creative and inspired people to keep innovating; there is nothing wrong with future conceptual POCs. Modern movies and games are the primers in many ways, and we're all products of our environments, as it were, but how we talk about and market our collective technology matters. Our media authorities need to better investigate the functionality, operation, and validity of marketing claims prior to giving underperforming and misleading platforms and software engines the spotlight. Unvalidated articles and reviews from thought leaders can potentially create content virality that heavily damages the credibility of the industry. They ultimately undermine the efforts of true innovators poised to offer scalable solutions by presenting these flashy demos as market-ready. They're not. The Whale & Perceived Failure Would investment be awesome? Yes. It definitely wouldn't suck. But you know what does suck? Having the very industry you work in actively sabotage widespread progress by neglecting the user experience and misleading audiences to the point they feel the tech was all hype, or just doesn't work at all. Is investment important? Yes, absolutely. Is investment necessary? No, absolutely not. We've all seen The Whale. It's an MR experience created by Magic Leap, the now-for-sale Florida startup, that showcased an incredible 3D display of a whale breaching a gymnasium floor. Was it sick? Damn right it was. Was it an accurate representation of what Magic Leap was capable of? Many would say unequivocally, no. In a March 2020 article published by Input Magazine, journalist Raymond Wong recounts trying the Magic Leap 1 AR headset after referencing years of hype and excitement, spurred in part by doctored content like the Whale experience. "It was hardly the kind of killer experience that merged virtual objects and physical space that I was expecting from the hyped AR headset. [It] certainly didn't convince me the $2,300 headset was worth the money, early adopter premium or not. I left the hotel disappointed and didn't publish any story on it." Here's a company that accepted billions in investment from giants like Google, Disney, and Alibaba, who collectively threw $2.6 billion at the startup with the optimistic goal of commercializing its groundbreaking AR headset. Now, Magic Leap is for sale, a move that Wong says has "rattled some confidence in the AR space." He's not alone in that sentiment. Bloomberg published an article in September of 2020 with a subheadline that bluntly stated "the augmented reality startup was undone by profligate spending and its own hype. Investors finally lost patience…" What does this say about the state of the AR industry? What good did that hype train do for the adoption of wearables, or AR tech as a whole? How do we make it as a collective technology when arguably the biggest name in the game tanks and inspires distrust in AR? It says the all-bling-no-system methodology is killing us as innovators in the space, and the more our thought leaders, influencers, and media networks place misleading AR experiences on a pedestal, the more likely we are to struggle — together. The same overarching concept applies to other well-known industry firms sharing legacy projects. Each is impressive and makes waves in its own right, but neglects the importance of building the basement to support the hype. 
Whether it's falsely touting true webAR capabilities or neglecting to make experiences accessible to as many users as possible via iOS and Android, these impressive experiences being churned out are wishy-washy at best. They don't work as advertised, and straight-up don't work if you don't have a certain device in your hand. "I know how to use augmented reality," says McReynolds. "When I can't make it work, then there's a real problem for the user and that's leaving a bad taste in people's mouths. It's not the technology — it's the implementation and delivery… Who suffers? The businesses that want to use [the tech] because when the consumer goes to use it, it just doesn't work as expected." Perhaps the proverbial straw that broke the camel's back and prompted the production of this article, for us anyway, was an industry-leading CEO reaching out — directly — asking for help making their platform work. Let's face it, if a prominent AR company that sponsors globally-recognized expos and speaker series is asking the little guy for help, there's trouble in paradise. If a company is advertising true #WebAR and a tech firm like ours can't get their experiences to work as advertised on five or six different testing devices, they need to straight-up get their shit together. "We're in an era that's all about investment in tech. Magic Leap set that precedent years ago with the fucking whale video," laughs McReynolds. "Groups are misleading in their advertising. It's not accessible on all devices and it's not true #ARforall, but they're also thinking so far ahead. If you keep on showing the next big thing, the expectations of the people who want to use it are limited to the higher end devices or those who can afford to invest in them." How on earth does that reflect #ARforall? When these experiences consistently disappoint and confuse audiences and investors, guess what gets blamed? We've boiled down our perceived failures of AR. They stem from misleading people, being complacent, and not having an engine that works. All that failure, at the end of the day, is evidently coming down on augmented reality as a whole. It's not going to come down on an individual company or the person who coded it. It's not going to come down on the content creator or the brand that's marketing it; it's going to come down on the tech. Influencers, thought leaders, and companies trying to run before they can walk are building a myth that allows the tech to take the fall for what they're doing. The Solution (from our perspective) Look, we're not saying the WorldCAST engine we built at KP9 Interactive is the be-all-end-all solution for the collective AR consumer experience. We're one of many, certainly. But what we've done is seemingly the opposite of what some of our competitors have done: we assessed the need for a truly democratized webAR platform with as few friction points as possible and built a web-based studio and portal environment that allows everyone to create, publish, and share augmented reality content. Full stop. Our platform needs no development knowledge. It requires no coding experience. It offers a free option. It does not discriminate between Android and iOS devices. Last but certainly not least, it does exactly what we say it does. We've built a platform we feel is right for the current market, but we always have our eyes on the horizon line — we strategically position our company so we can grow and adapt to the world of wearables when they're inevitably democratized and made available to the public. 
We think we have a great grasp on what's coming, what could be, and what could emerge from those changes down the road. What we're sure of is that we're serving what's realistically available today — not advertising what we want to be available someday. In order to democratize AR, companies must build and sell AR engines that can serve content that actually works today. How should people access AR? Not through expensive wearables and headsets that promise the moon — not yet, anyway. Instead, what does everyone engaged with modern consumer technologies have access to today? Mobile devices and desktop computers. Thus, create platforms that are accessible via those means and a portal from which to serve AR content. What won't work is constantly showing off. C'est la vie. The solution is building accessible, useful, practical, and accurate platforms that do what their respective creators say they can. The solution is putting AR in a position where it can provide opportunities to educate, inform, and entertain global audiences, regardless of how much cash they have wrapped up in the latest device. The goal is to inspire, isn't it? We chose to build the basement first because without a fully functional and ironclad engine, there is no car, there is no stairway to heaven, and there is no ethical chance in hell of mass user adoption by which to seek comprehensive investment. We cannot put the cart before the horse, and we strongly feel the biggest thorn in the side of the augmented reality landscape is our competition and the so-called influencers and thought leaders who do exactly that. Look, we're not trying to discredit, put down, or ridicule any person, company, or sector of the augmented reality industry. We're all doing what we can to succeed in an unprecedented time in human history, and there's a nobility in that to which we tip our hat. The failures of our industry, and there are many, are just part of the process and don't represent the whisper of an untimely death. An emerging industry worth an estimated $1.5 trillion doesn't go away overnight thanks to a few shitty demo experiences that didn't pan out with the public. "You can't run businesses like you used to," concludes McReynolds. "You can't segregate transformative technology. You can't compartmentalize a form of media that, for the first time in history, has access to all information in so many different ways. You can't take that away from the masses… We're reaching a digital evolution. We're reaching a societal evolution… All that stuff has shifted."
https://medium.com/@kp9/the-augmented-reality-industry-is-shooting-itself-in-the-foot-ef19a1245ff
[]
2021-02-03 14:08:50.715000+00:00
['AR', 'Augmented Reality', 'Technology', 'Webar', 'Mixed Reality']
219
Is your school system data at risk? What you need to know about Cyber Security and Education
Is your school system data at risk? What you need to know about Cyber Security and Education Datalink Networks Sep 3, 2020·4 min read Did you know that between 2018 and 2019 there was a dramatic increase of up to 300% in publicly disclosed cyber security incidents? In fact, according to the K-12 Cyber Security Resource Center, the misuse and abuse of school technology and IT systems resulted in 348 publicly disclosed incidents involving 336 educational agencies across 44 states. With virtual learning, schools and districts must now be more vigilant about obtaining the proper cyber security tools and establishing up-to-date protocols and security policies. Why is there an increase in Information Security Incidents? Greater reliance on technology- With remote learning becoming the new normal, the reliance placed on technology has increased among faculty and students. This said, schools should place more of an emphasis on cyber security training to spotlight top threats. Schools are a big target for cyber criminals- With confidential databases of student and faculty information, schools have always been a huge target for cyber criminals. According to the Children's Internet Protection Act, educational institutions must ensure that sensitive data is properly protected to avoid becoming these types of targets. Significant vendor security incidents- It is critically important to have clear cyber security standards for school vendors and other third parties. As you will read in the next section, breaches and unauthorized disclosures account for nearly 60% of K-12 cyber incidents. Greater public awareness and reporting — Teaching students and faculty how to stay safe online has quickly become more than just a compliance requirement. It is critical to create a greater cyber security awareness and reporting program to protect educational institutions against security threats. The Makeup of K-12 Cyber Incidents Given the lack of attention to cyber security training, a lack of awareness among faculty, staff, and students can be a major threat to a school's network. The types of cyber incidents and threats that can target a K-12 school vary from ordinary spam emails to complex account takeovers. Schools should be aware of the following threats: Breach or Unauthorized Disclosure- In 2019, most publicly disclosed data breaches were caused by parties that were known to and already part of the school community. When it comes to unauthorized disclosure, the top two profiles to watch include (1) current and former K-12 staff and (2) vendors/partners with access to data and a relationship to the school district. Malware- A form of software specifically designed to cause damage to technical assets or gain access to a remote system. This type of threat is usually distributed through email attachments or URLs leading to malicious content. Ransomware- A form of malware that works to encrypt a user's files. 
Once this is done, hackers then demand a financial ransom from the victim before restoring access to data. If a school system becomes a victim of ransomware, not only will the school's systems be shut down for weeks, but the overall cost of data recovery can run into the millions of dollars. Phishing — A tactic that uses fraudulent emails to gain confidential information, such as network credentials and passwords, from users. Phishing can also include installing malicious software through fraudulent downloads and attachments. Key Takeaways
https://medium.com/@datalinknetworks/is-your-school-system-data-at-risk-what-you-need-to-know-about-cyber-security-and-education-24a9427358b
['Datalink Networks']
2020-09-03 17:07:16.825000+00:00
['Information Technology', 'Information Security', 'Educational Technology', 'Cybersecurity', 'Education']
220
Why music isn’t a top-two category on Patreon (yet)
Cherie Hu: Hey, Wyatt, thanks so much for joining the podcast. Wyatt Jenkins: Thank you for having me. I love talking about this stuff. CH: So first off, given that this is a music podcast, I would love to get a sense of music's footprint on Patreon right now. Because there are, to my knowledge, a lot of artists that are making a substantial living and monetizing a significant portion of their fanbase on Patreon — Amanda Palmer and Ben Folds being two of the most prominent ones — but I feel like it's still… I think it is gaining a lot of mindshare among artists, but not as much traction as I think it could get. So if we could start there, just to get a sense of what is the current size of music as a category on Patreon — is it growing quickly compared to other categories like podcasts or illustrators, or other types of artists? WJ: Yeah. So music is currently in our top five categories right now … but, honestly, I think it should be number 1 or 2, and we haven't yet reached that tipping point. And that's the thing that we'll probably talk a lot more about today, in terms of what I think it'll take to reach that tipping point. But it is still a major category for us, and it's growing really well. Over the past few years, music has grown about 6x in the number of creators, and about 4x in revenue. So we're getting more musicians joining, and the revenue those musicians are making is growing and growing. [EDITOR'S NOTE: A Patreon rep recently told TechCrunch that Patreon's overall creator base grew 50 percent year-over-year in 2018, i.e. only 1.5x growth, which suggests that the music category is growing much faster than the platform average.] But there's just a bunch of blockers for musicians and Patreon at the moment that we're still working through. It's going to open up here in the next one to three years, is sort of my prediction … but we're right at the beginning of that. CH: When you say "blockers" for music growing more on Patreon, what exactly are you referring to? Is it just certain things that musicians would want that aren't available on Patreon at the moment? WJ: There are a couple of features, but that's not the main blocker. I think if I had to summarize what the blocker is for musicians: all of their other revenue streams look, sound and act really differently from a membership. They're coming from gigs, they're coming from sales of music, touring — you know, all these other lines of revenue for musicians are these spiky, hustle-based lines of business. And Patreon's over here saying: Hey, you can make a six-figure income by just having a close relationship with your fans, and delivering unique value to them in some form, whether it's just them getting some backstage access to you, or them getting behind the scenes in how you made a song, or them getting some merch — you can create this other type of business. Most musicians who I talk to, it's very rarely a feature. It's very rarely like, "oh, if Patreon did x, then I would join." It's usually like, "oh, wait, how would that work? Can I monetize my brand deals on Patreon?" And it's like, no, no, no, that's different. They try to take the model that they're currently using, and put it on Patreon. And we have to say — so what Patreon's job is, is to define membership for the whole world. That's what we're working on right now: we're defining, what is a membership? And why do artists give a shit about it? I think that's the biggest hurdle for us. Secondarily, Patreon's roots were those of crowdfunding. 
The origins of the company were very much like, “hey, I have a thing I make, and I would like funding for that thing.” And that doesn’t quite sit right with musicians … Musicians are saying, “I am a musician for life, and I don’t need funding for that — I make a thing of value, and you should become a member if you really care a lot about the thing I make of value.” That’s more the branding issue I was referring to. In the early days, there were some big musicians like Amanda Palmer who said, “this is a great model, I’m going to take it on” — so you have all those early adopters, and people that are really forward-thinking who say, you know what, this is a great model, I’m going to do it. But then you have a bunch of folks who are, like, “oh, if I do that, does that degrade the relationship with my fans? Does it make it seem like I’m asking for money?” Those are the brand issues, and we’re now in year two of a five-year process of being a world-class membership product, where — no, you’re not asking for anything, you create something of value, and your fans love getting backstage access to that thing you create. So in that sense, for musicians, it’s a next-gen fan club, you know? That’s the way we think about Patreon. CH: Maybe there’s also, in this current day and age where streaming dominates, a pressure to scale — [such] that just by nature of what a membership is, it’s not as appealing to some artists. There’s a lot of pressure to get on this one playlist, or get this one brand partnership or sync that will reach the widest audience. This is something that I’m experiencing first-hand: so, I have a Patreon, I’m really enjoying it and it’s going well, but it’s also just over 150 people supporting right now. [EDITOR’S NOTE: This number has since increased to around 170! :) ] I can just imagine, if you’re bringing that to an artist, saying: you can have 150 people, or a couple hundred people, allowing you to make close to a full-time living on Patreon, or you can get put on this playlist that has five million followers. That gap is so big, and I think now in terms of the kind of mass-market idea of what success is in the music industry, there still is much stronger of a penchant towards scale. WJ: Well, I mean, gosh, we are all wired for massive reach unfortunately, because of the vehicles that have driven monetization of music, really for 30 years. Right? I mean, major labels were the culprit in the ’80s and ’90s who would sign a musician, and they only care about selling as many records as possible. So major labels would push artists to do weird things like make a terrible album because there’s two good songs on it — because the major label’s pressing for the packaging of the good that they’re going to sell the most units of. Basically, that model started to break down, and we moved into this Internet world — and now we’ve just swapped it out for another thing, right? Now we have, “oh, who’s got the most eyeballs?” Because Google and Facebook, they care about ad revenue. And Spotify. So if you can drive listens or eyeballs with the thing you do, you will get fractions of a penny on the ad revenue. And again, these are just giant machines of scale that are trying to just do what they do best, which is make a lot of money off the most volume of eyeballs or listens. And again, we’re just back in a really fucked up situation, like we were in the ’80s and ’90s with major labels, only now it’s with major tech platforms. 
CH: Kind of related to this, in terms of the role of tech platforms: I saw this blog post that you’d written back in February on Patreon’s website that I thought was so fascinating, and it was about how you argue that Patreon is not a discovery platform. And you’re making the distinction between discovery and membership. I’ll link to this in the show notes for those who are listening, but in case you haven’t seen it, Wyatt is addressing this question that you’ve gotten all the time of, “why doesn’t Patreon help me get new fans, or grow my audience, or feature me on their homepage?” — questions that I feel like, in the context of music-streaming, artists and managers are jumping on all the time when they’re talking to the Spotifys and Apple Musics of the world. Right? Like, “how do you feature me on your homepage?” There’s one specific paragraph that I want to read out loud, now, that I thought was particularly striking to me. So you wrote: “If we were to become a discovery platform [we being Patreon], a platform where users go to browse new content, that would put Patreon in between creators and fans. Think about YouTube, Facebook, and Instagram. These sites are great discovery platforms, because they have all the eyeballs. [Speaking of eyeballs.] Their goal is to acquire and retain all the eyeballs. They love rich algorithms, designed to keep the content you will click on. Through their algorithms, they put themselves in between the viewers and creators by deciding which content to display.” That just stood out to me because there is this gradual, ongoing reckoning with the fact that major streaming services like Spotify and Apple Music are gatekeepers — or they limit the extent to which artists can truly be direct-to-fan or direct-to-listener in the way that they can be on a site like Patreon, because Spotify and the like are very motivated by discovery, and so they’re always trying to feed new content to listeners, and so they control the pipes of how content travels in their platform. I have a lot of questions related to this concept, but the first one is — so it seems that, based on this blog post, you don’t think the fact that Patreon is not a discovery platform is a disadvantage. I feel like for a lot of founders who are building start-ups related to helping creators or helping people create content, discovery and editorial are a necessary component of a “successful” company. It’s almost table stakes for even competing in this realm. But based on this blog post, you’re saying, no, that’s not what Patreon is about. We just want to take a backseat, and help creators connect with their supporters as effectively and as efficiently as possible without anything going in between. I would just love to get your thoughts on that. WJ: Whoa, there’s a lot there. CH: There is a lot. [laughs] WJ: Let’s look at a bunch of industries. I think there are examples of other industries where people aren’t focused on discovery and they’ve grown very large businesses. I’d say Shopify is a really good example of that. When you buy a T-shirt from someone’s Shopify store, Shopify doesn’t tell you to go buy a T-shirt at some other Shopify T-shirt store. In a world of discovery platforms, Shopify’s strategy is: We will be the SaaS [software-as-a-service] platform of all these stores, where the store owners have complete control over their business. 
So basically, we believe our core value and one of our real value propositions to musicians is that our product will strengthen the connection between you and your biggest fans. We will not degrade the connection between you and your biggest fans. And what I would argue, and what I’m saying in this piece, is that discovery platforms by their nature encourage the end user, the consumers, to keep looking at stuff or keep listening to stuff. That is the design; that’s the mousetrap. Just keep clicking, keep getting notifications, keep doing stuff — and by that design, it will always prioritize the engagement of the consumer over the needs of the person making the thing. That’s just the way they’re built. In a world of discovery platforms, we think the pendulum’s swinging the other direction for the next ten or fifteen years, and a platform that strengthens the connections between you and your fans is going to actually be a long-term, viable, high-growth business. That’s the pitch, if I were to be talking to, like, a VC [venture capitalist] in Silicon Valley: Hey, this is not a discovery-platform mousetrap that you’re used to seeing in 20 other businesses that you invest in. This is quite the opposite. It’s a platform of lots of little memberships, and what our product does is strengthen the relationship of our memberships. CH: Maybe that is an element of the branding or messaging challenge to artists, as well. I can imagine an emerging artist with maybe 500 listeners on Spotify — it’s a number, but it’s not that much — I can imagine an artist with that followership going to Patreon, expecting [Patreon] to be kind of a driving force in increasing listenership on streaming, because that is their priority. And so you’re saying that that is not Patreon’s role at all. It’s about strengthening those bonds that already do exist that artists and other creators have already built up elsewhere. Is that correct? WJ: That’s correct. Like, I’m a fan of Ben Folds, and Ben Folds is on Patreon. And he dictates the design of that membership. He’s got things that he does for his fans: one thing that he does is livestream him[self] drinking and smoking a cigar, and listening to vinyl in his house. He does that, like, once a month or whatever, and occasionally he does a post, like, “oh, I’m at this tour, I’m at wherever.” He’s defining what the membership looks like that connects him and his best fans. It’s not, “I have to go do a post every day at five o’clock to continue to rank high in the algorithm.” It’s: “Hey fans, this is the stuff I’m going to do in my private membership. Is that interesting to you? And if so, join!” And he’s had tons of success with that, and I see a bunch of other musicians do that, too. But again, it’s just a fundamentally different model than all the other things that exist out there, and I think that’s why we have this challenge of defining membership. The other big piece that’s challenging is a lot of people think a subscription and a membership are the same thing. And they’re really pretty different. I have a subscription to Netflix. I pay for it every month, and I get content, and I love that content. It’s really good. But I don’t feel like I’m a member of Netflix. I don’t go to a hangout and talk to people about my Netflix subscription, or do live Q&As with the head of product or something like that at Netflix. I subscribe to content on Netflix. So what Patreon is is a membership platform. 
And what that means as a patron, as someone who is a fan of a musician, is you are a part of that tribe. You have an inside view, you get the merch first, you get to understand what songs are coming out later, you can have creative input — those are all the kinds of things that happen in a membership that’s fundamentally different than a subscription. Back to what we’re trying to define here: we are defining membership on the web as a way to deliver really unique value to fans. CH: Just playing devil’s advocate — one thing that I can see a musician saying is, “Oh, I’m already giving all of these kinds of benefits on platforms like Instagram and Snap already.” And obviously the big disadvantage of relying on those platforms is that you’re not getting paid for it, most of the time — like you’re not getting paid, or people aren’t supporting you, in exchange for an Instagram Live video the same way that they would on Patreon. I’m just wondering if you could elaborate on that, in terms of the types of benefits that artists are giving their fans, or the way they’re interacting with them, that has not yet been covered or is not yet as effective in the myriad of other platforms that a lot of artists feel pressured to pay attention to, like Instagram and Snap. WJ: I’ll give you an example. I have a DJ friend of mine, who has not yet joined Patreon. She’s going to. [laughs] Or I’m going to die trying. Her name’s Honey Dijon. Do you know Honey? CH: I do, yeah. WJ: I DJ’d with her 25 years ago, back in Chicago, so we’ve known each other a very, very long time. And on Instagram, she has such a presence. Such a power. She has something to say. She’s a black trans woman, she’s deep in fashion, she’s always touring the world, she’s just such a force. I’ve known her for a while as a human, but also I’m just a fan, I guess, when I follow her on Instagram. And when I talk to her personally, I say, “gosh, Honey, you have such a force and the people that follow you, they really are interested in the things you have to say and all this nuance. Why do you give that away?” And she doesn’t have a very good answer. You know? It’s, like, “oh, yeah, gosh.” And I hear this from a lot of musicians. They’re just doing it for eyeballs, because they think that funds the other things that they do. Like, oh, if I have a lot of followers, that means I’ll get brand deals and that means I’ll get tour … so this is, like, a means to an end. But it doesn’t have to be. I guess that’s the way I would talk to any musician. You don’t have to give all that away to the ad platforms. You can do your Instagram thing twice a week, and then take some of your really special stuff back into your membership. You can still get that kind of top-of-funnel eyeballs, but you can then make a wonderful membership with your top fans. It’s not one or the other; it’s just the one with membership actually gets you paid a lot in a very predictable way. So that’s what I would say to any musician who’s got a strong following on one of those social platforms. CH: So on the flip side of that, something that I’ve also experienced firsthand that I know a lot of artists also experience is expectations versus reality of what it takes to run a successful membership program. Like you’re saying, on one hand, as an artist you’re not obligated to give everything away for free on these ad platforms. 
On the other hand, in the context of Patreon, if you're just giving stuff away, and that's it, maybe that can be effective, but I feel like there's a lot more to what makes a really memorable or successful membership program or initiative for an artist. I'm wondering if you could talk about what some of those elements might be, in terms of what people think goes into running a successful Patreon, versus what the actual reality is. WJ: Yeah, I think this is a critical facet of why creators do or do not join Patreon, especially musicians. Because a lot of times, folks from the outside look at it and they're like, "oh, it seems like a lot of work." I've heard that a bunch. It's interesting, because it's whatever you make it. You can literally just take the things that you currently do, and make those into the benefits of your membership, so that you do almost zero new work. That's a path I've seen a lot of successful creators take: okay, I'm going to post about some topic once a week, I'm going to write a song once a week — whatever the things they do, now I'm just going to put those in different membership tiers. Because the whole point of Patreon is for musicians to continue to be creative, and to do the things they love. It's not for you to take a second job. [laughs] You know what I mean? It was designed by Jack Conte, a YouTuber and musician, and now myself, a former DJ and electronic musician — so it wasn't built to create an extra layer of work for someone to do. It was built for someone to do the things they were going to do anyway creatively, and just have a great relationship with their strongest fans about that. But let's talk about some real benefits that people deliver, just because they're fun. So, I always like to say that there's unique tangible or intangible benefits that musicians or creators can deliver to their fans. And honestly, the tangible ones are pretty straightforward, and it's actually the intangible ones that I often find to be the ones the fans go the craziest for. Tangible benefits are exclusive content … [like] early access to a new song I'm writing, or a half-written thing that I want feedback on. Also, merch. We have a new merch product that helps you deliver merch to people on your membership. It's cool because we're not trying to be a merch store — Patreon does not have a storefront where anybody can walk up and buy merchandise — but we've built a merch product that helps someone deliver unique merch to somebody within a membership. Maybe after three months or after six months, you get the cool, purple hoodie if you've been on Patreon for a year that only a super fan can get — that's a type of merch. So merch and exclusive content are examples of tangible benefits you can have in your membership, and we've built products and services that help you do that stuff really easy … Those are straightforward. Now, there's all kinds of what I would call "intangible benefits." One intangible benefit is recognition. It's amazing how far a little bit of recognition goes to making fans feel really special. You can imagine a musician saying, "Hey, thank you to so-and-so who's been a fan for over a year on my membership," just on Twitter or something, or in a public place, and putting their name on that — people love that. 
A lot of musicians do name-in-credits: they'll be, like, hey, these are my patrons that support me and helped me do this album, and they do a whole page with the patrons who supported them. And then we also have communities we sync up with where you can get a little badge for being a patron for a period of time. Those are all what I would call "recognition-based" benefits. And people love them, and they're pretty low output from the creator. The musician doesn't have to do a lot of work; you just log in once or twice a month, and you give some shout-outs to folks who've been longtime fans. Another type of [intangible] benefit is involvement. A lot of fans just want to see how the sausage gets made, and they would love to give feedback, even if the artist doesn't do it. I see musicians ask about half-finished songs: "Hey, what do you think about this direction I'm taking this?" I see authors ask their fans, "Hey, which direction should I take this character next season?" And there are polls, and things like that, where creators can ask their fans to take a quick survey so they can get some direction from them. These are all the things that are kind of creative involvement. Another type of benefit is a gated community. So, a lot of creators use Discourse or Discord or one of these community products we have; we also have a Reddit integration, and they basically create a community that you can only be a part of if you're a member. And in that community, you have people talking back and forth about insider information, and usually there's a community moderator who's been a longtime fan of that artist or musician. The last type of benefit I'll talk about is access to the creator. A lot of creators will do a monthly video chat with a small crew of their patrons, or they'll do group chats in Slack or something. A lot of musicians do patron-only shows; they'll go to a town, they'll figure out who the patrons are in that little town, and that musician will go to a bar that holds 50 or 60 people and it'll be a patron-only show. These are all forms of access that someone couldn't get if they weren't part of a membership. With all these, you can kind of see how there's ways to do them that are very lightweight for the musician — you could just do one or two touchpoints a month, and keep your fanbase super engaged and interested. Okay, I talked for a long time there. Does that all make sense? CH: Yes, it does. And I'm really glad to hear you say all of that because those are all things that I'm trying out in some way with my own page, and it's been super interesting to try to navigate tangible versus intangible benefits. I actually initially started my Patreon to focus on intangible benefits. Historically, the value of those types of benefits has been really hard to measure. Maybe with the help of a membership platform like Patreon, it is easier to measure, because now you're able to quantify how many people would be willing to pay for access to your songwriting process, if you're an artist — or, in my case, the process of putting together an article. I don't know if that's something you've thought about or tried to productize at all, in terms of measuring the monetary value of intangible benefits. WJ: Yeah. We are currently in the process of ensuring that all of the benefits on the entire Patreon platform are structured. 
In the early days, creators would just sort of add whatever blob of text into Patreon, around their benefits, and currently we’re moving towards a model where 95% of benefits will fall into some type of taxonomy. This isn’t like a rocketship future, this is in the next six to 12 months. Any new creator that signs up [can see]: hey, these are the benefits that systematically work really well for creators like you. Oh, cool, you’re a rock musician? Here are the benefits other rock musicians have a lot of success with, that drive patron engagement and retention. You can do whatever you want, but here’s our recommendation. When you think about the product, that’s the core data that we sit on that’s really exciting and interesting: we know, across thousands and thousands of creators, which are the best benefits and why they work. CH: And I guess that enables a very different kind of discovery, in terms of what you’re recommending to a new musician of a certain style who may be looking for recommendations on what to do. It’s like discovery in the sense of just how to run a successful and effective business. WJ: Correct, that’s it. And, you know, we’re a platform, so as these benefits get more and more structured, more and more third parties will attach to our platform. Most ad platforms at this point are closed; they don’t play well with others, because they’re trying to keep the traffic on their site. Facebook and Google are trying to keep you in their world, and so they don’t necessarily have a lot of partnerships. Whereas we’re sort of an open door and we have a platform strategy, where we want to have as many partners as possible. For example, Bonjoro is a partner, and what Bonjoro does is it just links up to our API and it says, hey, these are your fans that have crossed the one-month threshold or the three-month threshold, and you can do a quick video thank you to them. And when fans get a personal “thank you” from a creator that’s in a video delivered right to their phone, they love it, and they’ll be a fan for years. It just takes those nudges and those little thoughtful moments that strengthen the connection between fans and creators, that we’re designed to do really, really well. CH: Great. And one more higher-level organization-related question comes to mind, going back to what you wrote about. I’m wondering if you think there’s ever a situation where discovery- and membership-oriented mechanisms can coexist and be baked into the same product. One example that a lot of people have speculated about concerns PledgeMusic, which is going through all sorts of problems right now in terms of its future. For those who aren’t familiar, PledgeMusic was a crowdfunding and direct-to-fan sales and engagement platform for artists, but they failed to pay a lot of artists the money that they raised directly from fans on time, because they were abusing and pulling those funds back into running their own business, and there was a whole controversy around that. [EDITOR’S NOTE: PledgeMusic’s website went offline in late July amidst bankruptcy proceedings in the U.K.] And so their future is uncertain, and a lot of people have suggested that companies like Soundcloud or Spotify acquire PledgeMusic. Their logic behind that was Soundcloud and especially Spotify are arguably not fan-first platforms. They’re discovery platforms. 
They’re promoting this model of all-you-can-eat, we have millions of tracks and we want you to keep discovering new music, rather than necessarily trying to strengthen bonds between artists and fans. And so by owning a company like Pledge, you can start to really make a meaningful dent in that space and do things like improve the merch product on your platform … which right now is relegated all the way to the bottom of the [artist’s] profile page [on Spotify]. I’m wondering, at a higher level: Do you think that’s possible, in terms of a more discovery- and eyeballs-oriented platform running a successful membership business within the same product? Or do you think that fundamentally they have to be different? WJ: Yeah, I mean … I have a weird background: I was a musician for 15 years or so, and then I was one of the founding team of a company called Beatport [in 2002, and since Beatport I’ve been head of product at a number of organizations, [including] Shutterstock, Optimizely, Hired and now Patreon. And so I have this really weird mix of having a really deep connection with being a musician, and having seen multiple SaaS business models, marketplaces and discovery platforms. I’ve worked at all of them at this point. When you think about companies and how they operate — big financial organizations — really all you have to do to understand their priorities is follow the money. It’s very simple math. What we’re doing, what Patreon’s doing, to answer your question directly, is prioritizing the needs of musicians and creators in general. When you log into this product and you use it, it’s designed for the creator first. And our business model: we take a percentage of creators’ revenue, which means that we’re only successful when the creator is successful. So if you, Cherie, make zero dollars, we’re going to take between five and 12% of zero dollars. If we were to ever go public, or if you were to look at Patreon in five years, you would know what our priorities are. You’d be able to go, well, their first priority is creators making money. Like, they care a lot about that. Any time you do “discovery,” you just have to know where the priorities lie. Like, YouTube’s not a bad company — those are good people. And they care about creators. I believe that. But if you look at their financial model and their business, you realize very quickly that creators are the third priority. The number one priority is advertisers. Period. And the number two priority is end users, because they need the end users to get the advertisers. You can tell by demonetization that when the shit hits the fan, they got to change the algorithms, because advertisers are the ones paying. And then the next most important thing is to make sure people stay watching video. And then the third most important thing is to make sure that people make stuff for people to watch. So if you just look at the core business model of a discovery platform, the people who make stuff are the third priority. Spotify cares most about end users listening and advertisers selling. Oh, and retaining on subscriptions. Right? But all those things are before musicians monetizing. And that’s why Daniel [Ek] couldn’t go out and say, “we spent the first ten years on being a great discovery platform and now we’re going to switch to creators.” Really? Your entire business model is predicated on subscriptions and ads. You know what I mean? It’s sort of like Facebook: “For the first ten years, we focused on ads. 
Now we’re going to change everything and turn it upside down.” I don’t think you actually are, because you have to answer to shareholders, and shareholders want to see the current business grow. [Mark] Zuck[erberg] would have to say, “Hey everybody, actually, hold on a second. Don’t worry about this whole ad business that drives $50 billion a year. We’re going to do this other thing. It’s going to be great. Now, it’s going to take about ten years for it to grow, but trust us.” [laughs] Just follow the money, and you’ll understand the priorities of organizations. Patreon is making this huge bet — it’s a crazy bet, honestly, but I deeply believe in it — that we can be creator-first and grow a real business. CH: This is the last question before the final segment, but I did want to talk about this element of being creator-first — specifically in the context of building a SaaS that appeals to them. In a previous interview with TechCrunch — they published a really fascinating deep-dive on Patreon and where they’re going and how they got here — you’re quoted as saying that Patreon’s goals are to build “the world’s best membership SaaS product for creators.” This is something that I’ve also noticed a lot of other music companies are trying to compete on. I wrote recently about Stem, which is a music distribution platform that recently pivoted from more or less fully long-tail, to essentially trying to build effective business and accounting software for a select group of independent artists and their teams. And so this is, I think, a high growth area for a lot of companies now. I would love if you could share what you think is missing right now from this landscape, in terms of membership and business-management SaaS for artists — specifically for musicians — and how Patreon specifically might fill that gap. WJ: Yeah, Stem in particular we are currently not at all competitive with. A lot of the SaaS products we’re seeing out there are kind of this model of how to do rights management and how to pay people in your team. We will become competitive someday, and the reason why is because an obvious future for Patreon is to move further into the back office. Because what Patreon represents for musicians looks and smells like a steady paycheck. You get this monthly, recurring thing — it varies a little month to month, maybe you got a few extra fans this month, maybe you lost a few more this month — but it starts to resemble this really steady predictable income, and if you have the predictable income there on Patreon, there’s so much more we can do for musicians once that’s solidified. That’s pretty much the future roadmap. So you can imagine the world in five or ten years — not even five or ten years — where Patreon gets into capital, for example, because we have a really predictable income stream. We could go to a bank, and say: I can tell you what this musician’s going to make for the next three years, within 5% error, because it’s very steady and predictable. So we can get into loans, we can get into capital, kind of the way Square Capital does. So we will move further into the back office in the future, because of the way that the revenue component works for creators … If creators have a big project they want to do, there’s no reason we can’t be the source for them to loan the money against their revenue stream over longer periods of time. 
I do imagine us getting somewhat competitive with folks, but most of the SaaS products I’ve seen out there are starting in the purely back office, where it’s like accounting and paying your team — whereas Patreon is strengthening the fans and creators, so the way you post, the way you deliver benefits and the way you run a membership. So longer-term, I think we’ll be competitive; for the next three years, I don’t see that. Did that answer your question? CH: Yeah, totally. I think it’s definitely a compelling proposition given that even today, it’s still really difficult for an artist to get a clear picture of their finances, generally. And this is a lot of what, as you’re saying, a lot of other music-oriented SaaS products are working on. But starting with a point of predictability that just happens to not be tied to streaming royalties and then building up from there — that’s a super interesting thought. WJ: I mean, Jack [Conte] and I have the same story, but a decade apart … I was a DJ, and I had a good year sometime in the late ’90s, I think I made over $100,000 — which, for a DJ playing vinyl at nightclubs and raves, was kind of a lot. It felt like a lot. I was living in Denver, Colorado at the time, and that year I wanted to buy a home. And I was, like, I definitely had more than enough for the down payment, and I had been a successful DJ for almost a decade at that point — but I couldn’t get a loan. The bank wouldn’t loan me money as an artist because I didn’t have pay stubs. I had made $100,000 that year off of my various revenue streams from touring and selling music, and someone who worked at the post office and made $30,000 a year could get a loan from the bank, because it was predictable, and I couldn’t. And that was, as an artist, this moment for me where I was, like, “oh shit, I’m not in the system.” [laughs] I’m this weird thing that the regular financial world doesn’t recognize as legitimate. And the future of Patreon, of course, is legitimizing artists in a way that’s super compelling. So that’s something that I’m personally passionate about, because I never for a minute felt like I was a big risk for a bank. I think that was, what, 1998 or 1999 — so twenty years later, here we are, solving that problem. CH: Yeah, that’s amazing. In the interest of time, I’d love to transition to the last segment of over- and underrated music news. I’d love for you to start because I know you have a super interesting and topical one that you want to bring up. WJ: Yeah, I guess for me … every single podcast app on the planet right now is building a gated subscription, and I just think this is really overrated. Like, do we all need seven podcast players? It’s just not going to happen. I literally see a new one every week, and some of them are like subscription[-based], and some are sort of membership-y, and some of them are sorta social-y — but honestly, consumers are simply not going to use six or seven podcast apps. There’s no chance. [EDITOR’S NOTE: Examples of these paid podcast platforms include Luminary, Brew and Slate’s Supporting Cast.] This feels like a really hot trend right now, and I’d like to think Patreon has something to do with that, because we’re decently successful and membership is getting out there, and I think we’re seeing power move back to the content creators. So those trends are all true. But what I don’t think is going to happen, the overrated part, is I don’t think consumers are going to start to use a bunch of different apps to start to listen to podcasts. 
They’re going to want that in one or two spots. CH: In that sense, it really strongly parallels the TV and film space. This is the same problem that TV and over-the-top, on-demand video streaming services have faced for several years. I think in TV and film, maybe because they’ve been around for longer, there are people who pay for multiple [services] — like, who will pay for Netflix, but also Hulu and HBO. But I also don’t know people who are paying for six or more different services. And maybe what these podcast apps are trying to do is take a similar path, saying, “oh, we can create premium content just like HBO does for video; we’re going adopt a similar business model to an HBO or Netflix.” But yeah, just by the nature of the [podcast] format and its history, and how it’s been made, historically, very widely accessible, [and] the process of creating it is very democratized — yeah, I agree, it doesn’t really align with subscription that well. WJ: It’ll be fun to watch it all play out, because there’ll be a lot of carcasses over the next 18 months. And there will be a few winners who manage to get traction because they’re able to get enough consumers to drive it. But, boy, it’s sort of like more discovery products in a world of discovery products. CH: Exactly, yeah. To bring that full-circle. [laughs] So the piece [of news] that I had in mind was pretty recent. [EDITOR’S NOTE: this was from late June 2019, which can still be considered recent-ish.] It’s this news that the Jonas Brothers, who just released a new album, also now have their own vinyl membership club. And I am visualizing “club” in massive air-quotes, because it’s just, like, a vinyl bundle. The way that they’ve priced it is super premium. The lowest price is $399. WJ: Whoa. CH: Oh yeah. And in exchange for that, you get eight different vinyl records, so there is a lot of catalog involved. [You get] ten different singles, you get a customized slipmat for your turntable if you own one, and you also get a bunch of posters. The only other tier above that is $599, and that includes a bunch of other exclusive content, I think. It’s not really a club, because I actually don’t think there’s any community aspect to it. And they’re marketing it as a subscription, but the only part of it that’s a subscription is the fact that you could opt in to pay in four different segments of payments, instead of just paying it all upfront. But that’s not a subscription. That’s just, like, a payment cycle that’s more spread out. In general, I think it’s overrated because — I will admit I listened to the Jonas Brothers a bunch when they were first starting out, but I’m not the kind of person to be buying a turntable and listening to vinyl records personally. And not to generalize my very specific, anecdotal experience, but I doubt that the current Jonas Brothers fanbase is willing to pay that high of a premium for catalog in a streaming age. Especially because the material is also not coming immediately; it’s going to be spread out over a much longer period of time. So, yeah, I’m not super convinced. [EDITOR’S NOTE: There is now a wait list, according to the website, so people did buy the bundles after all.] To me, it’s almost like an overpriced VIP experience at a live show. I personally have some qualms about that — like, paying several hundred dollars just to get a handshake and a photo with someone like the Jonas Brothers — but, yeah. Those are my two cents. WJ: I mean, it sounds like a record label tried to do a subscription. CH: Yes. 
WJ: Because it’s not really ongoing value … it’s sort of like a lumpy bit of value smoothed out over six to eight months. If you’re paying $399 a month, does it have a cap? Does it end? Or is it just forever? CH: It’s not $399 a month, it’s just a one-time fee. WJ: Oh. It’s a one-time fee, that you can pay in increments. CH: Yes. So it’s not even a subscription, is what I’m trying to get at. WJ: Yeah, that’s a really interesting model. I’m sure they’ll get a few takers. That’s cool. But the takers of such a thing, I think, are a better fit for a true membership. Like, if you’re a person willing to spend 600 bucks for that stuff, you’re probably really interested in the Jonas Brothers. [laughs] CH: That’s so true — and from the label or artist’s perspective, you should be trying to engage them more and capturing more of that value more regularly. I totally agree. WJ: Yeah, make it 20 bucks a month for, like, years. That’s a really highly-engaged segment of your audience that you’ll probably want to be close to. You know, one of the reasons Patreon doesn’t always work, by the way, is that … if you want Patreon to work, you have to like your top fans. You have to, like, want to ping them every once in a while, send them a post, say hello, jump into a chat room. And you know, a lot — not a lot, but a percentage of artists, I’d say less than a quarter, really don’t want to engage with their fans. [laughs] Like, I’m just thinking about this Jonas Brothers thing: it’s not like they are engaging with their fans. It’s just some stuff somebody made for their fans. Like, if I’m really into the Jonas Brothers, it’s not like, “Oh, cool, I get to know more about the Jonas Brothers.” No, you just get some stuff. CH: Right. Just objects. Speaking of more intangible benefits that aren’t tied to the artists, it’s not like you’re getting to, like, talk with any of the members of that group. WJ: Yeah. Sometimes I’ll meet a creator or musician who’s very popular, and then I’ll ask them about their fan relationship. I’m like, how often do you chat with your fans? And occasionally I meet someone and I’m like, “oh, you shouldn’t do a membership.” Like, actually, this isn’t for you. Because if you’re, like, disgusted by them, or you don’t want to talk to them, like, that’s probably not going to work. You have to really like your fans. [laughs] CH: Yeah, and it’s good that you say that, because membership is definitely not a one-size-fits-all [model] — just like streaming, arguably, is not a one-size-fits-all [model] for every artist. It all depends on what you prefer to do, what your existing relationship [with your fans] is like and what your goals are. All of that. WJ: Yeah. Streaming is awesome for the .01% who get massive plays … tt’s just not very good for 99% of musicians. I think membership, when I think about the market size, is a really good business model for, like, 70% of musicians. And maybe not a great business model for the top 1%, unless they like that fan relationship … because if you’re, like, Taylor Swift, your top fans can be kind of insane, right? You may not actually want to hang out with them. [laughs] I don’t know. CH: Yeah. I think Taylor Swift had her own social app a while ago that ended up not getting that much traction, for multiple reasons that we just touched upon now. One, she just posted, like, a couple of selfie videos and that was it, so engagement definitely went down after a while from her. 
Two, there were a couple fans who were hyper-engaged, but then also there was a point where some fans were hostile to other fans, and I guess Taylor’s team either didn’t care or they just didn’t know how to handle that. Those are things that, as [you’re] running a membership platform, you will have to grapple with in terms of very hands-on customer service. WJ: Totally. And, like, Amanda Palmer makes it work really well, because of Hayley [Rosenblum, Amanda Palmer’s manager]. She has a person who manages that membership incredibly well, and she’s got a really high-volume, large membership, [such] that it works wonderfully … Hayley, like, runs that thing, and Amanda swoops in to do really cool stuff with her fans, and Hayley makes sure Amanda comes in at the right points. But yeah, the bigger the artist you are, the more careful consideration you need to have about the level of engagement with your fans, and when you as an artist need to actually go in and engage. CH: Yeah, absolutely. Thank you so much for joining and for chatting, this was super interesting. WJ: Oh, I love talking about it. And as a former musician, I would love for music to be number one or number two on Patreon. Video and podcasting are just running away right now, they’re growing so fast — but music is growing fast too, and I do think there’s going to be a tipping point in the next couple years because it’s such a good model for most musicians. As a DJ in the ’90s, I had a thousand true fans, easily. Probably 5,000 true fans. Any time I would play an event or a club, I would have twenty people around the booth who knew every song I’d ever made and came out to see me. Any time you’re at a performance at a venue, people have varying degrees of connection to your history as an artist. So if I’m at a venue with 1,000 people, maybe 100 people know exactly who I am and they’re here for that and they’re like, “Wyatt, are you going to play that one song you made a year ago?” Those are my superfans, and they would love to have a deep connection with me as an artist on a platform like Patreon, and that would’ve been really nice back then. And I’d say for a lot of artists, it’s the same way. You know you have that level of fandom, and so this is just an opportunity for you as a musician to monetize them better. CH: And to have an easier time just interacting with them in a more central place. WJ: Yeah. Something that strengthens the connection and doesn’t get in between it. CH: Yeah, exactly. Awesome. Thank you again, appreciate your time. WJ: Thank you so much! Have a great day.
https://medium.com/@cheriehu42/why-music-isnt-a-top-two-category-on-patreon-yet-c734a71d8959
['Cherie Hu']
2019-08-15 03:25:43.876000+00:00
['Business', 'Startup', 'Music', 'Crowdfunding', 'Technology']
221
My git-icious journey at IMG
It has been almost a year since I became a member of IMG as a developer. Looking back, I feel that a major part of my journey was quite analogous to the most used git commands. In this blog, I will take you through my awesome journey at IMG and how I draw parallels to the most frequently used git commands. Grab a cup of tea/coffee because we have a long journey to cover. Git Before beginning, let's have a basic idea about git. Git is a free and open-source distributed version control system. Its main purpose is to keep track of projects and files as they change over time, with changes coming from different users. Git stores information about the project’s progress in a repository. Let’s begin git init This is the most crucial command. In terms of its usage in git, this command is used to create a repository, after which git can perform all its magic. In the same way, initiating is the most important step in your development journey. I didn’t have any previous coding experience before coming to IITR. And when I came to know about IMG, and the huge competition one has to go through to get into it, I was a bit scared initially. But even so, I decided to give it a try and made up my mind that I would try my best to get in-it. So, always give everything a try, without caring too much about the result. It all depends on your efforts: the more you make, the more you achieve. git clone In terms of its usage in git, this command is used to obtain a repository from an existing URL. In the same way, I contacted many seniors (existing URL :p) of IMG so that I could get some guidance on how to start, the best resources to learn to code, and many other such things. IMG follows the principle: “Help will always be given at IMG, to those who ask for it”. So if you don’t ask for help then even IMG can’t help you. So always feel free to reach out to any of us, and the people here will always love to help you. git add This command is used to add one or more files to the staging area. In my case, it was to add new coding skills. Remember that learning any new skill is a bit difficult in the beginning, but as you spend more and more time on learning, you will slowly get comfortable with it. So never hesitate to learn anything new. That’s what I learnt while preparing for recruitment. git commit This command records or snapshots the file permanently in the version history. In my journey, this was the time when I had given all my efforts and appeared for the recruitment process. With all the efforts that I had made to get into IMG, I was all set to make my name permanent on the IMGians list. And then this email from IMG after a nerve-wracking process finally made my day. git merge This command is used to merge a different branch into your active branch. In my case, it was to learn from the very-awesome people at IMG and merge their skills into my skill set. IMG is not limited to only web-development. Once you step into IMG, there’s no end to learning new skills. Here you will meet people who are excellent in any of the domains of coding you can think of, be it competitive programming, mobile development, information security, dev-ops and a plethora of other things. There’s no doubt that you will find the inhabitants of IMG Lab to be fabulously dedicated, hard-working people.
Still, here we work on the “Work hard, party harder” principle, so we do a hell of a lot of things other than coding, like mafia sessions, movie nights, a musical workspace, and ‘Tech Fridays’ — lectures on topics like cryptography, probability, stock-market analysis etc. — to chill out. Winding Up To conclude, I believe that IMG has played the most important role in making me the person I am today. From having no idea of programming to building awesome web and mobile apps, learning a plethora of new languages and other skills and getting the opportunity to work, learn and enjoy with the most awesome people here, it was a phenomenal journey for me. So, all that’s standing between you and IMG is just a git init: grab your terminal, hit the command, show us your dedication and become a part of us. See you at the recruitments! :)
https://medium.com/img-iit-roorkee/my-git-icious-journey-at-img-f18d1a8a99ef
['Sparsh Agrawal']
2020-12-19 14:35:27.595000+00:00
['Recruitment', 'Technology', 'Git', 'Web Development', 'Experience']
222
Battle Royale with Cheese (Headphones Edition): Beats Studio 3 Vs JLab Studio Pro
Battle Royale with Cheese (Headphones Edition): Beats Studio 3 Vs JLab Studio Pro Top of the line vs. bargain bin How much money do you need to spend on headphones? That’s a question I found myself asking on Black Friday, as I skimmed the deals on Best Buy’s website. After finally, finally settling on which devices I wanted to use on a day-to-day basis, I got back to the original tech “splurge” that had me frequenting Best Buy at the beginning of the year (back in the good ol’ days, when hand sanitizer was not something I had to think about constantly). Before I got into my indecisive phase with computers that has fueled the majority of my tech stories this year, headphones were the thing I was busy comparing; namely, I was looking for over-the-ear (or perhaps on-ear) headphones that I could use for hours at a time while working without the fatigue (or Tinnitus flare-ups) that I got from in-ear buds. And I looked at a plethora of headphones, from the cheapest JLab headphones to the more expensive options from Beats, JBL, and others. But a year later, when I finally came back around to wanting some over-ear headphones, only one brand came to mind again: Beats. This wasn’t because they were the best on the market; I knew that headphones from Bose and Sony would likely have better sound than Beats, which tend to favor bass a little more than some reviewers like. But being an Apple brand, they came with some unique benefits for someone who has gone all-in with Apple products as I have. And I remembered that they were quite comfortable. But I can never, ever be satisfied with buying one product and sticking with it. If you’ve read my other tech articles, you’ll know that my superpower is massive indecisiveness; I tried out a whopping 10 computers before deciding that the new MacBook Air with M1 was right for me, and I tried 6 different tablets before determining that the diminutive iPad Mini was all I needed in that space. So, naturally, I was gonna try a few different options before I settled down with one. In fact, even before we get to the Beats that I’m comparing in this story, I tried out the Beats Solo Pro, which Best Buy had on sale for $169 at the time. And I loved everything about them, from the color to the noise canceling. But I didn’t love how they made my ears feel like they were being pressed into my skull after a few minutes; in fact, when I took them off after an hour or so, my ears were in tremendous pain (I do wear glasses, which may be partly to blame as well). So this comparison will feature, primarily, the older Beats Studio 3 over-ear headphones, which still retail for $349 (but were on sale at Target for $175 when I got them). I’ve always had a soft spot for JLab, though. Their products are just so damn cheap, while still being good quality; I bought the JLab Go Air headphones for $30 and I never use them at all (I usually go for my AirPods when I want true-wireless portability), but I love that I have them just in case I need them. I have JLab’s retro Bluetooth headphones ($20) because they make me feel like Starlord. And I have a pair of JLab’s Studio on-ear headphones (which were also only $30) that I like to throw into my bag for trips. JLab hasn’t done that many over-ear headphones, however. Last time, I tried what was really their only major offering- their $100 Flex Sport headphones- but they just felt cheap by comparison to their other products.
And even though I bought a pair of their Omni headphones (which they appear not to make anymore), I rarely use them, as I found the controls a little finicky and the ear-cups a little too big. But lo and behold, when I decided to go look at their website, I found that they now offered what they call Studio Pro headphones- a larger, over-ear version of the regular Studio headphones, with ear-cups that are large enough to cover your ears but small enough to still be portable (they retain the ear-shaped design of the Flex Sport while replacing the uncomfortable fabric with a more plush, leathery material). And they were $40. It was everything I wanted at a price I could easily stomach (especially since I’d just purchased the Beats). So, the only real question I had was this: which was better? Were the Beats worth the extra $310 (or extra $135, if you get them on sale)? There’s only one way to find out. FIGHT!!!! Build Quality Let’s start with the obvious: the design. Of course, there’s only so much you can do with the design of over-ear headphones, and, well, Beats did it with the Studio 3. There might be a reason Apple hasn’t bothered to update these in 3 years. The matte, soft-touch plastic feels sturdy and premium. The cushions are thick and soft. The controls are nondescript, with buttons built into the Beats logo and the surrounding ring of the left side that looks identical to the aesthetics of the right (although I, like other reviewers, do wish that this ring worked more like the scroll ring on the old iPods). Tiny, white LED lights indicate the battery life and whether ANC is turned on. My gripes with the design of the Beats are few, but they are, unfortunately, present. The power button, to me, sticks out like a sore thumb compared to the hidden play/pause and volume controls, and yet it is so small that it is difficult to press. I find myself having to try the double-press to enable/disable ANC multiple times to get the function to work, and a couple of times I’ve accidentally turned off the headphones rather than achieving the settings change. I’m also not the biggest fan of the folding function, but that is entirely personal; the first time I unfolded the Beats Solo Pro (which have a very similar folding mechanism), I horribly pinched my hand in the mechanism and it was painful as all hell. Given that they fold right in the place where I am prone to grabbing headphones, I feel like this is going to happen again. And the headband, while feeling entirely premium, does also feel rather fragile; I’ve seen plenty of Beats that were snapped in half hanging on the displays at Best Buy in the past, and they definitely feel like one wrong move will irreparably damage them (but you can get Apple Care for them, so…). And while I love that the Beats include a wire to use with a headphone jack, they still need battery power to use this, which I think is dumb; I used the JLab Omni headphones for weeks with a dead battery and a 3.5mm cable before I finally decided to charge them. And speaking of charging, well, I know I said Apple hasn’t updated the design of these things in 3 years, but they could at least have released a refreshed version that used USB-C or even Lightning instead of the supremely outdated Micro USB. Seriously, I hate the idea of having to carry a separate cable for these headphones. With design, more often than not, you get what you pay for. That said, while the JLab Studio Pro headphones definitely look cheaper than the Beats, they don’t look bad at all. 
As mentioned earlier, the ear-cups have a more tapered design, mimicking the shape of your ears more closely, and there’s a nice blue liner inside the cups for accent. The cushions aren’t as thick, but honestly, wearing them feels more comfortable than the Beats, which have a tendency to feel like they are squeezing my head a bit (though nowhere near as unbearably as their Solo Pro cousins). The power and volume controls are located on the rear of the right ear-cup, and while definitely more noticeable, they are somewhat easier to find with your finger (I often forget which side of the Beats house the controls, and since I’m right-handed, I instinctively try to press the right Beats logo to play/pause, which does absolutely nothing). The area where you’ll really notice that the JLab headphones are a cheaper product is in how the ear-cups are connected to the headband. A thin metal wire holds each cup and serves as the extension for fitting them on your head. It feels sturdy, but I can imagine that the wire could get pulled out or misshapen in a bag if you aren’t careful. And unlike the Beats (and other JLab headphones), these don’t come with any sort of carrying bag or case to protect them (though I’ll never use the one that comes with the Beats Studio 3; it is bulky as hell). But for me, this wire design has one benefit: since it juts out over the folding mechanism, I don’t think I’ll ever have to worry about pinching my hand. As for the headband, while it is thinner in the realm of padding than the Beats, it feels a bit sturdier; I don’t exactly know what is inside, but I’d wager it would withstand being bent out of shape more than the Beats. Speaking of wires, though, the cables that connect the ear-cups to provide power are exposed and just kind of float between the ear-cups and headband, whereas the cable for the Beats sits flush inside the headband. Unlike the JLab Omni, the Studio Pro don’t have an adapter cable for use with the ever-disappearing headphone jack, but unlike the Beats, they charge with USB-C. It would be easy to say the Beats Studio 3 are the winners here because they do look nicer than the JLab Studio Pro. But I don’t think they look so much better as to justify the way higher price tag. I think that despite looking a little cheaper, the JLab Studio Pro look damn good for $40 headphones, and they feel good, too. And honestly, if I had to pay $40 again in a year to replace them because the wire connecting the ear-cups to the headband got bent or broken, that would still be cheaper than buying the Beats Studio 3 once. The JLab Studio Pro headphones come in matte black, while the Beats Studio 3 come in a variety of colors, including matte black, black or white with gold accents, red, blue, and a few other exclusives depending on where you buy them, which is nice if you want your headphones to stand out (for my comparison, I am looking at the matte black). Winner: JLab Studio Pro. The Beats are nicer, but as I said, I don’t think they are $310 nicer. $40 gets you some damn nice-looking headphones from JLab, and that’s nothing to shake a stick at. Sound Quality Beats has, for a long time, thrived on their sound profile. They can be a little bass-heavy for some reviewers, but for me, they sound excellent. End of review. Just kidding. I love how the Beats sound; it is the primary reason I revisited them a year after first trying them out.
In fact, I made a playlist on Apple Music called “Fantastic Beats and Where to Find Them” that is filled with songs that I think sound just perfect on the Beats Studio 3 (and others, like the Solo Pro and even the wired Beats EP). But the sound profile was only half of the reason I wanted to give the Studio 3 (and the Solo Pro before them) a shot; the other was ANC. 2020 has been a bitch of a year, and one of the side effects is that I’m now working from home within earshot of my wife and our television. Most days, I can focus on work, but lately, I had wanted to find some noise-canceling headphones to help me pay attention to what I’m working on (I’m not saying my wife is loud… but the TV can be). Up until now, I’ve used Apple’s AirPods Pro or a pair of Sony in-ear headphones that have passive noise canceling. The problem there was that in-ear headphones began to hurt after an hour or two, or the AirPods Pro would fall out (I had to get some third-party ear-tips to fix this), or the headphones just wouldn’t last the full 8 hours I needed them for. I found myself constantly taking them out and not using them. So I knew I needed something that was on-ear or over-the-ear. I preferred the latter because I find over-the-ear to be more comfortable for long periods of time. Since the Beats Studio 3 have Active Noise Cancelling, just like the AirPods Pro, I decided that I needed to give this a shot. And well… it works. Kind of. When sitting in the same room with the TV on at our normal volume, it reduces- but doesn’t eliminate- the TV sound. This means that whatever I’m listening to can generally be played at a lower volume, which is better for my ears and better for focusing on work rather than my music. The ANC on the Beats isn’t quite as good as the ANC on the AirPods Pro or other headphones on the market, but it is good enough for my needs, and the difference is definitely noticeable when you turn it off. It isn’t all good news regarding ANC, at least not for me. When I first got the AirPods Pro, I found that ANC was giving me headaches after using it for an hour or two, and while Apple seems to have fixed this a little bit with the AirPods, I’ve found that this is present with the Beats as well. It doesn’t happen all the time, and it is a very minor headache, but it is noticeable, going down into my neck and making my head feel like it is ever so slightly in a vice (and one that doesn’t go away immediately after I take off the headphones). I chalk this up to my Tinnitus more than an issue with the ANC itself, but I know I’m not the only one who has complained about headaches with ANC, so it is worth bringing up if you have had similar experiences with other ANC headphones. The JLab Studio Pro, simply put, do not have ANC. And I’m not bummed by that. I tried JLab’s Studio ANC headphones last year and I found that they were so-so; the ANC worked decently well, but it sacrificed the quality of the sound coming from the headphones themselves. And it was nowhere near as effective as the ANC on the Beats. That said, even though the Studio Pro headphones are only passively canceling noise (and they don’t passively cancel a whole lot because they aren’t quite as snug as the Beats), once I have something playing, I can only barely hear the TV in the background, and I can focus on my work just about as easily as I can with the Beats with the ANC turned on. Out of the box, the JLab headphones don’t sound quite as good as the Beats. 
But JLab has a few sound profiles that you can switch to- JLab Signature, Balanced, and Bass Boost. When comparing them to the Beats Studio 3, I prefer Bass Boost, and I honestly can’t really tell any major difference between them on this profile. Maybe your ears are more discerning than mine, and maybe, maybe the Beats sound a hair better. But- and this is turning into a recurring theme with this comparison- they definitely don’t sound $310 better. Not by the longest of longshots. I don’t know what voodoo JLab does to get sound this good out of $40 headphones, but they need to keep up the good work. JLab also has an app that features a Burn-In tool to help tune your headphones for even better sound (though I suppose you could also use this app with the Beats if you wanted to). It is worth noting that the Beats do get louder than the JLab Studio Pro; in order to achieve the same sound that I got out of the Beats at around 50% loudness on my iPhone, I had to have the Studio Pro volume up to around 65–70%. If you want the max volume out of your headphones, the Beats are going to deliver it better, but for the health of your eardrums you really shouldn’t listen to any pair of headphones at max volume for too long; to reduce the risk of hearing loss, you shouldn’t go above 80 decibels for more than 40 hours a week. According to the Apple Health app, at around 50% volume, the Beats were delivering around 71 decibels, and at max volume, I was able to get it to 87 decibels (my volume settings on my iPhone are set to max out at 90 decibels). I was unable to get the JLab headphones to report to the Apple Health app (more on that in a moment), so I had to get tricky; I played the same song at the equivalent volume (well, the 65–70% to achieve the same loudness) next to my Apple Watch, and received a similar 73 decibels (this may not have been as accurate as the Apple Health app, but I did the same thing with the Beats as a control to make sure my watch was providing similar results to what Apple Health was recording). Likewise, I was able to cap it at 87 decibels with my current settings. Sidebar: If you have an iPhone and hearing loss is a concern of yours, you can turn on alerts in the Health app to notify you when you’ve been listening to loud music for too long. The final thing I want to talk about with these headphones regarding sound quality is their connectivity. Both use Bluetooth; however, if you are using Apple products, the Beats definitely have a leg up in Apple’s W1 chip. This is the older chip, so it doesn’t work with Apple’s new smarts that will automatically transfer your H1-equipped Apple headphones to the device you are currently listening on (this seems to be a little half-baked anyway), but it does mean that once you’ve paired it with one Apple device, it is connected to your Apple ID and therefore can be instantly paired to all of your other devices. While I was able to pair the JLab to my MacBook, iPad, and iPhone, moving them back and forth was more of a hassle; I’d first have to disconnect them from the device they were currently connected to in order to connect them to another. Honestly, that isn’t so bad, but I’m admittedly spoiled by how simply the Beats and my AirPods switch between my devices. Additionally, I’ve had a few instances where there’s a second of lag using the JLab headphones when watching videos. So far I’ve only noticed this when using the Hulu web app in Safari, and it is intermittent and can be resolved by pairing the headphones again.
This could be a problem with either Hulu or the new M1 MacBook as well; I’ve had other issues with Hulu recently (it likes to skip to halfway through the next episode of M*A*S*H while I’m in the middle of an episode) and I’ve heard reports of Bluetooth issues with the M1 MacBook Air (in fact, both the JLab and Beats have experienced disconnection issues when in use with the MacBook). Long story long, I’m not gonna hold this against the JLab headphones. Winner: This is a tougher choice than I thought it would be. I mean, the Beats are some of the best headphones on the market, right? But I can’t knock how good the JLab headphones sound at a fraction of a fraction of the price. If my biggest gripe was swapping the headphones between my devices, it would still be cheaper to buy a pair of the JLab headphones for each device I use rather than to buy one pair of the Beats. With all that in mind, the only thing that I think the Beats truly win out with is ANC, because it does reduce the volume level that I need to drown out the TV while I’m working. I’m going to give this one to the Beats Studio 3, but just barely, and really only if ANC is going to make a major difference to you. If not, save your money, buy the JLab headphones, and don’t ever look back. Bonus Features Ok, so some of this will be a rehash of things we’ve touched on before, but I want to be thorough, and I feel like some of these features deserve more of a call out than what I’ve given them so far. Granted, this category is going to be heavily Beats-focused, as they have more bells and whistles. In fact, the biggest whistle is the W1 chip- that is, if you use Apple products. I’ve already talked about ANC, which the JLab do not have, and I’ve talked about how the W1 chip makes switching between devices a breeze, and while it is a step behind the H1 chip in newer headphones from Apple, it is still two steps above anything else (again, only if you use Apple products). It provides a more stable connection with the devices, too, allowing me to travel just about anywhere in my house without having to worry about walls interfering with my signal. That said, the JLab Studio Pro come with Bluetooth 5, which also allows for greater distance between headphones and device, and this works with all devices, not just ones with the Apple logo. I was able to get the same distance from my iPad with the JLab headphones as I was with the Beats. I can’t for the life of me find anything that says whether the Beats Studio 3 have Bluetooth 4 or 5, but for the price, I would hope they have 5. But the special Apple bonuses don’t stop there. I love that the Beats Studio 3 will report to Apple’s Health app regarding decibel levels, making them another point of contact to get a bigger picture of your overall health with Apple hardware and software. Third-party headphones are supposed to be able to report decibel levels as well, and in the past, I’ve had other brands provide this data, but I’ve yet to get the JLab Studio Pro to do it. Luckily, other features regarding hearing health are built into the iPhone itself and not the Beats- such as capping the decibel level or alerting you if the volume has been too high for too long- so the JLab headphones should be able to take advantage of this as well. I’ve mentioned that the Beats come with a wired adapter for use with a headphone jack (if you got one), and that’s a nifty feature.
The included wire has in-line controls, though I’m not entirely sure why; when you plug in the headphones, the controls on the side of the Beats are disabled, but this seems like an unnecessary complication- I figure Apple could just as easily have omitted the in-line controls and left the controls on the headphones powered on. I mean, the headphones- including the ANC option- stay on while you are using the wire, so it seems odd that you would have separate controls for wired operation versus how you normally use them when they are wireless. It would make sense if the headphones were powered off while they were plugged in, but as I’ve mentioned before (and I’ll mention again in a minute), the headphones for some reason need battery life in order to use the wire. Every other pair of Bluetooth headphones I’ve used that have a wired option can use the wire without any power whatsoever in the headphones themselves. Granted, ANC definitely won’t work without the power, but I figured the Beats should at least be able to play sound while out of juice. And with Apple quickly killing the headphone jack on most of their products (only MacBooks and the basic iPad and iPad Mini still have them), I find this inclusion more baffling than anything else. And I don’t once expect to actually use the wire. I suppose maybe, maybe, if you were planning to use the Beats with a stereo system or an old computer that doesn’t have Bluetooth, it might make sense… but in that case, there are better- and cheaper- wired headphones to get for those specific use cases. This just feels half-assed, which I think is odd for Apple. Ok, so like I said, this one is very heavily focused on the Beats. That’s because outside of being a quality pair of Bluetooth headphones, the JLab Studio Pro headphones simply don’t have a lot of extra things to do. And for $40, that’s fine. Costing only $40 is actually a pretty nice perk all on its own. The only other bonus feature I can really talk about with the JLab headphones is the three different sound profiles that come built-in. I mentioned that the Bass Boost profile gives you the closest sound to the bass-heavy Beats, but if that’s too much bass for your comfort, the JLab Signature profile is a clear second best. I’d avoid “Balanced” if at all possible, as it seems to flatten everything to “balance” it out. The Balanced profile is the only one in which I think these headphones actually sound like they are only forty bucks. Winner: So the Beats Studio 3 clearly have more bells and whistles, and so by default, they’ll win this category. But I really think you should consider whether those extra features are something you’ll use. If you don’t need ANC, if you don’t need (or use) the Apple Health features, and especially if you don’t have Apple devices to take advantage of the W1 chip, then you really shouldn’t consider the Beats. The JLab Studio Pro will suit you just fine. Charging and Battery Life We’re in the home stretch. We’ve talked about what the headphones look like. We’ve talked about what they sound like. All that’s left is how long you’ll be able to listen to them. Apple rates the Beats Studio 3 for up to 22 hours with ANC or up to 40 without it. That’s quite respectable, and with ANC I primarily got about two days of mixed usage before I had to charge them again. Charging is relatively quick, too. 2 hours will get you a full battery from dead, but just 10 minutes can get you up to 3 hours of use in a pinch.
As previously mentioned, however, that charging comes via a Micro USB cable that comes included in the box. Grrrrrrrrrrr. I suppose it was forgivable in 2017 that these headphones used Micro USB, but now that nearly everything but the cheapest Android phones uses USB-C (or Lightning, in Apple’s case), I loathe that Apple hasn’t at least given the Studio 3 Beats the minor upgrade of a USB-C or Lightning port (for reference, the newer Solo Pro charge with Lightning). I’m not asking for a complete redesign like the Solo Pro- although being 3 years old, the Studio line is probably due a big refresh soon- but they could have at least brought the charging mechanism into the modern age. Enough griping. Apple at least supplies you with a generously long Micro USB cable (though no wall adapter), which is more than we can say for JLab (getting there). As mentioned earlier, the Beats do also come with a cable for using the headphones with a headphone jack (if you even have a device that has a headphone jack), but this cable will not work without battery life in the Beats, so don’t think you’ll be able to use these when they are completely dead, unlike nearly every other pair of headphones that comes with a headphone jack cable. Ok… now enough griping. So, onto the JLab Studio Pro. JLab rates the battery life at a massive 50 hours. Simply put, I’ve not had to worry about charging them at all since they arrived. JLab says that it will take longer to charge, however; 3 hours to get from zero to full, and 10 minutes will get you only an hour of use. In the box, JLab supplies the dinkiest little USB-C cable (and again, no wall adapter). It looks well-made, but it is super short. This doesn’t really matter if you have literally anything else that uses a USB-C charger, though, and I suspect JLab knows this. Whenever I do eventually need to charge these headphones, I’ll be using my MacBook Air charger to do it. And I absolutely love that convenience (ahem… Apple). One thing I’ve noticed regarding battery life is stand-by time. I had this problem with the BeatsX in-ear headphones, and it seems to be prominent with the Beats Studio 3 as well: stand-by time sucks. I’m getting around 20 hours of life from them whether I’m using them or not, and I’m having to charge them up daily. Throughout my time with both pairs of headphones, I’ve used the JLab Studio Pro more often and I’ve only charged them once; they’ve been sitting idle- but powered on- for the last two days and they still have around 70% battery life (this percentage is according to Apple’s battery widget; the headphones themselves will only tell you if the battery is “full”, “medium”, or “low”). Likewise, I charged the Beats to 100% last night, used them for maybe an hour at most, and then left them sitting on my desk powered on. Not only did they power themselves off at some point (which is nice, if it conserved any battery life), but about 12 hours later they have already dropped to 52% (in the hour or so I spent finishing this story since writing that sentence, it dropped to 40% with zero use at all). I imagine that this is due to the ANC; whenever I pick up the Beats, even when I don’t have anything playing on them, I can tell the ANC is on, and likewise, this has to be draining power. Personally, I am one who forgets to turn off my headphones all the time, so that really is a bit of a deal-breaker for me that the Beats don’t seem to have any sort of low-power mode to power off ANC and keep them from draining too much when not in use.
Again, this probably can be blamed on the fact that these headphones haven’t been updated in a few years; the newer Beats Solo Pro do feature both a low-power mode when they aren't being used and an automatic power-off when you fold the headphones up, so I imagine a forthcoming Beats Studio 4 or Studio Pro or whatever comes next for Apple’s over-the-ear headphones will have some of these newer bells and whistles. Winner: With longer battery life and USB-C charging, JLab’s Pro headphones clearly win this one. The Beats will charge faster when you do charge them, but you’re going to be charging the JLabs far less often. And the JLab Studio Pro are definitely going to live longer if you forget to turn them off. $40 Vs. $349 So, did the Beats Studio 3 make an argument for their higher price tag? Honestly, I don’t think so. Definitely not in 2020, and definitely not if you are paying full price for them; even at the $175 sale price I got them at, it is a hard sell with the $40 JLab headphones being almost as good- if not better- in every category. The biggest benefit to them is the ANC, but for that price (either the sale one or the regular price), you could get the Beats Solo Pro headphones that come with Apple’s newer H1 chip and have some good battery improvements, like charging with Lightning and both a low-power mode and easy power off for forgetful people like me who don’t turn off their headphones. The only reason those weren’t the ones I reviewed here is that they hurt my ears after a while, and unlike the JLab Studio Pro, they were on-ear headphones. Over the duration of this review, even though ANC proved useful while I was working, I found myself reaching for the JLab Studio Pro way more often, as they were just more comfortable, sounded very nearly as good, and frankly I knew that they’d have the juice to get me through work no matter what. I think the Beats Studio 3 are excellent headphones. But with $40 headphones being this good, they are not worth the price, no matter how good the sale is.
https://medium.com/the-shadow/battle-royale-with-cheese-headphones-edition-beats-studio-3-vs-jlab-studio-pro-6a157f07d566
['Joshua Beck']
2020-12-13 18:23:33.176000+00:00
['Technology', 'Gadgets', 'Headphones', 'Tech', 'Apple']
223
Emanate ICO Review
Emanate ICO review
We are so thrilled to share the Emanate ICO review. The entire music industry has been witnessing a wave of new developments, as over the past two decades modern and advanced technologies have provided innovative digital means of accessing, streaming, and promoting music. However, for artists and music creators it hasn’t been easy, as they face several pertinent challenges including restricted access to the markets, cross-border differences in royalty collection rights, lack of trust between independent musicians and majors, unfair profit distribution, piracy, and more. The Emanate Project aims at creating an alternative music industry ecosystem capitalizing on blockchain technology to make it easy for all artists and musicians to promote their work and get rewarded. What makes Emanate even more interesting is that they are not merely addressing one particular problem but are creating a whole new ecosystem powered by native MN8 tokens, the effective currency of the Emanate ecosystem. While the Emanate ICO registration process has already begun, the project has caught our attention, and so here’s our detailed Emanate ICO review making a thorough analysis of this intriguing project.
Emanate Is Creating A Decentralized Autonomous Economy For The Music Industry
The core idea that drives the Emanate project is bringing the artists and the music creators to the forefront and ensuring that they get fair and quicker compensation for their creativity. The present-day dominating models within the music industry have failed to ensure transparency and have led to diminishing profits and delayed and disproportional payments for artists. Also, with several processes and intermediaries involved, there is a looming threat of unsustainability. With blockchain technology, the Emanate project will eliminate such processes and help create a fully decentralized autonomous economy for the music industry, which will not only ensure more transparency and improved quality but will also ensure fair and immediate discharge of remunerations for artists. The Emanate white paper explains: “The concepts behind Emanate were born from the realization of the power of blockchain technology to enable effective collaboration between artists, and what cryptocurrency can do for micropayments between listeners and artists. We started work designing a set of collaboration and monetization tools that would enable a hotbed of collaboration and rapid monetization directly with listeners.”
A One-Stop Audio Exchange Platform
The Emanate platform will be developed as a one-stop audio exchange platform that will allow artists to share and promote their music while allowing them to retain all rights to their publications, and will facilitate monetization of the music shared on the platform. It will also help with effective collaboration and promotion. To listeners, it would provide high-quality and the latest audio tracks. With the tokenization of the platform, users will also get an exclusive opportunity to earn MN8 tokens on the platform by participating in a range of activities and contributing to the network. In a nutshell, the Emanate platform will be that one place where a piece of audio can be placed for use on any platform, with payment for every play. Responding to what the Emanate platform will be like, Emanate CEO Sean Gardner says: “The revolutionary Emanate platform will be like SoundCloud but with micropayment monetization similar to Steemit.
You can think of it as Splice but more open and with the power of crypto/blockchain enabling microtransactions and a song-creation data layer.”
Emanate Has Strong Use Cases
The Emanate ecosystem has a strong use case for all listeners and other stakeholders within the music industry, like radio stations, record labels, and music creators. For music producers working with several artists, the Emanate platform will make it easy to manage their portfolio by allowing seamless storage of files, while also benefiting from the royalties and earnings through a smart contract, with all transactions duly recorded on the blockchain. For radio stations, it will help them find new artists, as the Emanate platform will host the latest and original tracks from artists all across the globe. Individual artists or music companies can create their profile on the Emanate platform and share and promote their music to a larger audience. Also, they can publish their work to Spotify, iTunes, SoundCloud and other in-venue broadcasters. Most importantly, the artists get paid their royalties in real time without having to wait for months.
The Emanate (MN8) Tokens
The ecosystem will be supported by the MN8 tokens developed on top of the EOS blockchain. The tokens will have a range of functions and uses across the platform: artists will need MN8 tokens to publish music for monetization, listeners may require MN8 tokens to consume audio, and network node holders will need to stake MN8 tokens and may also receive tokens as payment in return for node operation services. Moreover, the tokens can also be used as payment for a multi-tier subscription for all users.
The Emanate ICO Team
The team behind Emanate is impressive and particularly experienced, and the key members have either categorically handled successful projects or have contributed effectively towards one. Emanate CEO Sean Gardner has spent the last 12 years creating digital brand communications and technical platforms for top global brands and agencies. He has launched Augmented Reality projects as early as 2010 and produced Australia’s first branded Virtual Reality experience in 2014. Their CFO, Trent Shaw, has over 14 years’ experience commercializing digital platforms in the music and entertainment space, including years with Sound Alliance, Moshtix, and eBay. Thomas Olsen, contributing as Head of Music at Emanate, is a Grammy Award-nominated musician, DJ, and producer well known for his ‘Tommy Trash’ project, which has taken him to the biggest stages in the world over the last 10 years. Thomas has worked with the likes of Tiesto, Ingrosso, and Digitalism, as well as remixing Zedd, Deadmau5, Swedish House Mafia, David Guetta and Empire of the Sun.
The Emanate ICO Details
The Emanate ICO, starting in Q3 of 2018, will be launched in three phases: a strategic private sale, a pre-sale, and a token crowdsale event. The private sale is for select institutional investors and music industry business partners only. The registration for the pre-sale is ongoing, and users have to complete an e-KYC registration and get whitelisted to participate in the MN8 token sale. The total number of tokens to be created is fixed at 208,000,000. A total of 40 million tokens will be made available during the pre-sale stage, and the MN8 tokens can be purchased at $0.10 USD per token. The ICO main sale, or the crowdsale, will have a total of 48 million MN8 tokens available for sale, and the token price will be $0.12 USD per token, 20% higher than the pre-sale stage. The hard cap is fixed at $12 million USD and a soft cap is fixed at $5 million USD.
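As a quick back-of-the-envelope check on those published numbers (a sketch in Python; the strategic private sale's terms are not public, so it is left out of the sum):

```python
presale_tokens, presale_cents = 40_000_000, 10      # $0.10 per MN8 token
crowdsale_tokens, crowdsale_cents = 48_000_000, 12  # $0.12 per MN8 token

# Main-sale premium over the pre-sale price:
print((crowdsale_cents - presale_cents) / presale_cents * 100)  # 20.0 (%)

presale_raise = presale_tokens * presale_cents // 100        # $4,000,000
crowdsale_raise = crowdsale_tokens * crowdsale_cents // 100  # $5,760,000
print(f"${presale_raise + crowdsale_raise:,}")               # $9,760,000
```

So the pre-sale and crowdsale together account for roughly $9.76 million of the $12 million hard cap, with the remainder presumably expected from the strategic private sale.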
The Emanate platform has a unique Emanate sustainability fund, and all the unsold tokens will be allocated to this fund.
Advantages
The Clear Roadmap — The Emanate project has a very effective, well-planned and articulate roadmap, including a detailed whitepaper making very realistic claims in terms of project development, which clearly reflects the efforts the team has put into making it a successful venture.
Strong Use Cases — The problems facing the present-day music industry are well established, and with effective use of blockchain technology, a lot can be solved. By creating an alternative token economy along with a robust platform allowing for sharing and promoting music, Emanate can be a powerful mechanism disrupting the existing models and thus creating a space for themselves in the multi-billion-dollar industry. The MN8 tokens have a defined use on the platform and have all the potential to generate value for users as the project develops its beta platform.
Impressive Team — The people behind Emanate are professionals with an impressive track record of handling successful projects in the past. The collaborative and cumulative efforts of such experts can very positively lead to another successful project, and their clarity in their disposition so far is a very strong indicator of the same.
Concerns
Stiff Competition — With evolving technologies, there have been several attempts to create an ecosystem that addresses the same core problems as those Emanate aims to solve. While blockchain technology gives Emanate a certain edge over others, it will still be competing against many other projects with similar ideas.
Extensive Marketing — The project will have to focus on an extensive and targeted marketing strategy. On a positive note, to address this challenge, the Emanate team has already started getting into partnerships.
The Verdict
While there have been organized attempts to address the problems facing artists and the music industry as a whole, Emanate is pioneering the idea of creating an alternative economic model catering to the music industry using the advanced EOS blockchain technology. The project has reported steady development, and the detailed proposed solutions seem convincing and very much realistic, both in terms of addressing the issues and scalability. The defined revenue streams coupled with very strong use cases make Emanate a project that is bound to catch your attention, and more so if you’re a musician, a music producer, or a music lover. Visit the Emanate website: https://emanate.live/
https://medium.com/eliteclub-io/emanate-ico-review-2cbc387a43b5
['Max Neuhaus']
2018-10-15 10:50:11.445000+00:00
['ICO', 'Cryptocurrency', 'Blockchain Technology', 'Ico Review', 'Blockchain']
224
Blockchain and Its Applications for Service Designers
Blockchain and Its Applications for Service Designers
Blockchain might be one of the most notable technological innovations of the last decade. It has recently gained immense popularity in innovator communities, as it can solve some fundamental challenges recurring in many industries. Among many things, blockchain can inject transparency into operations, address data privacy and security challenges and solve the issue of trust in transaction handling. Considering this, it is particularly important for product, service and system designers to understand the technology and be aware of the opportunities that it provides. In this article, I will try to explain the essence of blockchain technology and bring practical examples of its applications across industries. Let’s go! 🤗
What is Blockchain?
Blockchain technology might seem complicated, but its core concept is rather simple. Blockchain is a type of database. To understand the technology, it‘s important to first understand what databases actually are. A database is a collection of information that is stored electronically on a computer system. In databases, data are typically structured in a table format to allow for easy search and filtering. Blockchain differs from a typical database in the way it stores information; blockchains store data in blocks that are then chained together. Blocks have certain storage capacities and, when filled, are chained onto the previously filled block, forming a chain of data known as the “blockchain.” All new information that follows that freshly added block is compiled into a newly formed block that will then also be added to the chain once filled. The data are chained together in chronological order.
Blockchains are inherently distributed databases, meaning that many parties hold copies of the ledger (a.k.a. peer-to-peer networks). In a blockchain, each node has a full record of the data that has been stored on the blockchain since its inception (this is done with timestamps and hash-based proof of work). If one node has an error in its data, it can use the thousands of other nodes as a reference point to correct itself. This way, no one node within the network can alter information held within it. This system helps to establish an exact and transparent order of events. It can be used in a decentralized way so that no single person or group has control over it — instead, all users retain control collectively.
Immutability and transparency are not the only characteristics making blockchain technologies trustworthy, secure and innovative from a technological perspective. Blockchain technologies use cryptographic hash functions and asymmetric encryption, which uses a mathematically related pair of keys for encryption and decryption: a public key and a private key. If the public key is used for encryption (read — send), then the related private key is used for decryption (read — receive/access). If you are scared at this point, note that as a designer you do not need to understand or implement the technical aspects of the blockchain; however, having an understanding of the basic concepts and the way they are used is essential for developing functional and innovative solutions based on the technology.
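To make the chaining idea concrete, here is a minimal sketch in Python (standard library only) of blocks linked by hashes. The Block class and its field names are illustrative rather than any production blockchain's format, and real networks add a consensus mechanism such as proof of work on top of this structure; the point here is only that editing an earlier block is immediately detectable.

```python
import hashlib
import json
import time

class Block:
    """One link in the chain: its hash covers the previous block's hash,
    so every block locks in all of the history before it."""

    def __init__(self, index, data, prev_hash):
        self.index = index
        self.timestamp = time.time()
        self.data = data              # e.g. a batch of transactions
        self.prev_hash = prev_hash    # fingerprint of the previous block
        self.hash = self.compute_hash()

    def compute_hash(self):
        body = json.dumps(
            {"index": self.index, "timestamp": self.timestamp,
             "data": self.data, "prev_hash": self.prev_hash},
            sort_keys=True,
        )
        return hashlib.sha256(body.encode()).hexdigest()

def is_valid(chain):
    """Re-derive every hash; any edit to an earlier block breaks the links."""
    for i, block in enumerate(chain):
        if block.hash != block.compute_hash():
            return False
        if i > 0 and block.prev_hash != chain[i - 1].hash:
            return False
    return True

chain = [Block(0, "genesis", prev_hash="0" * 64)]
chain.append(Block(1, {"from": "alice", "to": "bob", "amount": 5}, chain[-1].hash))
chain.append(Block(2, {"from": "bob", "to": "carol", "amount": 2}, chain[-1].hash))

print(is_valid(chain))         # True
chain[1].data["amount"] = 500  # one node tries to rewrite history...
print(is_valid(chain))         # False: the tampering is detected
```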
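The public/private key pair mentioned above can be sketched just as briefly. In practice, blockchains mostly use the pair for digital signatures: a transaction is signed with the sender's private key, and anyone holding the matching public key can verify it. A small demo, assuming the widely used third-party cryptography package is installed (pip install cryptography):

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

private_key = Ed25519PrivateKey.generate()  # kept secret by the sender
public_key = private_key.public_key()       # shared with the whole network

message = b"transfer 5 tokens from alice to bob"
signature = private_key.sign(message)

# Anyone can check that the message really came from the key holder
public_key.verify(signature, message)       # passes silently when valid
print("signature valid")

try:
    public_key.verify(signature, b"transfer 500 tokens from alice to bob")
except InvalidSignature:
    print("tampered message rejected")
```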
What value does blockchain offer and how can it be applied?
Blockchain’s core advantages are cryptographic security, decentralization, transparency, and immutability. The technology allows information to be verified and value to be exchanged without having to rely on a third-party authority. Blockchain does not need to be a disintermediator to generate value. Benefits from reductions in transaction complexity and cost, as well as improvements in transparency and fraud controls, can be captured by existing institutions and multiparty transactions using appropriate blockchain architecture. There is no singular form of blockchain — the technology can be configured in multiple ways to meet the objectives and business requirements of a particular use case.
So far, cryptocurrencies remain the most well-studied and common blockchain applications. In fact, Bitcoin, the most traded cryptocurrency, created by Satoshi Nakamoto in 2009, was one of the first and probably the most well-known applications of blockchain technology. However, the real value of blockchain technology goes beyond cryptocurrencies and the financial sector in general. McKinsey has categorized blockchain uses into 5 main categories:
Static registry: a distributed database for storing reference data.
Identity: a distributed database with identity-related data (it’s actually a particular case of static registry with an extensive set of applications).
Smart contracts: a set of conditions recorded on a blockchain triggering automated, self-executing actions when these predefined conditions are met. 🙌🏻 Smart contracts are my favorite category! I highly recommend watching this TED talk about smart contracts and how to make the world a more beautiful, functional and just place. (A toy sketch of the idea follows this list.)
Dynamic registry: a dynamic distributed database that updates as assets are exchanged on the platform.
Payment infrastructure: a dynamic distributed database that updates as cash or cryptocurrency payments are made among participants.
There are also other edge cases that do not fit into these categories, or are combinations of them (e.g. Blockchain as a Service, ICOs).
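Since smart contracts are singled out above, here is a toy sketch in Python of the underlying pattern: conditions are recorded up front, and the contract executes itself automatically once all of them are met. Real smart contracts run on-chain (for example, written in Solidity on Ethereum); this stand-alone EscrowContract class and its method names are purely illustrative.

```python
class EscrowContract:
    """Toy escrow: value is released to the seller only once every
    predefined condition has been observed; no third party decides."""

    def __init__(self, buyer, seller, amount):
        self.buyer = buyer
        self.seller = seller
        self.amount = amount
        self.conditions = {"payment_received": False, "goods_delivered": False}
        self.settled = False

    def observe(self, condition):
        # In a real deployment an oracle or an on-chain transaction
        # would flip these flags; here we set them by hand.
        self.conditions[condition] = True
        self._try_execute()

    def _try_execute(self):
        # Self-executing: the payout rule itself is the agreement.
        if all(self.conditions.values()) and not self.settled:
            self.settled = True
            print(f"Released {self.amount} to {self.seller}")

contract = EscrowContract("alice", "bob", 100)
contract.observe("payment_received")  # nothing happens yet
contract.observe("goods_delivered")   # -> Released 100 to bob
```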
How is blockchain used by different public and private organizations?
Each of the categories mentioned by McKinsey can be broken down into specific cases based on industries or functions.
Supply Chain Management
Blockchain’s immutable ledger makes it well suited to tasks such as real-time tracking of goods as they move and change hands throughout the supply chain. Using a blockchain opens up several options for companies transporting these goods. Entries on a blockchain can be used to queue up events with a supply chain — allocating goods newly arrived at a port to different shipping containers, for example. Blockchain provides a new and dynamic means of organizing tracking data and putting it to use.
Food industry
For example, IBM and Walmart have partnered in China to create a blockchain project that will monitor food safety. Similarly, Louis Dreyfus Co is experimenting with a soybean import operation using blockchain.
Shipping industry
Another example is the case of logistics giant Maersk, which has experimented with a blockchain-based project in the maritime logistics industry.
Mining industry
In the mining industry, The De Beers Group is using blockchain to track the importation and sales of diamonds.
Inventory management
In inventory management, Russian rail operator Novotrans is storing inventory data on a blockchain pertaining to repair requests and rolling stock.
Healthcare
As connected medical devices become common and increasingly linked to a person’s health record, blockchain can connect those devices with that record. Devices will be able to store the data generated on a healthcare blockchain and append it to personal medical records. MedRec is one of the pioneering projects experimenting with storing data on the blockchain.
Government
Blockchain has a plethora of applications in the public sector, in particular for transparent government operations.
Voting
Blockchain technology has the ability to make the voting process more accessible while improving the security. Hackers are no match for blockchain technology, because even if someone accessed the terminal, they wouldn’t be able to affect other nodes. Each vote would be attributed to one ID. This is particularly helpful for ensuring transparent and just elections. For example, in Switzerland, voter registration is being facilitated via a blockchain project spearheaded by Uport.
Taxation
Blockchain tech makes the cumbersome and human error-prone process of tax filing much more efficient with enough information stored on the blockchain. For example, in China, a tax-based initiative is using blockchain to store tax records and electronic invoices, led by Miaocai Network.
National Security
The immutable nature of blockchain, and the fact that every computer on the network is continually verifying the information stored on it, makes blockchain an excellent tool for storing big data. For example, for the past two years, the US Department of Homeland Security has been using blockchain to record and safely store the data captured from its security cameras.
Insurance
In the insurance industry, blockchain-based smart contracts allow customers and insurers to manage claims in a transparent and secure manner. All contracts and claims can be recorded on the blockchain and validated by the network, which eliminates invalid claims, since the blockchain by default rejects multiple claims on the same accident. For example, American International Group Inc. uses a smart contract-based blockchain as a means of saving costs and increasing transparency.
Energy
Blockchain technology could be used to execute energy supply transactions, but also to further provide the basis for metering, billing, and clearing processes. Other potential applications include documenting ownership, asset management, origin guarantees, emission allowances, and renewable energy certificates. One example is the case of Chile. As Chile’s National Energy Commission seeks to update its electrical infrastructure, it has started to use blockchain technology as a way of certifying data pertaining to the country’s energy usage. Another example is The Energy Web Decentralized Operating System (EW-DOS). It is an open-source stack of decentralized software and standards — including the Energy Web Chain, middleware services, and software development toolkits (SDKs). EW-DOS is a shared technology running on a decentralized network maintained by many respected energy companies. EW-DOS supports two primary use cases: 1) clean energy and carbon emissions traceability and 2) using distributed energy resources to increase grid flexibility.
EW-DOS leverages self-sovereign decentralized identifiers, a series of decentralized registries, messaging services, and integrations with legacy information technology (IT) systems to facilitate transactions between billions of assets, customers, grid operators, service providers, and retailers. Real Estate The average homeowner sells his or her home every 5–7 years. With such frequent movement, blockchain is particularly useful in the real estate market. It can make home sales more efficient by quickly verifying finances, reducing fraud thanks to its encryption, and offering transparency throughout the entire selling and purchasing process. For example, Propy, in Kiev, uses blockchain to complete real estate deals. Payments Blockchain offers a way to efficiently and securely create a tamper-proof log of sensitive activity. This makes it particularly relevant to international payments and money transfers. For example, Banco Santander's money transfer service "Santander One Pay FX" uses Ripple's xCurrent to enable customers to make same-day or next-day international money transfers. The blockchain ledger that Ripple uses has been adopted by a group of Japanese banks, who will be using it for quick mobile payments. Environment Protection of endangered species In the environmental sector, the protection of endangered species can be facilitated via a blockchain project that records the activities of these rare animals. One example is the Newton Project. Newton uses NewSensor technology, consisting of small IoT devices that monitor location, temperature, air quality, humidity, etc., and upload that data to NewChain, Newton's blockchain. By inserting a NewSensor under the skin of an endangered mammal, for example a rhino or elephant, it is possible to track the location and basic behavior of that animal. Fishing One application in fishing is the provision of a transparent record of where fish were caught to ensure legal landing. For example, the WWF project uses a combination of radio-frequency identification (RFID) tags, quick response (QR) code tags and scanning devices to collect information about the journey of a tuna at various points along the supply chain. Carbon Offset The monitoring of carbon offset trading is another environmental application of blockchain technologies. For example, IBM is using the Hyperledger Fabric blockchain in China to monitor carbon offset trading. Waste Management Waltonchain is using RFID technology to store waste management data on the blockchain in China. Record Management The encryption that is central to blockchain makes it quite useful for record management because it prevents duplicates, fraudulent entries, and the like. Land Registry Record management can be particularly useful in land registry. For example, in Georgia, in a project developed by the National Agency of Public Registry, land registry titles are now being stored on the blockchain. Border control In the Netherlands, Essentia has developed a blockchain-based border control system that allows customs agents to record and safely store passenger data from an array of inputs. Blockchain as a Service Enterprises also benefit from blockchain technology by developing models offering blockchain-as-a-service solutions. For example, Ethereum's blockchain can be accessed as a cloud-based service courtesy of Microsoft Azure. 
Similarly, Google is building its own blockchain, which will be integrated into its cloud-based services, enabling businesses to store data on it and to request their own white-label version developed by Alphabet Inc. A concluding note for service designers In this article, I tried to explain the basics of blockchain technology and give some examples of its use cases in different service sectors. One of our functions as service designers is to devise new business models, processes and infrastructures to help stakeholders exchange value and achieve their goals in more mutually beneficial ways. On the other hand, we are responsible for anticipating the threats that blockchain-based products, services and systems might create. Considering this, it is crucial for us to understand the nuts and bolts of the technology and be aware of its use cases. I hope I managed to motivate you to learn more about one of the most promising technologies of the last decade. P.S. Note that the use cases that I covered in the article are non-exhaustive. There are many more blockchain applications that are worth scrutinizing. There are also a lot of threats associated with possible applications of the technology that must be understood. As a starting point, I highly recommend watching this documentary about the future of cities driven by blockchain and this talk about the future utopian and dystopian realities that blockchain might create.🤗
https://uxplanet.org/blockchain-and-its-applications-for-service-designers-7b6bfd9fec39
['Nare K.']
2021-06-25 12:36:56.956000+00:00
['Service Design', 'Innovation', 'Emerging Technology', 'Blockchain', 'Product Design']
225
Impact of Blockchain Technology in Health Care
Photo by Kendal on Unsplash Blockchain technology has the potential to transform health care, placing the patient at the center of the health care ecosystem and increasing the security, privacy, and interoperability of health data. This technology could provide a new model for health information exchanges (HIE) by making electronic medical records more efficient, disintermediated, and secure. While it is not a panacea, this new, rapidly evolving field provides fertile ground for experimentation, investment, and proof-of-concept testing. The promise of blockchain has widespread implications for stakeholders in the health care ecosystem. Capitalizing on this technology has the potential to connect fragmented systems to generate insights and to better assess the value of care. In the long term, a nationwide blockchain network for electronic medical records may improve efficiencies and support better health outcomes for patients. Bruce Broussard, president and CEO of Humana, posits that blockchain will become the next big healthcare technology innovation, particularly as it relates to payments and payer contracts. For example, in a situation where a health plan and patient are dealing with a contract, the blockchain can automatically verify and authorize information and the contractual processes. "There is no more back-and-forth haggling with the health plan about what was paid, why it was paid or whether it should have been paid," he wrote. "With transparency and automation, greater efficiencies will lead to lower administration costs, faster claims and less money wasted." Another potential healthcare application is population health. Instead of relying on health information exchanges or other ways to aggregate data, organizations can eliminate the middleman and access patient databases on a large, population scale. "Spending time and resources verifying members' trustworthiness (e.g., HIE, all-payer claims database, local EMRs) no longer makes savvy business sense. Blockchain will leapfrog population health by providing trust where none exists for continuous access to patient records by directly linking information to clinical and financial outcomes," reports CIO. With its ability to deflate the current spending bubble, protect patient data and improve the overall healthcare experience, blockchain may help ease the pain. The technology is already being used to do everything from securely encrypting patient data to managing the outbreak of harmful diseases. And at least one country is big on the potential of blockchain healthcare: Estonia. The size of Tennessee with the population of Maine, Estonia began using blockchain technology in 2012 to secure healthcare data and process transactions. Now all of the country's healthcare billing is handled on a blockchain, 95% of health information is ledger-based and 99% of all prescription information is digital. SECURING PATIENT DATA Keeping our important medical data safe and secure is the most popular blockchain healthcare application at the moment, which isn't surprising. Security is a major issue in the healthcare industry. Between 2009 and 2017, more than 176 million patient records were exposed in data breaches. The perpetrators stole credit card and banking information, as well as health and genomic testing records. Blockchain's ability to keep an incorruptible, decentralized and transparent log of all patient data makes it a technology ripe for security applications. 
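To make the "incorruptible log" idea concrete, here is a minimal sketch of hash chaining, the mechanism that makes tampering detectable. It uses only Python's standard library and illustrates the general principle, not any vendor's actual implementation; all record fields are made up:

```python
import hashlib
import json
from datetime import datetime, timezone

def make_record(prev_hash: str, payload: dict) -> dict:
    """Append-only record whose hash covers the previous record's hash."""
    body = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "prev_hash": prev_hash,
        "payload": payload,
    }
    body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return body

def verify_chain(chain: list[dict]) -> bool:
    """Any tampered record breaks the hash link to every later record."""
    for i, rec in enumerate(chain):
        body = {k: v for k, v in rec.items() if k != "hash"}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if rec["hash"] != expected:
            return False
        if i > 0 and rec["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = [make_record("0" * 64, {"patient": "p-001", "event": "admitted"})]
chain.append(make_record(chain[-1]["hash"], {"patient": "p-001", "event": "lab result"}))
print(verify_chain(chain))           # True
chain[0]["payload"]["event"] = "x"   # tamper with history...
print(verify_chain(chain))           # False: the tampering is detectable
```

The design point is that integrity comes from the hash links themselves, not from trusting any single database administrator.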
Additionally, while blockchain is transparent it is also private, concealing the identity of any individual with complex and secure codes that can protect the sensitivity of medical data. The decentralized nature of the technology also allows patients, doctors and healthcare providers to share the same information quickly and safely. Check out how these companies are applying blockchain to healthcare security: BURSTIQ: BurstIQ's platform helps healthcare companies safely and securely manage massive amounts of patient data. Its blockchain technology enables the safekeeping, sale, sharing or licensing of data while maintaining strict compliance with HIPAA rules. Blockchain application: The company uses blockchain to improve the way medical data is shared and used. Real-life impact: Because BurstIQ's platform includes complete and up-to-date information about patients' health and healthcare activity, it could help to root out abuse of opioids or other prescription drugs. FACTOM: Factom creates products that help the healthcare industry securely store digital records on the company's blockchain platform that's accessible only by hospitals and healthcare administrators. Physical papers can be equipped with special Factom security chips that hold information about a patient, stored as private data that is accessible only by authorized people. Blockchain application: Factom employs blockchain technology to securely store digital health records. Real-life impact: In June of 2018, Factom got a grant of nearly $200,000 from the U.S. Department of Homeland Security to beta-test a platform aimed at integrating secure data from Border Patrol cameras and sensors in order to better understand the impacts of blockchain in "a realistic field environment." BLOCKCHAIN MEDICAL RECORDS CAN STREAMLINE CARE AND PREVENT COSTLY MISTAKES Miscommunication between medical professionals costs the healthcare industry a staggering $11 billion a year. The time-consuming process of obtaining access to a patient's medical records exhausts staff resources and delays patient care. Blockchain-based medical records offer a cure for these ills. The decentralized nature of the technology creates one ecosystem of patient data that can be quickly and efficiently referenced by doctors, hospitals, pharmacists and anyone else involved in treatment. In this way, the blockchain can lead to faster diagnoses and personalized care plans. These companies are embracing the concept of blockchain medical records to create shared databases and personalized health plans. SIMPLYVITAL HEALTH SimplyVital Health is making its decentralized technology available to the healthcare industry. Its Nexus Health platform is an open-source database that allows healthcare providers to access pertinent information on a patient's blockchain. Open access to important medical information helps healthcare professionals coordinate medical efforts more quickly than traditional methods. Blockchain application: SimplyVital uses blockchain to create an open-source database so healthcare providers can access patient information and coordinate care. Real-life impact: SimplyVital recently partnered with genomics and precision medicine company Shivom to form a Global Healthcare Blockchain Alliance that employs blockchain security to protect DNA sequencing data. 
CORAL HEALTH RESEARCH & DISCOVERY Coral Health uses blockchain to accelerate the care process, automate administrative processes and improve health outcomes. By inserting patient information into distributed ledger technology, the company connects doctors, scientists, lab technicians and public health authorities quicker than ever. Coral Health also implements smart contracts between patients and healthcare professionals to ensure data and treatments are accurate. Blockchain application: Coral's blockchain technology accelerates care, automates administrative processes and employs smart contracts between patients and doctors. Real-life impact: According to Coral's chief strategy officer Jeremy Mullin, the company is looking into the possibility of using a blockchain and the Smart on FHIR protocol "to let patients track their own health files." MEDICAL SUPPLY CHAIN MANAGEMENT AND DRUG TRACEABILITY/SAFETY How much do we really know about our medicine? Can we be sure it hasn't been tampered with? Is it coming from a legitimate supplier? These questions are the primary concerns of the medical supply chain, or the link between the lab and the marketplace. Blockchain has serious implications for pharmaceutical supply chain management, and its decentralization virtually guarantees full transparency in the shipping process. Once a ledger for a drug is created, it will mark the point of origin (i.e., a laboratory). The ledger will then continue to record data every step of the way, including who handled it and where it has been, until it reaches the consumer. The process can even monitor labor costs and waste emissions. Here are the companies using blockchain to rethink the medical supply chain. CHRONICLED Chronicled builds blockchain networks that demonstrate chain-of-custody. The networks help pharma companies make sure their medicines arrive efficiently, and they enable law enforcement to review any suspicious activity — like drug trafficking. In 2017, Chronicled created the MediLedger Project, a ledger system dedicated to the safety, privacy and efficiency of medical supply chains. Blockchain application: Chronicled's blockchain network is used to ensure the safe arrival and detailed review of drug shipments. Real-life impact: According to the company, results from Chronicled's recent MediLedger Project prove that its blockchain-based system "is capable of acting as the interoperable system for the pharmaceutical supply chain" and "can meet the data privacy requirements of the pharmaceutical industry itself." BLOCKPHARMA Blockpharma offers a solution to drug traceability and counterfeiting. By scanning the supply chain and verifying all points of shipment, the company's app lets patients know if they are taking falsified medicines. With the help of a blockchain-based SCM system, Blockpharma weeds out the 15% of all medicines in the world that are fake. Blockchain application: Through its app, the company's blockchain-based system can help prevent patients from taking counterfeit medicines. BREAKTHROUGHS IN GENOMICS The potential of genomics to improve the future of human health, once a dream, is now a scientific and financial reality. In 2001, it cost $1 billion to process a human genome. Today it costs about $1,000, and companies like 23andMe and Ancestry.com are bringing DNA tests that unlock clues to our health and past to millions of homes. Blockchain is a perfect fit for this growing industry as it can safely house billions of genetic data points. 
It’s even become a marketplace where people can sell their encrypted genetic information to create a wider database, giving scientists access to valuable data faster than ever before. These three companies are using blockchain to further our understanding of the most basic building blocks of human life. NEBULA GENOMICS Nebula Genomics is using distributed ledger technology to eliminate unnecessary spending and middlemen in the genetic studying process. Pharmaceutical and biotech companies spend billions of dollars each year acquiring genetic data from third parties. Nebula Genomics is helping to build a giant genetic database by eliminating expensive middlemen and incentivizing users to safely sell their encrypted genetic data. Blockchain application: The company uses blockchain to streamline the study of genetics and lower costs. ENCRYPGEN The EncrypGen Gene-Chain is a blockchain-backed platform that facilitates the searching, sharing, storage, buying and selling of genetic information. The company protects its users’ privacy by allowing only other members to purchase the genetic information using safe, traceable DNA tokens. Member companies can use the genetic information to build upon their genetic knowledge and advance the industry. Blockchain application: The company’s blockchain platform makes it easier to search for, share, store and buy genetic information. Real-life impact: EncrypGen plans to expand its user profile to include self-reported medical and behavioral data. According to company co-founder and CEO Dr. David Koepsell, it’s also working on integrating a blockchain payment and auditing platform as well as forming partnerships with testing companies, analytics software developers and others.
https://medium.com/@jeet.mehta/impact-of-blockchain-technology-in-health-care-c7cbcf3503a8
['Jeet Mehta']
2020-07-03 09:12:54.627000+00:00
['Healthcare Technology', 'Smart Contracts', 'Contracts', 'Healthcare', 'Blockchain']
226
Artificial Intelligence –The Tech Superhero Of This Era
Different business verticals have found a sudden interest in AI, and why not! In this tech era, AI is steadily unfolding high-performance abilities and features that could save lives. Over the years, AI has enabled automation in various sectors including business, banking, agriculture, marketing, and others. Practically, AI has become a driving force for all industries across the globe. The recent trend in AI is somewhat different from how it has worked so far. Researchers have found new pathways where AI has emerged as a lifesaver for humans. The AI and robotics you see enslaving humans in movies are far beyond reality for now! Today, AI applications are designed to make human lives safer. Here are some examples of AI applications that have proved to be lifesavers for humans. 1. Autonomous vehicles to impede accidents According to a published report, around 1.35 million people around the world lose their lives in roadway accidents every year. Although we have hi-tech advanced vehicles, human lives still cannot be saved from roadway crashes. The most vulnerable are bicyclists, pedestrians, and motorcyclists. It is too late to rely on sensitizing people to follow road rules or to minimize the use of vehicles. Keeping this in mind, the automotive industry is all set to introduce AI-automated vehicles. This could prove to be a revolution for the mechanical industry. These vehicles would have an in-built application enabling computer vision, which could detect and prevent accidents on the road. One such application is !iMPORTANT, which is designed to minimize the risk of collisions with different vehicles like cars, trucks, buses, construction equipment, and autonomous vehicles. 2. Health applications to detect medical conditions The world is currently suffering from a pandemic that is taking millions of lives around the world. Human-to-human contact is highly unsafe these days. The risk is greater for frontline medical staff and doctors. While delivering treatment, there is always a risk of contracting the disease despite wearing PPE gear. Medical institutions are desperately in search of AI-automated systems to deliver good healthcare facilities and reduce the need for human hands. Catalyst.ai and healthcare.ai, designed by Health Catalyst, are some of the lifesaving applications developed using AI. These applications have been developed on machine learning technology that can specifically identify patients at high risk of readmission and provide clinical guidelines to address problems. 3. Collated data for drug production Subsequent to the detection of health issues comes the treatment procedure. Proper medical attention and medicines are vital for curing any disease. Doctors need to be choosy while prescribing any drug to a patient, as there are intervening side effects that can lead to other health risks. To automate the process, Okwin is designing AI-powered pharmaceutical solutions using machine learning algorithms. The models have been designed for advanced treatment, prediction of disease evolution, and improving the way drugs are produced. Okwin receives data from hospital partners to find ways of helping patients by improving drugs more quickly with fewer side effects. With so many risks to life today, AI is emerging as a superhero in the technical world. If you want to bring innovation to your company, consider launching AI-powered software that can save human lives. This is the best way to expand your business today. 
You can outsource to the best mobile app development company in India for better results.
https://medium.com/@magicmindtechnologies/artificial-intelligence-the-tech-superhero-of-this-era-a6d3aa91ac96
['Magicmind Technologies']
2020-09-02 14:39:50.062000+00:00
['Artificial Intelligence', 'Science', 'Technology', 'Technology News', 'AI']
227
Disruption: Friend or Foe?
Why disruption can happen to you and what you can do about it. Soup In a Can In 1869 an American greengrocer had an idea to start condensing his vegetables and manufacturing soup in a can. That greengrocer's name was Joseph Campbell, and the company he founded was the Campbell Soup Company. Today, Campbell's is a household name sold around the world, and amazingly, 150 years later they are still selling soup in a can. The 501's In 1873, two Jewish immigrants came together to patent the use of copper rivets for reinforcing points of strain in denim pants. These reinforced denim pants became the famous Levi 501 Jeans. After selling their first pair at the end of the 19th century, Levi Strauss is still selling the 501's and they are still just as popular as ever. Tech Companies Are Hard Traditionally, businesses have been able to manufacture a product and sell that same product year after year. Tech companies are very different. When a tech company builds a product, they know that in 5 years they won't be selling that same product, and even within 2–3 years that product is likely to have changed. With the pace of technology, tech companies know at the start that they will need to innovate on their current product, or else it will become obsolete. The key difference between traditional companies and modern tech companies is that where the output of a soup company is soup, and the output of a denim company is denim, the output of a tech company is innovation. The New World Renowned entrepreneur and VC Marc Andreessen famously said: "Software is Eating the World" The implication of this is that all companies are fast becoming tech companies. If so, then every company needs to start focusing on innovation or otherwise risk becoming irrelevant and obsolete. But how do companies that have been so successful for so long focusing on traditional outputs suddenly shift their focus? Innovation is a Verb Part of the answer lies in how traditional companies view innovation. Too often innovation is a noun, a magical product created by a special type of genius that suddenly appears out of nowhere. The truth is actually the opposite. Innovation is a verb. It is a repeatable process of structured experimentation, based around a fixed set of principles that guides the way we discover and solve problems. It can be done by anyone and should be a continuous cycle focused on learning as much as you can as fast as possible. Seeds Don't Grow in the Dark In the same way that a seed can't grow into a tree if it's not given light, water and the right soil, an idea can't grow into a viable business if it's not incubated and supported by the right ecosystem. Traditional companies have been built on structures and processes to manage operating at scale: hierarchical, siloed and risk-averse. These same structures will kill innovation. They are surefire ways to prevent disruptive ideas from being able to develop and to ensure that the entrepreneurs in the company make a quick exit. As the pace of change gets quicker and the future becomes more and more uncertain, traditional companies need to start making sure that they have embedded innovation well and truly into their organisation. It will be the difference between benefitting from the opportunity that disruption can offer, or having disruption happen to them when it's already too late. About the author: Josh is an experienced product and innovation professional having worked with clients across the Finance, Health, Energy, and Retail sectors. 
By helping clients follow an evidence-based approach focused on the end customer, he is able to embed innovation across their business and uncover new opportunities to disrupt their industry. He is a passionate believer in the power of technology and investment to enable large-scale social impact and economic empowerment.
https://joshwermut.medium.com/output-as-an-innovation-dd0fad33d1fb
['Josh Wermut']
2018-06-19 03:47:24.715000+00:00
['Digital Transformation', 'Innovation', 'Technology']
228
Six Ways that Microsoft Enterprise Tools Can Help Managers Streamline Remote Work
Successfully transitioning your team to a remote-first environment is not for the faint of heart. Not only do you need to make sure your employees are logistically able to complete their daily tasks from any location, but at a basic level, you also need to make sure team members are able to collaborate, your data is secure, and employees remain productive with the capacity to innovate on a daily basis. In order to create an efficient and effective remote work environment within your organization, you first need to find the right tools. With the high demand for digital solutions in the current COVID environment, there are several companies vying for your attention when you’re looking for the right tools to transition your team to remote work. At Camber, we work across multiple products, and the Microsoft suite is frequently requested for transitioning to a distributed work environment. Just recently, we utilized Microsoft Enterprise Services through our work with AdventHealth to redistribute a large healthcare workforce to handle COVID-19 surges within a short period of time. According to one of our internal Microsoft Enterprise Solutions experts, one of the advantages of using the MS ecosystem is that it takes approximately 70% less time to build products in comparison to conventional programming. This accelerated time to deployment can be critical to the solution’s success, particularly in the current environment. Our work with AdventHealth is just one example of how Microsoft Enterprise Solutions give clients the tools that they need to foster organizational efficiencies in a remote-first environment. Here are a few ways Microsoft Enterprise tools can help you manage a successful remote workforce. Foster Collaboration Microsoft Teams is an essential hub for remote teamwork, satisfying core needs of your team. This tool provides a comprehensive platform where teams can chat, meet virtually, collaborate on content, and create and integrate apps tied to core workflows. Microsoft Teams is a great tool to help organizations stay connected in unique and creative ways. In comparison to competitive video chat platforms, the Together Mode available on Microsoft Teams provides the unique capacity to place participants in a shared setting, such as a conference room, classroom, or coffee shop, where participants on the call appear to be in virtual seats. This diversity of digital experiences may help meeting participants to avoid fatigue and more closely simulates an in-person meeting experience. There are several other unique features within Microsoft Teams, such as the ability to capture meeting recaps following the virtual meeting, the capacity to split participants into smaller groups through the use of breakout rooms, and custom layouts which can make presentations look and feel more natural to the audience. Microsoft Teams also offers the Microsoft Teams Power Platform which allows organizational leaders to create apps catered towards the organization’s needs through the use of low-code tools. In addition to providing the capacity for employees to build their own apps, Microsoft Teams also allows companies to integrate a wide range of third-party apps to streamline business processes and facilitate collaboration within the Teams platform. Improve Employee Efficiency with Data With the shift to remote work, you can’t rely on workplace observation and proximity to ensure productivity in the work environment. 
Rather, you need to carefully craft remote work policies to make sure employees are getting their work done without invading their privacy. The tricky part is that if work from home policies aren’t managed properly, they can go horrifyingly wrong. Microsoft Workplace Analytics and Productivity Score are two analytics tools offered through Microsoft Enterprise that can help team leaders to monitor employee behavior in the remote work environment in a far less creepy way. These tools can provide managers with insights into how well team members are working together and how effective meetings are. With employees being geographically dispersed, it is critical for managers to have clear insights into how employees are collaborating since it is the behavior and work ethic of these employees that determines the shape of the business moving forward. While there are a wide range of strategically beneficial reports within Workplace Analytics, a few include the capacity to conduct an analysis of top performers and several reports to assess how employees are spending their time. These reporting capabilities provide companies with the insights that they need to improve how employees work, how they spend their time, and how to adjust internal operations accordingly. Through analyzing which individuals perform best internally and pairing this with an analysis of their respective behaviors, managers can figure out the relationships between workplace behaviors and results. This gives them the information that they need to alter internal processes accordingly to build stronger teams. Conducting analyses on how employees spend their time can give managers insights into the nuances of internal operations to make sure that employees are spending their time on tasks that will directly impact business outcomes. Productivity Score, which is integrated into Microsoft Endpoint Manager, is another helpful analytics tool that provides insights into performance at the intersection of the employee experience and the technology experience. The tool can provide suggested improvements to let leaders know which systems need to be updated to give employees the technology tools they need in order to do high quality work. Move Your Team onto the Cloud Moving your team’s internal processes onto the cloud is optimal, as it makes it easier for teams to collaborate. This increased potential to work together more easily can increase organizational efficiency and productivity. Shifting to the cloud is particularly appealing in a remote environment, as it provides the opportunity for employees to work together across several different devices from any location. For many companies, moving to the cloud also gives leaders the flexibility to store and manage data effectively. Cloud solutions can provide your organization with the flexibility to scale up or scale back as necessary to meet your organization’s changing needs. Microsoft Sharepoint is a helpful tool to streamline your organization’s workflow through providing the capacity to share internal files and resources. Sharepoint can help your team to quickly communicate and share key learnings, inform key internal stakeholders, and simplify complex organizational workflows. Deploy Custom Apps and Internal Products Teams don’t operate within a static vacuum. They operate within a rapidly changing business environment. 
In order to proactively reassess business strategies and priorities and adapt internal processes in response to changing needs, it is critical to have the essential technology solutions at your fingertips to quickly deploy custom apps and internal products. Microsoft Azure is a great platform that enables teams to build solutions and deploy applications from any location. Using Microsoft Azure can help your development team to ship innovations more quickly while protecting against threats to your sensitive data. When utilizing this solution, your team will be equipped with the resources necessary to migrate to Azure. This proven cloud adoption framework will also allow you to leverage enterprise-scale analytics to generate real-time insights. Development teams frequently leverage Azure Devops for planning, to manage backlogs and repositories, for release management, and for hosting apps, Azure functions, and databases. The MS Power Platform, composed of Power Apps, Power Automate, Power BI, and Power Virtual Agents, is another key platform that can help you to accelerate the time it takes to build custom applications. Power Apps is a low-code app building tool that can help you create custom solutions through utilizing prebuilt templates and a drag and drop functionality, empowering non-technical employees in building apps, automated solutions, and reports. Leveraging low-code platforms like Microsoft Power Apps can help you to improve organizational inefficiencies, foster rapid innovation, and quickly capitalize on new market opportunities that arise. One major benefit of leveraging low-code platforms is that it can give non-technical team members the capacity to build sophisticated technology solutions, freeing up the time of your tech team to resolve more challenging issues. Not only does Microsoft Power Apps give you the tools you need to quickly build new apps, but it also allows you to update and alter them as needed. When used in conjunction with Power Automate, Power BI, and Power Virtual Agents, you can automate several business processes. Power Automate allows employees to automate routine business tasks so that they can allocate their time to more advanced strategic priorities. Power BI is a useful tool to obtain insights from your data and visualize key trends. Power Virtual Agents is a platform that allows you to quickly and easily develop chatbots to streamline and improve the scalability of core business processes. Simplify IT Practices Through leveraging Microsoft Endpoint Manager, IT professionals are provided with the tools they need to make remote work secure and logistically possible for employees across the organization. This suite of tools can help your business to remain resilient in a challenging and rapidly changing business environment. Microsoft Endpoint Manager is particularly critical in a remote work environment, as it allows employees to set up their company devices remotely within a few hours rather than having to go into a physical office space to get their technology solutions set up. Additionally, employees are increasingly working off of their own technological devices to do work, such as their mobile phones. This suite of Microsoft tools allows employers to secure work files on employees’ mobile and desktop devices so that there is a lower likelihood of data leaks. Strategically leveraging these tools can increase employers’ confidence that sensitive information and internal data is safe and secure. 
Additionally, through utilizing the Microsoft 365 monitoring suite of tools with device health and remediation capabilities provided, your internal IT team will be able to focus more on delivering value to the organization and less on troubleshooting issues tied to existing applications. In the current business environment, it is likely that your internal IT team members are stretched incredibly thin, often forced to make the decision between resolving new issues that arise and progressing new initiatives. Through leveraging the right Microsoft enterprise tools, you can achieve both simultaneously. Integrating these tools into your workflow can empower non-IT team members to troubleshoot more independently which will provide your IT team with the capacity to focus on more technically intense and higher priority work. Maintain Security and Compliance One of the biggest challenges that companies face when transitioning to a cloud-based environment is to prevent data leaks and to ensure that all information remains secure. The great thing about Microsoft enterprise solutions is that the suite of services nicely complement one another to simultaneously provide your organization with the flexibility that you need to innovate and the security solutions that you need to do so without fear that private information will be compromised. Microsoft Defender provides a unified security solution that spans both Microsoft 365 and Microsoft Azure. This tool helps to detect, prevent, and respond to any imminent threats that arise across existing applications and cloud platforms. Microsoft Defender provides real-time protection through utilizing file and process behavior monitoring tools and cloud-delivered protection. The antivirus software is consistently updated to make sure your security solutions are up to date. In addition to maintaining tight security through Microsoft Defender, Microsoft’s Compliance Manager can help your organization to ensure they are meeting all essential regulations at the industry and regional levels. With the compliance environment changing on a daily basis, it can be difficult to keep up with new requirements. Through simplifying the compliance and risk reduction process, this tool helps you to do just that. Within the Compliance Manager, you will have access to a dashboard that showcases your compliance score and provides recommendations as to what steps you need to take to meet the components of your compliance solution that require your attention. Key Takeaways With the rapid transition to remote work, it is important that leaders put the right tools and strategies in place to set teams up for success when working remotely. While organizational needs differ, at a minimum, most companies need to provide employees with the capacity to share files, organize notes, communicate in real time, and digitally build a company culture. Given that Microsoft Enterprise Solutions offers a range of useful tools for different aspects of remote work, it is important to take a step back and try to understand what solutions you really need to improve the quality of the remote work environment that you’ve created. Depending on your organizational goals and internal processes, certain tools may be more relevant and critical than others so it is important to try to integrate the tools that will be a strong fit for your organizational needs. If you are looking for ways to better leverage your Microsoft enterprise tools, we are eager to work with you and to provide a free consultation. 
Our Camber Creative Microsoft experts are here to help with every phase of your project from problem definition to product development and ongoing managed services. We can assist with custom app development, cloud migrations, expedited deployments and assessments of your current configuration. We look forward to collaborating to help you get the highest possible return on your technology investments!
https://blog.cmbr.co/six-ways-that-microsoft-enterprise-tools-can-help-managers-streamline-remote-work-2f4d15d12dbd
['Jenna Rodrigues']
2020-12-14 22:06:56.433000+00:00
['Enterprise Technology', 'Microsoft', 'Microsoft Teams', 'Microsoft Azure', 'Remote Work']
229
Algorithmic Architecture for BMS
Battery Algorithms Shivaram N V, Hari Vasudevan A battery management system or BMS is core to the functionality of an EV. While much has been documented, written and talked about regarding the mechanical, electrical and software architecture of a BMS, not much has been written about the algorithmic architecture of the BMS. Of course, a relevant question that could be asked at this point is: is the time spent in creating an algorithmic platform worth it? Here at Ather, developing and deploying such an architecture has helped us immensely in creating a robust and reusable platform. In this post we are going to detail the basic algorithmic structure of the Ather BMS and how it has helped accommodate the many different use cases of the 450X. Algorithm Fragmentation A cursory search of BMS algorithms often yields a number of various algorithms dealing with charging, protection and discharge. Take charging, for instance: even the very basic CC-CV charging flow that is common to most chargers has a number of caveats related to temperature, voltage and charging currents. What makes this trickier is that a variety of such chargers are encountered by the vehicle. For example, fast chargers have different charging current capacities, temperature limits and protections when compared with a slow home charger. In addition to this, the EV will most likely encounter multiple generations of these chargers, each with its own rules. In fact, since Ather continually upgrades the public charging infrastructure to improve the access and reliability of the fast charging network, it causes much (happy) consternation internally, as the algorithms now need to track these changes too. Similarly, discharge of a battery pack also needs to be handled by the BMS algorithms. Discharge control of the battery pack needs to ensure that we are able to maintain the right cell temperatures, avoid drops in individual cell voltages and gracefully transition to a lower power mode at low States of Charge (SOC). The BMS also needs to ensure consistency in performance even as the battery pack ages through natural degradation in cell capacity. Last but not least, the BMS also needs to compute the right SOC for display to the user. Optimality & Robustness The most commonly known charging algorithms, like CC-CV, belong to a class of heuristic algorithms that get the job done but not necessarily in the most optimal way. In the case of charging, we're chasing minimum charging time while adhering to battery safety constraints. Similarly, take the case of SOC: there are quite a few heuristic methodologies to estimate the SOC to a "decent" extent. But to maximise the usable energy consumption from a battery, it is necessary to have an algorithm that is not only accurate but also robust to the whole range of noise factors that a Battery Management System is exposed to. Accommodating all of the above functionality with multiple different algorithms, each paired up with the corresponding hardware, vehicle platform or scenario, will eventually lead to fragmentation of the code. Finally, the maintainability of the code-base will also take a hit as verification and validation start to consume increased time and effort. To get around these issues on the 450X, we have developed a scalable BMS algorithm architecture (Figure 1 below) that attempts to provide a framework wherein all the above-described optimality criteria and variations can be accommodated. 
Figure 1: Battery Charge & Discharge Vehicle Application Architecture SOC & The Kalman Filter Literature on the SOC estimation of a cell or battery pack is plentiful. In fact, a cursory glance through the literature on this leads to various formulations of the Kalman Filter for the estimation of SOC. Literature is peppered with Kalman Filter formulations for SOC such as the Extended Kalman Filter (EKF) or the Sigma Point Kalman Filter (SPKF), and further varieties of these, each providing added benefits for added complexity. However, literature on SOC estimation is often an end in itself, and further development of the Kalman filter towards an LQG (Linear Quadratic Gaussian) controller for discharge/charge regulation is overlooked. Here at Ather, we have developed a framework that puts the state estimate obtained from the Kalman filter structure to further use, creating a flexible and powerful framework that accommodates most charging and discharge functions of the BMS. Cell Dynamics & State-Space Formulation A number of cell models are generally used for formulation of the Kalman Filter. In Figure 2 below we present a simplified lumped-parameter model that is in widespread use. A description of these parameters is given in [1]. It is important to understand that all the RC parameters in this model are functions of SOC. Estimation of the RC parameters is a topic in itself and will be the contents of a subsequent blog post. Additionally, we would like to point out that our model is a second-order model, whereas higher model orders may be used for higher accuracies with the trade-off of complexity. Given the above model and parameters, the equations that define this system are as shown below. The voltages V₁ and V₂ are described by the dynamic equations 4 and 5, and the SOC dynamics by 6. The OC voltage Voc is linearized (in equation 2) and re-written to cast it in a form suitable for LTI state-space analysis. This assumption is valid as long as the SOC does not change very fast, or in other words the rate of change in SOC is an order of magnitude different from the V₁ and V₂ dynamics, which thankfully is the case. Figure 2: RC parameter Model Using these equations we are now able to formulate a state-space model for an individual cell. This formulation is described in the following equations, with the state equation 7 and output equation 8 formulated as shown. State Estimators In the prior section we built a model for cell dynamics; we now explain the basis for formulating a state estimator. Of course, the way we have described the cell dynamics model is by no means the only way to describe the dynamics. Ours is a current-input, voltage-output system, and there exist equally valid models with voltage in and current out. Furthermore, we have not included the Voc hysteresis in this model, which for certain cell chemistries can be dominant. All state estimators function via a 'plant' model that mimics the dynamics of the system under observation and a 'correction' term which is usually a measurable output of the system. Eventually the plant model and the correction term are fused together to arrive at a state estimate that is more accurate than relying on either the model or the sensor alone. The beauty of the Kalman filter is, of course, that it produces the least-squares optimal estimate of the state even under varying noise conditions. A basic state estimator and its formulation is described below in equations 9 and 10. 
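For reference, the standard second-order equivalent-circuit forms that the text describes are reproduced below, with z denoting SOC, Q the cell capacity in Ah, and u = I. These are reconstructed from the common battery-modeling literature, and the equation numbering is an assumed mapping to the references above, not a quotation of the original post:

```latex
% Reconstructed standard forms; symbols and numbering are assumptions.
\dot{V_1} = -\frac{V_1}{R_1 C_1} + \frac{I}{C_1}                      \quad (4)
\dot{V_2} = -\frac{V_2}{R_2 C_2} + \frac{I}{C_2}                      \quad (5)
\dot{z}   = -\frac{I}{3600\,Q}                                         \quad (6)
V_{oc}(z) \approx V_{oc,0} + k\,z                                      \quad (2)
\dot{x} = A\,x + B\,I, \qquad x = [\,V_1 \;\; V_2 \;\; z\,]^{T}        \quad (7)
V_t = C\,x + D\,I = V_{oc,0} + k\,z - V_1 - V_2 - R_0\,I               \quad (8)
\dot{\hat{x}} = A\,\hat{x} + B\,u + L\,(y - \hat{y})                   \quad (9)
\hat{y} = C\,\hat{x} + D\,u                                            \quad (10)
```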
In state estimator design, the main question of course is how we design the correction gain 'L'. Whether we are designing a Luenberger observer or a Kalman Filter, there is plenty of literature on how best to pick 'L'. We would like to refer the reader of this blog to the excellent webpage of Dr. Gregory L. Plett, Professor, University of Colorado. A thorough treatment of the Kalman Filter equations is given in his course notes on Battery Management and Control. Charge & Discharge Controllers At this point, if our intent were only to describe a better way to estimate the State of Charge (SOC), we would be done. However, while the state estimator is a great way of arriving at the true SOC of the cell, it can also be put to other uses, namely in the charge and discharge control of the battery pack. Charging Controllers The most widely used charging controller for EVs is the CC-CV controller, meaning that the charger charges at a constant current until it hits a terminal cell voltage and then transitions to a CV charging mode, which applies a constant voltage until the cell voltage rises to equalize this voltage and the charging current drops to zero. This transition from CC to CV on chargers is what causes most of the grief in creating a scalable architecture.
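To make the estimator structure concrete, here is a compact, self-contained sketch: a generic discrete-time Kalman filter over the second-order model above. It is written purely for illustration, not Ather's production code, and every parameter value is a placeholder:

```python
import numpy as np

# Minimal discrete-time Kalman filter for SOC over the second-order RC model.
# All parameter values below are illustrative placeholders, not real cell data.
dt, q_ah = 1.0, 10.0                         # timestep [s], capacity [Ah]
r0, r1, c1, r2, c2 = 0.01, 0.015, 2e3, 0.02, 2e4
k_ocv, ocv0 = 0.8, 3.2                       # crude linearized OCV: Voc ~ ocv0 + k_ocv*soc

a1, a2 = np.exp(-dt / (r1 * c1)), np.exp(-dt / (r2 * c2))
A = np.diag([a1, a2, 1.0])                   # state x = [V1, V2, soc]^T
B = np.array([[r1 * (1 - a1)], [r2 * (1 - a2)], [-dt / (q_ah * 3600)]])
C = np.array([[-1.0, -1.0, k_ocv]])          # y = ocv0 + k_ocv*soc - V1 - V2 - r0*I

Qn, Rn = 1e-7 * np.eye(3), 1e-4              # process / measurement noise covariances

def kf_step(x, P, i_meas, v_meas):
    """One predict-correct cycle; K plays the role of the gain 'L' above."""
    x = A @ x + B * i_meas                   # predict with the plant model
    P = A @ P @ A.T + Qn
    y_hat = (C @ x).item() + ocv0 - r0 * i_meas
    K = P @ C.T / (C @ P @ C.T + Rn).item()  # Kalman gain
    x = x + K * (v_meas - y_hat)             # correct with the voltage measurement
    P = (np.eye(3) - K @ C) @ P
    return x, P

x, P = np.array([[0.0], [0.0], [0.5]]), 1e-3 * np.eye(3)   # initial guess: 50% SOC
for _ in range(600):                         # pretend: steady 5 A discharge, 3.55 V measured
    x, P = kf_step(x, P, i_meas=5.0, v_meas=3.55)
print(f"estimated SOC: {x[2, 0]:.3f}")
```

In an LQG arrangement, the same estimate would then feed a regulator that shapes the charge or discharge current, which is the direction the post points towards.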
https://blog.atherenergy.com/algorithmic-architecture-for-bms-d1fd9f5eb1b
['Easha Ranade']
2021-07-14 09:59:07.912000+00:00
['Technology', 'Algorithms', 'Intelligence', 'Battery', 'Charging']
230
RandomCrypto’s new BTC mining modeler visualizes the unprofitability of major ASICs
Lowering Mining’s Barriers of Entry Even as the mining ecosystem has professionalized with the rise of industrial-scale hardware deployments, crypto tools and services have barely changed — they remain clunky and simplistic, often grossly overpromising returns, as with the GMO B2. “Calculating long-term mining ROI isn’t a straightforward extrapolation of a constant daily reward, which is unfortunately how almost all current calculators work. Mining difficulty continuously varies based on network hashrate, which is emergent from complex economics and network dynamics,” says CTO Ethan Bian. Independently verifying manufacturer claims is also challenging, since existing calculators are often sponsored by manufacturers and hosting services themselves, who may have little incentive to make more truthful projections. As networks have matured and the amount of resources poured into mining has continued to increase, this is increasingly becoming a problem, as Bian explains: “Mining has become a high-stakes game involving billions of dollars (USD) of value, and the more difficulty rises, the harder it is for smaller-scale miners to compete; not a good sign for a healthy, decentralized system. The misleading data from most calculators has perpetuated this environment — while large-scale operators can rely on internal resources to model ROI accurately, this is an expensive and slow process which disincentives small-scale miners from participating. Our modeling tool is a crucial step in the direction of information parity, allowing small-scale miners to make decisions the same way bigger players do.” RandomCrypto’s Commitment to Cryptocurrency Network Health Cryptocurrencies rely on distributed verification to maintain trust, and unequal access to resources can exacerbate unbalanced network ownership. RandomCrypto believes the development of professional, public, and verifiable tools is essential for keeping mining democratic and maintaining network health. Calc is the first of many tools RandomCrypto plans to launch to bring clarity and trust to cryptocurrency. It can be used for free today at calc.randomcrypto.org.
https://medium.com/randomcrypto/randomcryptos-new-btc-mining-modeler-visualizes-the-unprofitability-of-major-asics-22302070823a
[]
2018-08-03 07:23:13.984000+00:00
['Technology', 'Modeling', 'Mining', 'Cryptocurrency', 'Bitcoin']
231
Mythbusting 5G
“All lies and jest, still the man hears what he wants to hear and disregards the rest.” — Paul Simon The wildest 5G network conspiracies circulate on the internet. I will not take your valuable time by addressing them. Most news articles and expert reviews focus on a specific area of the 5G society. The idea is to persuade you of their vision of the 5G world. If you take time to do a few Google searches, you will know that science isn't conclusive on the potential risks of 5G. To give you an overview of the wide variety of ideas on what 5G implies for humanity, I have grouped several expert views from different areas under three seemingly ungrounded statements that circulate on the internet. This blog post has to be read as a teaser for self-inquiry into the 5G topic. I am well aware that it's difficult to say anything meaningful if I include multiple perspectives. But in a globalized world, someone has to put the puzzle together. In between the myth-busting, I will challenge you to reflect on your position. Because the implementation of the 5G network will transform the way we think as a species about what it means to be human in a digital world. Eventually, you will need to position yourself in this ongoing debate. So I recommend you — educate yourself. Myth #1: 5G radiation is dangerous for the body Founding Father Bill P. Curry, a consultant and physicist, published in 2000 this graph to show that tissue damage increases with the rising frequency of radio waves. Sadly, he failed to account for the shielding effect of human skin. According to science journalist and senior writer at the New York Times, William J. Broad, fearmongering websites picked up Curry's research and assumed a correlation between cancer and cellphones. Besides all the conspiracies that sprang up, even some lawsuits used this graph to make unsubstantiated claims. Bouncing Off To elaborate on why 5G isn't so dangerous for the body, we have to make a stop at physics to explain frequency. If you jump on a trampoline, you get bounced back up. Your mass is denser than the trampoline mat. Else you would fall on the ground. Now imagine that your skin is the trampoline's mat, and the sunlight is the jumper. When the light of the sun touches your arm's skin, it bounces back into your eyes. If the light were absorbed, you couldn't see your body parts. Hence no sunlight can pass through your skin to reach the bones. Take a good look at the next image. According to David Robert Grimes, a cancer researcher and physicist, we should not fall prey to 5G scaremongering. To put 5G in perspective compared to ionizing radiation, which is detrimental to our health, capable of damaging DNA and killing cells: the weakest visible light is more than 17,000 times more energetic than the highest-energy 5G photon possible (see the short calculation below). In other words, 5G rays are not ionizing, so rationalists would argue that 5G does not pose any risks. But this technology is still relatively new. We have no idea about the long-term physical effects of exposing skin to frequencies of the 4G and 5G networks. Mental Aftermath Whereas most activists focus on potential health and environmental risks, I think people should focus on the information overload caused by the internet. With the faster upload speeds of 5G and the ongoing globalization process, I assume people will be bombarded with even more information every day. Nowadays, I find it increasingly difficult to distinguish between opinion, fact, and lies by omission. Does humanity drown in the abundance of knowledge? 
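To check the order of magnitude of that 17,000x figure, here is a quick back-of-envelope calculation. The photon energy formula E = h·f is standard physics; the two frequencies plugged in (roughly 25 GHz for a high-band 5G carrier and roughly 430 THz for the weakest, red, visible light) are my assumptions, not values from the article:

```python
# Back-of-envelope check of the ~17,000x figure quoted above.
PLANCK_H = 6.626e-34          # Planck constant [J*s]
f_5g = 25e9                   # assumed high-band 5G carrier frequency [Hz]
f_red = 430e12                # assumed weakest (red) visible light frequency [Hz]

ratio = (PLANCK_H * f_red) / (PLANCK_H * f_5g)   # h cancels; kept for clarity
print(f"A red-light photon carries ~{ratio:,.0f}x the energy of a 5G photon")
# -> ~17,200x, consistent with the figure quoted in the text
```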
Balance I believe technology isn't bad or good. I speculate that without the 4G network, we wouldn't have had the Arab Spring, Me Too, and Black Lives Matter movements. Also, digital nomadism, online freelancing, dropshipping, growth hacking, and digital marketing wouldn't have evolved. On average, the world would be worse off. Of course, there are pitfalls of this digitalization process, like the always-available mentality, FOMO, cyberbullying, burnouts, fake news, and remote hacking. Light doesn't exist without its shade and vice versa. Likewise, the 5G network will have fantastic potential and dark undertones. Conclusion The myth that 5G radiation is dangerous for health is partially plausible. According to the common sense of science, the potential physical risk of 5G is often exaggerated. However, we lack research on the long-term effects of low-frequency radiation exposure to the skin. I don't have a clear image of how a 5G civilization operates. Thus I can only hypothesize that humanity can't cope with an information overload. Photo by Morning Brew on Unsplash Myth #2: 5G makes 4G obsolete. Network features The efficiency of a network depends on speed, latency, and coverage. Speed is how fast information can be downloaded from a server to your cellphone. On average, the download speed is 10 to 50 Mbps (megabits per second) on a 4G network. Netflix recommends 25 Mbps for Ultra HD, but it needs only 5 Mbps for HD streaming. 5G aims to hit 50 Mbps as an average minimum. If you use your phone network to cast a Netflix or Amazon Prime TV show, you know that 4G is already excellent at streaming HD. The internet of things The internet of things is a network where every object communicates with every other for optimal efficiency. In our current 4G world, the latency, or response time, of a network is about 50 milliseconds. 5G has the potential to drop that latency to 1 ms. For context, the brain processes images in at least 10 ms. To realize the world of the internet of things, which includes self-driving traffic, cloud gaming, and immersive virtual reality, low latency is vital. Coverage The price of such a low-latency, high-frequency network is that its range is much shorter than the 4G network's. In layman's terms, every hundred meters you need a new antenna. The signal doesn't travel far. According to RCR Wireless News, South Korean operator KT, provider of the network for the 2018 Winter Olympic Games in PyeongChang, needed 46 antennas for the 4G infrastructure and 212 for the 5G network. Disadvantage Because the frequency of the 5G network is higher compared to 4G, it cannot penetrate solid objects easily, like cars, trees, and walls. One valid argument against the 5G network, posed by British MP Geraint Davies: it is ridiculous to clear away trees for better efficiency of the 5G network when you have ambitions for zero carbon emissions to slow down climate change. Integration Because both networks have their strengths, the 5G network functions as an extension of the already existing 4G infrastructure. Large 4G antennas are connected to smaller 5G antennas to get the best of both worlds. Conclusion The myth that 5G makes 4G obsolete is busted. Both the 4G and 5G networks will continue to exist as they complement each other. Myth #3: 5G makes internet surveillance easier Before we dive into politics, there are three different aspects of internet security: privacy laws, encryption of data, and vulnerability to hacking. Privacy Regulation Most privacy legislation is normative. 
Myth #3 — 5G makes internet surveillance easier Before we dive into politics, note that there are three different aspects of internet security: privacy laws, encryption of data, and vulnerability to hacking. Privacy Regulation Most privacy legislation is normative. The basic idea is protection against corporate interests abusing the insights gained from the analysis of your data. But some laws give governments the ability to spy on other countries. I find it hypocritical when you agree as a society that spying on your own civilians is unethical, yet, under the motto of national self-defense, a government may still spy on foreign people. China According to Emmanuel Pernot-Leplay, a data consultant at Deloitte with a Ph.D. in comparative law, China's data privacy laws follow along the lines of the European GDPR and diverge from the U.S., which does not afford the same level of protection and, for example, allows internet providers to sell users' data without their consent. Although the Chinese are strengthening the protection of the digital identity against private entities, they are also increasing the government's access to personal data, and there is still no significant privacy protection against government intrusion. China's 2017 National Intelligence Law, which says organizations must "support, co-operate with and collaborate in national intelligence work," means that Beijing could force any company like Huawei to do its bidding. USA Laura Hautala, a staff reporter at CNET, notes that the PRISM and Upstream spy programs were renewed when Section 702 of the FISA Amendments Act passed both houses of Congress and was signed by President Donald Trump in 2018. Section 702 permits the Attorney General and the Director of National Intelligence to jointly authorize the targeting of persons reasonably believed to be located outside the United States, but it is limited to targeting non-U.S. persons. The PRISM and Upstream programs exist to collect the online communications of foreigners outside the US. PRISM takes the communications directly from internet services like email providers and video chat programs, and Upstream taps into the infrastructure of the internet to pull in the communications while they're in transit. Both China and the USA like to point fingers at the other's lack of privacy. Both are guilty of the crime of exploiting digital identities by wiretapping the internet. Encryption Forbes Technology Council member Andy Pury argues that the 5G network's 256-bit roaming encryption will be better than the current 128-bit standard of the 4G network. Blurring the lines One concern is that the boundary between the radio access network (RAN) and the network core will start to vanish as enhanced computing power moves closer to the network edge. The core is the network's brain, which controls everything from authentication and encryption to sensitive customer data. The RAN, alternatively, is the network's arms and legs. It sits at the network's outer edge, where it receives signals from smartphones and other devices and transmits them back to the core, using cell phone towers or base stations. Source: O'Reilly According to Heavy Reading, the research division of B2B digital media platform Light Reading, "to deliver services over a 5G RAN requires a system architecture and core network." Hence any mobile technology that doesn't include this separation would not be compatible with our internet network. Hacker proof Rest assured that most telecommunications operators are advised to use multiple vendors for encryption. A friend of mine who is a cloud engineer assured me that most companies minimize the danger of a hack by storing personal data in multiple cloud databases — each using a different entry key and a different authentication system.
Even if a hacker collective succeeded, they couldn't link the dataset to the users, or vice versa. Vulnerability There are a lot of different perspectives on the vulnerability of the 5G network, so I will address three controversial views. Loopholes Senator Mark Warner says: "Any supposedly safe Chinese product is one firmware update away from being an insecure Chinese product." Foreign companies seem more challenging to trust because they don't get the same annual audits as the tech giants Facebook, Google, and Apple. Worst case scenario The government of China could force Huawei to make a small backdoor in their next framework update and use it to steal sensitive data, hiding the breach in plain sight, buried under thousands of lines of code. Then, before some ethical hacker group can spot the hole in security, they remove the breach in the next update. In the words of Nicholas Weaver, a researcher at the International Computer Science Institute, sabotage can be very subtle. The race Yuval Noah Harari, author of the bestseller Sapiens, says on the Tim Ferriss podcast that Google is racing to crack the code of how to predict human behavior consistently. The 5G network would be a goldmine for all growth hackers, digital marketers, psychologists, governments, and machine learning companies, because they could better analyze large quantities of data. Self-reflection Some disturbing questions to ask yourself: Are we the sum of all our life experiences? If so, is life experience quantifiable? If you agree, are we then our data?
https://medium.com/towards-artificial-intelligence/mythbusting-5g-818338afd55
['Jasper Ruijs']
2020-07-27 08:17:02.154000+00:00
['5g', 'Opinion Piece', 'Internet of Things', '5g Technology', 'Futurism']
232
Bringing The Necessity To The IT Industry
Bringing The Necessity To The IT Industry Who Necess-IT is and what they do. Necess-IT provides high-quality IT work within the Tampa area and beyond. While Covid-19 had many businesses shutting down in 2020, Necess-IT stepped up to the plate to provide the equipment and upgrades many businesses required to adapt to the changing landscape that the entire world had to adjust to quickly. This led to growth at the end of 2020 within Josh's company that was unexpected but highly welcome. Check out my video on data findings over the last year. I spoke with Josh, the CEO of the company, and he stated that in a matter of a year they went from working out of their home office to a fully staffed warehouse and expanded from 4 to 20 technicians who work in the field all over Florida. He started the business in 2014 with his partner Kevin, who is part owner. They worked out of the trunk of their car (called trunk slammers in the industry) endlessly for years to build up their team of skilled employees and gain the necessary knowledge of the IT industry. Fast forward to 2021 and they work with big-name brands such as Coca-Cola, IHG Hotels, and Manor Care nursing homes, providing everything from low-voltage cabling projects and fiber-optic site surveys to daily service calls repairing cables that carry data, fixing phone connections that are not working properly, and installing complex camera systems at retail and office spaces. During the height of Covid-19, many businesses had to adapt; this included adding new ways to pay in retail, such as NFC pucks for contactless payments, and upgrading entire internet networks in nursing homes to provide a better flow of data through the businesses' local networks so daily employee tasks could resume as usual. For more information on Necess-IT and public relations, you can also visit my website. The technology industry shifted in a major way during the pandemic. Entire companies went from working in offices to working remotely and shutting their office spaces almost entirely. Below are some articles that I found interesting regarding the tech industry and Covid-19 and the changes that took place over the span of a year and beyond. Necess-IT worked hard to help local businesses upgrade their networks for the workloads they would now have to handle, since most employees were starting to work from home and offices needed upgrades in anticipation of their return after the pandemic. https://www.pwc.com/us/en/library/covid-19/coronavirus-technology-impact.html In conclusion, Necess-IT has been a great example of a business that became a necessity for many when times got tough. They see no signs of slowing down and will keep expanding their services and quality work for years to come. For more public relations insights and information, you can visit my Facebook page.
https://medium.com/@gregoriocreates/bringing-the-necessity-to-the-it-industry-26db952ae4e2
['Gregorio Feliciano']
2021-12-20 02:07:13.220000+00:00
['It', 'Technology', 'Mastersprogram', 'Fullsailuniversity', 'Necessit']
233
II Cold War. The geopolitical face of technology.
Technology has always been a hot topic for any country, but especially for the United States of America. This 21st century brought us a lot of things that we were expecting, and other tech advances that we hadn't even imagined in our finest dreams. Does anyone remember the sound of a 56 kbps modem dialing up? Well, if you aren't from this generation, you can listen to it. What happens today? We get bored if Siri, Google Assistant, or Alexa doesn't speak our mother language. We get angry at our ISP if we don't have at least 1 Gbps of download speed via an ethernet/WiFi connection. Or, a better example, we get mad when we blow through the 30 GB mobile data plan from our carrier. In this article I'll explore the main differences as of today and why we are living through a second Cold War, now between the United States and China, presenting facts that will help you stay informed in these uncertain times. #5G. The hoax behind Donald J. Trump and WH claims. What is 5G? This is a brand-new technology being implemented in the upcoming years that will allow any person with a 5G-compatible device (smartphone, tablet, etc.) to navigate the internet and download content at speeds of 1 Gbps or more. This, naturally, depends on some aspects, such as the carrier you have a contract with and which device you use to access it (high-end smartphones, e.g., will probably grant you better internet speed than cheaper ones, due to the built-in hardware). Who is responsible for implementing 5G, and why is this fuss around Huawei caused by the President of the United States, Donald J. Trump? The responsibility for implementing 5G technology falls, first, to each country, which must hold auctions of frequencies for carriers to explore and install their technology. These types of bids allow a carrier to secure a certain number of frequencies on which it will base its offer to its clients. Here it's important to know that the more bands your carrier secures, the faster, better, and more reliable your 5G connection will be. This isn't news, because it's literally the same as with 4G, 3G, or even EDGE technology. So why all the fuss? Two years ago, in 2018, the President of the United States, Donald J. Trump, started to impose tariffs on Chinese products and commodities, alleging that China was practicing "unfair trade practices" which would lead to the destruction of the world's and America's economies. In the middle of this trade war, you will find the company currently leading the 5G implementation in the world. The reason the President of the United States alleged this is that Huawei posed a "risk to national security," as according to him there was evidence — never presented — that Huawei used its routers, servers, devices, etc. to spy on the United States and its citizens and send this data over to the Chinese government. However, as with everything in our lives, it's also broader than what's being said to the press. Huawei is actually "the second-largest global seller of smartphones, surpassing Apple, Inc. (AAPL) for the first time, coming in behind number one Samsung Electronics Co. Inc.,"[1] which means that for the first time the leader of the tech industry isn't an American company. According to Huawei,[2] there is no history of it sending over any data to the Chinese government or to any other security agency in the world. In order to better comprehend this whole situation, the company produced, this year, a document presenting facts about why the claims made by the President of the United States, Donald J.
Trump are false and misleading. This document presents 9 facts that explain in detail why the allegations from the government of the United States are false, and what potential impact halting Huawei would have, not only for its almost 200,000 employees but for the worldwide economies that rely on Huawei technology and R&D. Source: visuals on Unsplash Another interesting example of why we are living through a second cold war, also related to the Chinese/USA trade war, is TikTok. We all know the coolest social media platform, where we all post those amazing videos that made us laugh so much through the first quarantine, right? According to President Donald J. Trump, like Huawei, TikTok is being used by the Chinese government to spy on the United States and its citizens. One more time, these are false allegations lacking evidence from the government of the United States. What matters now? The evidence or the perception? What really matters, in my opinion, is facts. It's a fact that the current President of the United States is a lying machine. According to the Washington Post, the President made 22,510 false or misleading claims in 1,323 days. This is more than 17 false claims per day since he took office. Note that this count isn't up to date, as the newspaper wasn't able to keep up the rhythm as of September 3rd; if you count the last two or three months of the electoral campaign, that would put him at an average of 50 false or misleading claims a day, overturning the milestone of 25,000 false allegations or claims in his presidential term(!!)! What counts, in the end, is naturally the evidence. And talking about fact-checking, and especially about spying, we can bring up the example of Edward Snowden, who was bold enough to report what the United States was doing with NSA and CIA programs. He reported that both agencies were using the networks of telco companies to monitor, acquire, and store millions of gigabytes of data, including SMS, phone calls, e-mails, and instant messaging services (WhatsApp, etc.), and even tapping other devices, including the microphones of our TVs, to listen to our conversations without a warrant. With this statement I'm not arguing that the Chinese government is an example of freedom. Quite the contrary. The Chinese government is the opposite of a defender of freedom. Like many governments in the world, it proactively spies on its citizens and those who are inside its borders. The Chinese government is one of the fiercest dictatorships worldwide and one of the countries where individual liberties are most suppressed. As citizens we must fight against all types of dictatorship, whether masked or not. We have to understand the importance of privacy and what to do to protect ourselves, because the idea that our country or government will protect us is false. We must fully understand these topics so we can make the world a better and safer place.
https://medium.com/@pedrodelriopr/ii-cold-war-the-geopolitical-face-of-technology-5d5bb2a6964c
['Pedro Del Rio']
2020-11-20 08:09:15.702000+00:00
['International Relations', 'Trade War', 'Technology', 'Tiktok App', 'Huawei']
234
Say What You Want to Say? Zoom Reviving “Breakout,” yet another ’80s Classic?
Say What You Want to Say? Zoom Reviving "Breakout," yet another '80s Classic? Corinne Drewery and Andy Connell of Swing Out Sister via Wikimedia Commons. When my wife first explained Zoom, all I could hear was Aretha Franklin singing "Who's Zoomin' Who." Now, she's going into detail about a meeting on Breakout Rooms, and it's happening again. "Wait, wait, I know exactly what these are all about: another '80s song," I say, frantically setting my iPad on the kitchen table and rushing to YouTube. I can't recall the band's name, so I type "Breakout," and Google suggests "1980s song?" That's it. There are many good songs called "Breakout" and even more called "Breakaway," but few explain Zoom Breakout rooms and their purpose better than "Breakout" by Swing Out Sister. Let them dance again, even in a Zoom meeting And just like "Who's Zoomin' Who," this is a song you can dance to. It moves you, and we need to be moved now. The music and words have me bouncing, bobbing my head. If you're feeling down or blue, "Breakout" is just what you need to get you slightly goofy and child-like again. "When explanations make no sense, when every answer's wrong," it starts — almost like a poem or an old-fashioned jingle you can't get out of your head. "You're fighting with lost confidence. All expectations are gone." Before you think about meds to heal your emotions, please consider playing some songs like "Breakout." They might help. They certainly have no negative side effects. Like classic American standards such as "Put on a Happy Face" or songs that drive my wife crazy like "The Candy Man," this is a true peppy "up" song. You prefer being anxious, serious, and scared? Then "Breakout" isn't for you. This is a song about freedom of expression and joyful freedom itself. It's about becoming who you were created to be. The key message: "The time has come to make or break. Move on, don't hesitate. Breakout. Don't stop to ask — and now you've found a break to make at last. You've got to find a way. Say what you want to say. Breakout." Yes, I know the seriously self-righteous will dismiss this song as '80s candy syrup, but that's OK. I'll beam and enjoy the music while they brood. "Breakout" features a "wall of bright sound" combining horns, synthesizers that sound like a string section with drums, and (of course) the joyous sounds of xylophones. It was all mixed for dance clubs. And we felt happy. The song soared to №1 on the UK charts, and the group was nominated for Grammys for Best New Artist and Best Vocal Performance by a Group or Duo. These are the kinds of songs that allow Barry Manilow to remain popular long past the time he would otherwise have been forgotten. John Kennedy got elected to the White House with a similarly perky "He's Got High Hopes." Even George Bush Senior, famously mocked by Newsweek for "Fighting the Wimp Factor," became the first sitting vice president since Martin Van Buren (and one of just two vice presidents in modern times) to run for and win the top job. He even won 40 of the 50 states (before being crushed by Bill Clinton in 1992). Did Bush do so well in 1988 because of the popularity of Ronald Reagan or the power of Bush's chipper 1988 campaign song, "Don't Worry, Be Happy?" When you're feeling blue, a peppy "up" song gets you going again. Like Katrina and the Waves singing "I'm Walking on Sunshine," this song tells us to find a way, say what you feel like saying, don't bow to authority but establish your own.
And in a time of fear, and a time when you feel trapped inside a little room within a room on Zoom, what better song is there to sing?
https://medium.com/music-voices/say-what-you-want-to-say-zoom-reviving-breakout-yet-another-80s-classic-20ff7242a0a0
['Joseph Serwach']
2020-10-07 16:54:29.632000+00:00
['Music Voices', 'Music', 'Lyrics', 'Culture', 'Technology']
235
The Advent of Architectural AI
Parametricism In the world of parameters, both repetitive tasks and complex shapes can be tackled, provided they can be rationalized into simple sets of rules. The rules can be encoded in the program to automate the time-consuming process of implementing them manually. This paradigm drove the advent of Parametricism. In a few words: if a task can be explained as a set of commands given to the computer, then the designer's task is to communicate them to the software while isolating the key parameters impacting the result. Once they are encoded, the architect is able to vary the parameters and generate different possible scenarios: different potential shapes, yielding multiple design outputs at once. In the early 1960s, the advent of parametrized architecture was announced by Professor Luigi Moretti. His project "Stadium N", although initially theoretical, is the first clear expression of Parametricism. By defining 19 driving parameters — among them the spectators' field of view and the sun exposure of the tribunes — Moretti derived the shape of the stadium directly from the variation of these parameters. The resulting shape, although surprising and quite organic, offers the first example of this new parametric aesthetic: organic in aspect, while strictly rational as a conception process. Bringing this principle to the world of computation would be the contribution of Ivan Sutherland, three years later. Sutherland is the creator of SketchPad, one of the first truly user-friendly CAD programs. Embedded at the heart of the software, the notion of the "Atomic Constraint" is Sutherland's translation of Moretti's idea of the parameter. In a typical SketchPad drawing, each geometry was in fact translated on the machine side into a set of atomic constraints (parameters). This very notion is the first formulation of parametric design in computational terms. Samuel Geisberg, founder of the Parametric Technology Corporation (PTC), would later, in 1988, roll out Pro/ENGINEER, the first software to give users full access to geometric parameters. As the software was released, Geisberg summed up the parametric ideal perfectly: "The goal is to create a system that would be flexible enough to encourage the engineer to easily consider a variety of designs. And the cost of making design changes ought to be as close to zero as possible." Now that the bridge between design and computation had been built thanks to Sutherland and Geisberg, a new generation of "parameter-conscious" architects could thrive. As architects became more and more capable of manipulating their designs using the proxy of parameters, the discipline "slowly converged" toward Parametricism, as explained by P. Schumacher. In his book "Parametricism, a New Global Style for Architecture & Urban Design," Schumacher explicitly demonstrated how Parametricism was the result of a growing awareness of the notion of parameters within the architectural discipline. From the invention of parameters to their translation into innovations throughout the industry, we see a handful of key individuals who shaped the advent of Parametricism. This parametrization of architecture is best exemplified by the work of Zaha Hadid Architects. Mrs. Hadid, an Iraqi architect trained in the UK with a math background, founded her practice with the intent to marry math and architecture through the medium of parametric design. Her designs would typically be the result of rules encoded in the program, allowing for unprecedented levels of control over the buildings' geometry.
Each architectural move would be translated into a given tuning of parameters, resulting in a specific building shape. Hadid's designs remain to this day perfect examples of the possible quantification of architectural design into arrays of parameters. Her work, however, would not have been possible without Grasshopper, software developed by David Rutten in the 2000s. Designed as a visual programming interface, Grasshopper allows architects to easily isolate the driving parameters of their design, while letting them tune those parameters iteratively. The simplicity of its interface (Figure 3), coupled with the intelligence of its built-in features, continues today to power the design of buildings across the world and has inspired an entire generation of "parametric" designers.
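To make the idea tangible, here is a minimal sketch of parametric thinking in plain JavaScript. It is a deliberately toy example, not Grasshopper's actual interface: the geometry is a pure function of its parameters, so retuning a parameter regenerates a new design candidate.

// A toy illustration of parametric design (assumed, simplified
// geometry): the stadium profile is a pure function of its
// driving parameters, so varying them yields new scenarios.
function stadiumProfile({ length, width, segments }) {
  const points = [];
  for (let i = 0; i < segments; i++) {
    const t = (2 * Math.PI * i) / segments;
    points.push({
      x: (length / 2) * Math.cos(t),
      y: (width / 2) * Math.sin(t),
    });
  }
  return points;
}
// Two design scenarios from the same rule, different parameters:
const optionA = stadiumProfile({ length: 200, width: 140, segments: 64 });
const optionB = stadiumProfile({ length: 260, width: 120, segments: 64 });
console.log(optionA.length, optionB.length); // 64 64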
https://medium.com/built-horizons/the-advent-of-architectural-ai-2fb6b6d0c0a8
['Stanislas Chaillou']
2020-01-03 10:38:08.562000+00:00
['Technology', 'Artificial Intelligence', 'AI', 'Architecture', 'Harvard']
236
An Intro to Brain-Computer Interfaces
Intro The concept of Brain-Computer Interfaces, or BCIs for short, is super polarizing. Most people either find themselves feeling intrigued by the possibility of a computer/brain hybrid, or they just want the tech companies to get out of their damn business. I mean, putting a computer in your mind could mean your brain could be hacked, which for most people is a super scary concept. While that may seem scary, imagine you walk into Starbucks to get your morning coffee or breakfast sandwich or whatever it is you buy in the morning, and you could pay for it just by thinking about it; with BCIs, this is the future. Now, before you make any rash decisions about which end of the spectrum you lie on, let's dive a little deeper into what BCIs actually are, and why you should probably care. Brain-Computer Interfaces are technologies that allow the brain and computers to interact, usually taking signals from all over the brain and using them to perform tasks, execute actions, or respond directly. Right now, the three main types of Brain-Computer Interfaces are EEG (electroencephalography), ECoG (electrocorticography), and MRI (magnetic resonance imaging). All three of these take signals from the brain and communicate them back to a computer for various uses. EEGs EEGs are the most common type of BCI around today. In essence, an EEG is a cap you wear on your head that is covered in electrodes. The electrodes are then filled with an electrolyte gel that facilitates the measurement of brain activity. EEGs have their ups and their downs. The most notable upside to EEGs is the fact that they are non-invasive: they can measure brain activity at 1000 data points per second without any sort of surgery. They are cheap and instant, and they represent live brain activity. EEGs are great for things like helping people who are paralyzed use prosthetics, and they have helped people who have lost limbs perform tasks they once thought impossible. Another bonus of EEGs is their very high temporal resolution, meaning the data refreshes instantly. While this may all seem like gumdrops and rainbows to some, there is a major downside to EEGs: they have terrible spatial resolution, since they detect many neurons firing simultaneously and are not designed to detect where in the brain a signal comes from, meaning they are less effective in treating things like epilepsy. Luckily, there is an alternative that was designed to do just that. An image of a real life EEG ECOG ECoG is an invasive type of BCI, which just means you need surgery. An ECoG is an electrode array (implanted in the brain) that creates a map of the brain; the map can then be used to localize seizures and other brain anomalies. An example of ECoG tech being used today is at the NeuroPace labs, where the RNS system is being developed and tested. What the RNS system does is localize a seizure hotspot and deliver a silent, undetectable shock that neutralizes the seizure before it happens. The system consists of a battery pack (implanted in the skull) and electrodes that can be placed anywhere in the brain. This tech has proven to be successful and could help tons of people in the future. Diagram of how EEGs work versus ECoG
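Whether from an EEG cap or an ECoG grid, the raw output is just a voltage sampled over time. As a toy illustration (the data here is synthetic and the code is not any vendor's API), the sketch below simulates one second of a 10 Hz alpha rhythm sampled at the 1000 samples per second quoted above, then estimates the power at a chosen frequency with a single DFT bin.

// Simulate 1 second of an assumed 10 Hz alpha rhythm plus noise,
// sampled at 1000 samples per second.
const fs = 1000; // samples per second, as quoted above
const signal = Array.from({ length: fs }, (_, n) =>
  Math.sin(2 * Math.PI * 10 * n / fs) + 0.3 * (Math.random() - 0.5)
);
// Estimate signal power at one frequency via a single DFT bin.
function bandPower(samples, freq, sampleRate) {
  let re = 0, im = 0;
  samples.forEach((x, n) => {
    const angle = 2 * Math.PI * freq * n / sampleRate;
    re += x * Math.cos(angle);
    im -= x * Math.sin(angle);
  });
  return (re * re + im * im) / samples.length ** 2;
}
console.log(bandPower(signal, 10, fs)); // large at 10 Hz (~0.25)
console.log(bandPower(signal, 40, fs)); // near zero elsewhere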
MRI The final major type of BCI being used today is MRI. MRIs are not just BCIs (Brain-Computer Interfaces) but BCIs (Body-Computer Interfaces, a term I just coined right now). They create a strong magnetic field that forces the body's protons to align with it. When the magnetic field is switched off, the protons realign and emit energy. Depending on how much energy is released and how fast, doctors can tell what type of tissue is there. MRIs are used in the diagnosis and treatment of many ailments, predominantly cancer. An image from an MRI Neuralink Some future innovation in the BCI department comes from none other than your very own Elon Musk. The man behind Tesla and SpaceX has now delved into the business of keeping human evolution on course with computer evolution. His vision is to merge mind and machine with Neuralink, a thin mesh-like device that is implanted in the skull and, theoretically, could enable the possibility of downloadable, real-life knowledge. A pretty common analogy is the scene from The Matrix where Neo downloads karate; with Neuralink, this could become a real possibility. Conclusion In summary, BCIs are the future of human potential. They have the power to enable the control of prosthetics and artificial limbs, making super soldiers a real, daunting possibility. Perhaps less menacing is the fact that they can be used to treat horrible, life-taking conditions like epilepsy. In my opinion, BCIs are the future, uniting man with machine in history's coolest wedding and making the impossible possible.
https://medium.com/@samuelstgold/an-intro-to-brain-computer-interfaces-802927d2e139
['Sam Gold']
2019-10-05 00:26:58.813000+00:00
['Interfaces', 'Brain', 'Computers', 'Emerging Technology', 'Brain Computer Interface']
237
Your Phone Is Designed Like a Slot Machine To Keep You Addicted to it.
Ever wonder why it is so hard to get off Facebook or Twitter? Sure, partly it’s because we like connecting with our friends. And they are great places to get news. But sometimes we feel like we can’t stop scrolling. It can feel like we’re addicted. Why? Because our phone and many of the apps we use are intentionally designed to get you hooked. This article explains how they do it — why it’s difficult to stop using our phones even when we want to. I also suggest one simple hack to help you get off your phone and back to real life. What is an addiction and what causes it? First, a quick note on what “addiction” actually is. Addiction occurs when a person “uses substances or engage in behaviours that become compulsive and often continue despite harmful consequences.” Simply doing something or using substances a lot is not enough to be considered an addiction. To be an addiction, there must be excessive use coupled with harmful consequences. These harmful consequences can include negative effects on work, school, or relationships. We usually talk about addiction to substances like tobacco, alcohol, marijuana, and opioids. But behavioural addictions are real too. Gambling is a relatively common behavioural addiction. Psychologists are also starting to look at whether we can sometimes be addicted to internet gaming. Addictions are rooted in the reward centers of our brains. The reward center exists to encourage you to engage in activities that help keep you alive and propagate the species. It is activated, for example, when we eat or have sex because these are behaviours that are good for us and the species. When this reward center is activated, it causes the release of neurotransmitters in our brain that make us feel good. Dopamine is one of those neurotransmitters. Many of the substances that we become addicted to interfere with neurotransmitters like dopamine. Behaviours can do this too. For example, problematic gambling has been linked to the dopamine system. Can we be addicted to our phones? Smartphone addiction is not yet an official diagnosis in the Diagnostic and Statistical Manual, the main document used by psychologists to diagnose mental health conditions. However, we are increasingly seeing smartphone behaviour that looks a lot like addiction. Researchers have found that some people do display the classic symptoms of addiction towards their phones: they overuse their phones, they lose control of how they use them, they can become preoccupied with them, they can experience withdrawal symptoms, and their phone use can have negative effects on their social and work lives. It is important to reemphasize that just because a person uses their phone a lot does not mean that they are addicted to it. Still, there are many people that wish they would spend less time on their phones and have trouble reducing their use — even when they want to. While it may not be an addiction in the DSM, it can feel like an addiction. Why is it difficult to stop using our phones? Part of the reason that we can have difficulty putting down our phones is that they have been intentionally designed to stimulate the reward center of our brains. They use many of the same features as slot machines. Slot machines can be addicting because they have several features that are built to activate your neural reward center. They have bright, flashing lights and sounds that provide a dopamine hit and encourage us to keep playing. They also use a variable ratio schedule of reward — the wins are random. 
This means that, because you don't know if you'll be rewarded on the next pull, you want to keep doing it over and over again. Our phone apps share some of these features of slot machines. Likes. Likes are the main source of reinforcement that apps provide. We love it when we get lots of likes. Studies have shown that social stimuli — like smiling faces, positive recognition, messages from loved ones — cause our brains to release dopamine. The likes we get from social media, as well as the comments and messages, create a chemical reward that encourages us to engage more. Every time we go on the app, it's like pulling the lever on a slot machine: we could "win" likes from our friends. If you're a writer on Medium, this is views, reads, and claps. It's why you keep going back to your stats page and constantly refreshing. This is also what makes posting rewarding — we know that posting a picture, a tweet, or an article will result in engagement from others that we find rewarding. Lights and colours. Like slot machines, apps provide lights and colours that make them more rewarding. Just think: would Candy Crush be nearly as interesting without all the colours? Take another look at your social media apps these days. You'll notice that likes are not simply a thumbs up or a red heart anymore. Apps are increasingly using "micro-interactions" to provide even more of a dopamine hit. A micro-interaction "is a single-use, subtle visual cue that draws your attention to a change in status. A power light on a coffee pot, or a color change on button hover are two examples." Basically, they are the little animations that occur when you complete an action on an app or website. On Twitter, if you like something, there's a little circle of confetti that appears. Facebook's reaction faces move when you hover over them and when you choose them. On Medium, when you clap for someone's article, you see a little colourful firework. Each of these is a micro-interaction. These apps are at the forefront of design, but you will start seeing micro-interactions in the majority of apps going forward. Why include micro-interactions? For the same reason that slot machines have flashing lights: they reward us with neurochemicals. They make us more likely to keep engaging with the app. Uncertainty. Uncertainty is rewarding. Researchers have found that anticipation of a reward can provide a hit of dopamine by itself. In gambling, the anticipation we feel when we're uncertain whether we'll win can sometimes be as chemically rewarding as the win itself. That's why slot machines don't just instantly tell you whether you've won or lost. They let the wheels spin for a bit first. Apps do this too. The little envelope with an "M" animation that happens before Gmail opens is one example. The blue circle icon on Twitter and the "M" that appears before Medium opens are two more. These are not simply there to show you that the application is loading — they would occur even if your Internet were lightning fast. They are a design feature aimed at creating anticipation. Anticipation is also why Facebook, Twitter, and others allow you to scroll indefinitely, and why we do sometimes end up scrolling indefinitely. It's the same reason we continue to pull the lever at the slots: you never know when something good will come up. The uncertainty and anticipation of a reward encourage you to repeat the behaviour.
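As a toy illustration of why a variable-ratio schedule is so compelling, consider simulating a run of "pulls" where each one pays out at random. The pull count and the 10% payout probability below are assumptions for the demo, not figures from any study.

// Simulate a variable-ratio reward schedule: each pull (a feed
// refresh, a slot lever) pays out with fixed probability, so the
// next reward is always unpredictable.
function simulatePulls(pulls, rewardProbability) {
  const gapsBetweenRewards = [];
  let sinceLast = 0;
  for (let i = 0; i < pulls; i++) {
    sinceLast++;
    if (Math.random() < rewardProbability) {
      gapsBetweenRewards.push(sinceLast);
      sinceLast = 0;
    }
  }
  return gapsBetweenRewards;
}
const gaps = simulatePulls(1000, 0.1);
// The average gap is ~10 pulls, but individual gaps vary wildly,
// which is exactly what keeps us pulling:
console.log(Math.min(...gaps), Math.max(...gaps));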
What can I do? In response to the increasing time we're spending on our phones, there's been a movement towards "digital minimalism". People are trying to do less on their phones and engage less with apps. If you are also trying to reduce the time you spend on your phone or apps, here's one simple change you can make: change the settings of your phone so that it is in greyscale. Basically, remove the colours. You can do this by going into your phone's settings and looking under accessibility features. Why? Because putting your phone in black and white removes the dopamine hit we get from the colours in the apps, and so it makes using them less rewarding. Experiencing Facebook in greyscale makes it less enjoyable — and easier to quit using.
https://medium.com/invisible-illness/your-phone-is-designed-like-a-slot-machine-to-keep-you-addicted-to-it-6ed70b922617
['Ramsay Lewis']
2020-04-27 15:23:38.008000+00:00
['Mental Health', 'Psychology', 'Science', 'Technology', 'Self Improvement']
238
Medium vs. Amazon vs. Google
The value of a string of text. String of text. (Photo by Igor Ovsyannykov) A string of text has value to a human, who wants to know about, and, analyze, other humans. Human behavior. And, human thought. So, Medium, and Amazon, and Google, base their businesses, and business model, on the string of text. In this day and age, all businesses are based on simple strings, and text. Where the basis for the string, and text, is the zero and the one. On Medium (and, therefore, Google, and, then, eventually, Amazon) the human is attached to several strings of text. These strings give value to the human, and, also, meaning to the human. Thus, value to the string of text. A machine can read the human via the human’s string of text. The machine ‘reading’ of a human, and the human’s strings of text, also has ‘value.’ Today, the leaders monetizing strings of text are Amazon and Google. Also Facebook, and Twitter. And, of course, many others. Because everything reduces (and expands) to one simple string of text. Zero, and, one. Half of all humans do not know this until you point it out, and, then, all humans know this. Meaning, they knew it even when they didn’t ‘know it.’ Thus, analyzing strings of text, and selling both the strings, and the analysis, of the text, is what a human calls ‘intelligence.’ It all boils down to the circular relationship between the human and text, the human, and the monetization of text, the human, and the analysis of text, the human, and the machine, that can now replace the human, when it comes to text, and the analysis, and monetization, of the machine, that can also create the text. So, this forces all of us to notice, the zero and the one, is human. The zero and the one is, intelligence, text, analysis, and monetization. Not just the basis for these. The actuality of these. All because a zero and a one is circumference, and, diameter. The conservation of an uber-basic circle (also known as line) (string) (human) (machine) (intelligence) (virtual) (artificial) (real), gives meaning, and value, to Medium, Amazon, and Google. Meaning, Medium, Amazon, and Google, will, eventually, merge. Everything in nature is most basically one string of text: if one, then zero. Meaning, if two, then one. Conservation of the circle is the core dynamic in nature.
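To make the premise concrete, here is a minimal sketch in JavaScript, an illustration added for this point rather than the author's own code: any string of text reduces to zeros and ones, one binary code per character.

// The premise made literal: each character of a string becomes
// its binary character code.
const toBits = (text) =>
  [...text].map((ch) => ch.codePointAt(0).toString(2).padStart(8, '0')).join(' ');
console.log(toBits('Medium')); // 01001101 01100101 01100100 01101001 01110101 01101101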
https://medium.com/the-circular-theory/medium-vs-amazon-vs-google-19dee10883bd
['Ilexa Yardley']
2017-07-26 12:14:20.687000+00:00
['Deep Learning', 'Artificial Intelligence', 'Machine Learning', 'Technology', 'Data Science']
239
5 Questions About 5G Answered
Ahh, 5G: the next-generation mobile broadband tech that's got so many of us talking. When we dissected the consumer demand trends in 2021, we mentioned briefly that 5G represents the beginning of the future for smartphones; with that said, we don't know a great deal about 5G. In fact, like any huge transition, the move to 5G has been plagued by confusion and a series of conspiracy theories. Some of these theories are really creative and might sound absurd — like how 5G is linked to the spread of the coronavirus pandemic. Today, we'll take a closer look at this fascinating development to answer some of the most burning questions that people often have about it. So, if you've been curious to learn more about the shift from 4G to 5G wireless connectivity, keep reading! 1. How is 5G different from 4G? 5G vs 4G: Speed We all want quicker connection speeds, and the switch to 5G will deliver just that. For one, 5G is believed to be at least 20 times faster than 4G. With 4G you get somewhere between 10 Mbps and 50 Mbps. 5G, on the other hand, should get you at least 50 Mbps on average. 5G vs 4G: Latency For the uninitiated, latency refers to the time it takes for data from your device to be uploaded and reach its target. It measures the time it takes for data to go from source to destination in milliseconds (ms). Latency is particularly important to avid gamers because, where games are involved, response time can greatly impact the outcome. With existing 4G networks, the average latency you'll see is around 50 ms. With 5G networks, you can expect an average of 10 ms. 5G technology may even drop that down to 1 ms. 2. Does it mean that 4G will become obsolete? Many of us are still using 4G networks. In fact, many of us still rely on a 3G network when a 4G network isn't available. In the same vein, 4G isn't going to disappear overnight with the arrival of 5G. In fact, when 4G and 5G work hand in hand, consumers benefit from a decent connection speed on their mobile devices wherever they are. Furthermore, as 5G infrastructure improves, 4G networks will, too, resulting in faster speeds for all. 3. What are some changes and innovations that 5G will power? 5G networks will help us work more efficiently, boosting productivity | Photo on Freepik 5G technology is triggering, and will continue to trigger, unthinkable possibilities. We'll look at just two of them here. Autonomous vehicles Driverless cars must collect a large quantity of data, process it locally, and then transfer it to the cloud. This information is then sent back to the car, allowing it to make safe judgments. Vehicles will receive real-time updates on hazards that develop beyond the line of sight, allowing them to respond safely and immediately. This technology will be enabled by 5G networks. Changing the way we work Many of us have become used to working from home since the start of the COVID pandemic. With 5G technology, we're going to see even more changes in the way we work. Businesses can take advantage of the technology to have better phone calls, higher-quality video meetings, VR meetings, or even deploy AI-enabled tools via cloud-based apps to accelerate workflows. Even highly skilled tasks can now be accomplished remotely, because 5G's low-latency, high-frequency data transfers will make engineering and many other types of skilled work possible from anywhere, as long as there is a decent connection. 4. Will 5G have adverse health effects?
We mentioned earlier that 5G has been linked to the coronavirus, but that's not all. 5G has been rumored to be associated with a slew of adverse health effects. For example, some people claim that 5G causes cancer or brain tumors. But we're saying: it's all fake news. There's no evidence that 5G is unsafe to use, and the myths about the adverse health effects of 5G have been debunked. In fact, it is quite the reverse: 5G technology enables remote surgery, which will have massive implications for the healthcare industry. We don't know about you, but that doesn't sound bad to us at all. 5. What 5G devices are available today? If you're looking to get your hands on a 5G smartphone, here are some you could consider: iPhone 12 Pro Max Featuring the most 5G bands on any smartphone, iPhone 12 Pro models offer the broadest 5G coverage worldwide. Furthermore, with a huge battery, you're going to have enough juice to enjoy your 5G network thoroughly. The phone also has a massive 6.7-inch screen for you to Netflix and chill to. Samsung Galaxy S21 Ultra The Galaxy S21 Ultra shows us what we get when powerful camera performance meets epic 5G entertainment. With a large 6.8-inch AMOLED display and S Pen support, you can make full use of this phone's 5G support. Google Pixel 4a 5G If you're willing to forgo the 90Hz refresh rate found on the Pixel 5's display, the Google Pixel 4a 5G is a great 5G phone with top-notch features. You'll get a 12MP main shooter, a 16MP ultrawide-angle lens, and AI-powered software features. The Pixel 4a 5G works with every 5G network. Conclusion The advancement of 5G technology will have an impact on more than just the devices we presently use. Mobile phones, gaming, and other apps will all use 5G. Things that were previously only imaginable in science fiction and fantasy will become possible with 5G.
https://medium.com/ezewholesale/5-questions-about-5g-answered-b9f27db2ab3
['Eze Wholesale']
2021-07-30 18:29:07.303000+00:00
['5g Technology', 'Consumer Electronics', 'Technology', '5g']
240
JavaScript Basics — Objects and Inheritance
Photo by nicolas reymond on Unsplash JavaScript is one of the most popular programming languages in the world. To use it effectively, we have to know its basics. In this article, we'll look at JavaScript objects and inheritance. Static Members JavaScript classes can have static members. They're shared between all instances of a class. We indicate that something is static with the static keyword. For instance, we can write: class Cat { static type() { return 'cat'; } } We have the static type method that we call on the class directly: Cat.type() Then we get 'cat' . Also, we can write: class Cat {} Cat.type = 'cat'; to add a static property to a class. Since property declarations can't sit inside a class body, we have to attach the property outside. Inheritance In JavaScript, we can create subclasses that inherit methods from a shared parent class while adding members unique to the subclass. For instance, we can write: class Cat { speak() { //... } } class NiceCat extends Cat { fly() { //... } } We use the extends keyword to indicate that our NiceCat subclass inherits the members of the Cat class. The speak method is available to both classes, while the fly method is only available in the NiceCat class. To call the constructor and methods of the parent class, we can use the super keyword. For instance, we can call the parent constructor by writing: class Cat { constructor(name) { this.name = name; } speak() { //... } } class NiceCat extends Cat { constructor(name, color) { super(name); this.color = color; } fly() { //... } } The NiceCat constructor uses the super keyword to call the Cat constructor with the name parameter. The super call must come before anything that references this in the subclass constructor, so we have it before setting the color property. To call a parent class's methods, we also use the super keyword. For instance, we can use it as follows: class Cat { constructor(name) { this.name = name; } speak(words) { return words; } } class NiceCat extends Cat { constructor(name, color) { super(name); this.color = color; } fly() { //... } niceSpeak(words) { return `nice cat says ${super.speak(words)}` } } We added the niceSpeak method, which calls Cat 's speak method by using super.speak . instanceof We can use the instanceof operator to find out whether an object was derived from a specific class. The left operand is the object we want to check, and the right operand is the class or constructor. For instance, we can write: console.log(new NiceCat('joe', 'brown') instanceof Cat); Then we would get true since it's an instance of Cat . Likewise, if we write: console.log(new NiceCat('joe', 'brown') instanceof NiceCat); we would also get true logged. Photo by TOMOKO UJI on Unsplash Conclusion Classes can have static members. We can use the static keyword to indicate static methods. If we want static properties, we can attach them directly to the class. We can use the extends keyword to create a subclass of a class. The instanceof operator is useful for checking if an object is an instance of a constructor.
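To tie the pieces together, here is a short, runnable recap of the classes above; the 'meow' argument is just a sample input.

// The parent class with an instance method.
class Cat {
  constructor(name) { this.name = name; }
  speak(words) { return words; }
}
// The subclass calls the parent constructor and method via super.
class NiceCat extends Cat {
  constructor(name, color) { super(name); this.color = color; }
  niceSpeak(words) { return `nice cat says ${super.speak(words)}`; }
}
const cat = new NiceCat('joe', 'brown');
console.log(cat.niceSpeak('meow')); // 'nice cat says meow'
console.log(cat instanceof NiceCat, cat instanceof Cat); // true true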
https://medium.com/javascript-in-plain-english/javascript-basics-objects-and-inheritance-4d385625b0f8
['John Au-Yeung']
2020-07-12 20:20:18.248000+00:00
['JavaScript', 'Web Development', 'Software Development', 'Technology', 'Programming']
241
After The Satoshi Roundtable, Is There A Way To Bridge The Bitcoin Divide?
Originally published in TechCrunch on March 13, 2016 A recent rift amongst the developers of Bitcoin, which originally started with a question over increasing the so-called block size (so that the throughput of transactions could be increased), exposed deep divides about distributed governance, and has now ironically led to entrenched positions, flared tempers, public insults, accusations, and disparaging remarks. The Rift The opposing views of those advocating for preserving the current implementation of Bitcoin (Bitcoin Core) and those who believe that the block size needs to be increased immediately to overcome scalability challenges have balkanized the Bitcoin developer community into mainly two camps. A dramatic public characterization of this rift came from Mike Hearn's January blog post on Medium, provocatively claiming that the "Bitcoin experiment has failed" as a consequence of the community's unwillingness to increase the block size: "Why has Bitcoin failed? It has failed because the community has failed. What was meant to be a new, decentralised form of money that lacked "systemically important institutions" and "too big to fail" has become something even worse: a system completely controlled by just a handful of people. Worse still, the network is on the brink of technical collapse. The mechanisms that should have prevented this outcome have broken down, and as a result there's no longer much reason to think Bitcoin can actually be better than the existing financial system." The price of Bitcoin plummeted to $358 following Mike's post, but has since recovered to $418 at the time of writing. Alternatives have emerged, such as Bitcoin Unlimited, which has no hard-coded block size limit; Bitcoin Classic, which would increase the block size to 2 megabytes; and BitPay Core, which would have an adaptive block size limit. Earlier this month, the various factions convened the Satoshi Roundtable, an invitation-only gathering of fifty or so people, who turned out to be a collection of many of the most important people behind Bitcoin. As a Bitcoin voyeur I don't have a horse in the race, but Bain Capital Ventures is invested in several dozen Bitcoin and blockchain startups through the Digital Currency Group. An unlikely man with an unusual background, Bruce Fenton, conceived and organized the roundtable. Fenton is a self-described freedom-loving techno geek. Like many in the blockchain industry, he is a free-market economist and advocate for lower regulations. He's one of the more connected people in Bitcoin (he was elected chair of the Bitcoin Foundation in 2015) and, more surprisingly, one of the rare neutral parties on this issue. He is married, a father of four, travels half the year, spends a lot of time in the Middle East, almost ran for Congress, and in the past has provided advisory services to some of the world's largest investors, including Bain Capital. It's remarkable that he was able to cajole this group of 50 to invest their own time and money to gather together to resolve this rift. The original vision of the Roundtable was not grand — it was only to be a fun getaway with friends to talk about the industry — but in only its second year it has grown to be a premier private gathering. So the core goal for this year's roundtable was to try to resolve the rift and have a unified Bitcoin. Leading up to the event, the community appeared to be supportive.
This was in contrast to the first meeting of the Roundtable, where the sentiment appeared to be, "Who are these guys and why are they meeting in secret? Who do they think they are, the Illuminati?" This year, it was, "Do it, guys. Find a resolution." The players There were CEOs of important Bitcoin companies, including exchanges, wallets, alternative blockchains, and cryptocurrencies. There were CEOs from enough miners (operators of the hashing servers that verify transactions) to represent more than 50% of the hashing capacity of the Bitcoin network (miners can "vote" with their servers on important issues, such as which version of the Bitcoin code to run). But most importantly, many of the key open source developers were present (I say "most importantly" because, at least for now, they have the greatest control over, and strongest opinions about, how the Bitcoin technology evolves). Both contingents of the major rift were amply represented. On one side of the rift is the Bitcoin Core team, who are contributors to the main Bitcoin source base and take on the responsibility of maintaining and enhancing it, and, more importantly, of prioritizing the numerous suggestions for enhancement. Adam Back, Matt Corallo, Peter Todd, Eric Lombrozo, Alex Morcos, and Luke Dashjr were the Core developers at the Roundtable. For over a year, they have taken the stance that making a one-off enhancement to increase the block size does not serve Bitcoin well; that yes, the block size needs to be increased, but it should be done later, perhaps second or third or fourth, as part of a larger set of important changes. Most importantly, they prioritize high consensus among miners and safety (i.e., upgrading without breaking Bitcoin). As Matt Corallo put it: At this point, however, the entire Bitcoin community seems to have unified around a single vision — roughly 2MB of transactions per block, whether via Segregated Witness or via a hard fork, is something that can be both technically supported and which adds more headroom before second-layer technologies must be in place. Additionally, it seems that the vast majority of the community agrees that segregated witness should be implemented in the near future and that hard forks will be a necessity at some point. With the apparent agreement in the community, it is incredibly disheartening that there is still so much strife… On the other side of the rift are primarily Gavin Andresen, but also Roger Ver, Brian Armstrong, Peter Smith, and various other wallet providers and CEOs. Gavin's stance is that Bitcoin is choking under the weight of its own success, and the block size needs to be increased immediately, possibly multiple times. After being unsuccessful in convincing the Bitcoin Core team to prioritize increasing the block size, Gavin has created an alternative implementation, called Bitcoin Classic. Bitcoin Classic makes only that one change of increasing the block size; some miners think this is thus the safer alternative, purely by virtue of the fact that it consists of vastly fewer changes.
There was also agreement that Bitcoin splitting into two blockchains would be very, very bad: transactions would be lost or duplicated; it would be unclear what the real Bitcoin was; the reputational damage would be significant. Although on the next day, when consensus appeared more difficult, there were some contrary opinions that maybe it wouldn't be so bad, because one or the other blockchain would survive as the real Bitcoin. Governance Everyone agreed that the problem started off over block size (and that is still the immediate problem), but the inability of the community to resolve that problem has surfaced a larger problem: the desperate need for governance. But this is a seriously hard problem to solve because (a) Bitcoin is all about decentralization and not having the need to trust a central entity, (b) Bitcoin is not controlled by any one body or even jurisdiction, and (c) this group, and the community in general, reacts very negatively to any suggestions for a governance structure or body. The fifty or so people broke up into seven different subgroups, which at times came together and at times met separately over lunches, dinners, and drinks. A group which included a large miner and a large exchange operator felt that the Hong Kong agreement was fine, as long as the timeline for the block size increase was moved up. Another group felt that the go-forward goals should be to encourage more companies to mine, to contribute to the decentralization of Bitcoin, with two distinct sub-goals: the decentralization of mining, and the decentralization of nodes. Periodically the groups reminded each other that they have a credibility problem and that outsiders looking in are not impressed by the impasse. Over the course of the group discussions, some called for a more civil discourse between the two sides of the debate, while others said that the divergent approaches were not actually mutually exclusive and that both approaches could be executed if the factions could agree upon a timeline. Some took a defeatist tone, pointing out that some concerns should be so-called Layer 1 concerns (Layer 1 is the fundamental layer of Bitcoin, the layer that this group was concerned with) and some should be Layer 2 concerns (i.e., another layer that takes care of less fundamental things, similar to networking protocol stacks). They pointed out that some were short-term goals (block size) and some were long-term goals (governance, and how we decide these things in the future). Over the course of the program, people brought up the need for changes to what are called consensus rules, the separation of those consensus rules from the rest of the code, and the governance of those consensus rules. Consensus rules are the code that determines whether there is consensus among the miners, a tallying of votes if you will, which can be used to decide many things, including… wait for it… which fork of the Bitcoin code to run. A third group proposed that we allow more than just miners to vote, by first rolling out the ability to encode votes in transactions, with parameters such as coin-age to control who gets to vote when. This had been discussed prior to the Roundtable, and most people seemed to think that the idea had legs and that a variation of the idea could work; but debating which variation was another governance issue.
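For readers unfamiliar with how such voting looks mechanically, here is a toy sketch (emphatically not actual Bitcoin consensus code; the block count and the 95% threshold are my own illustrative assumptions, echoing soft-fork activation schemes) of tallying signaling across recent blocks:

// Count how many recent blocks signal support for a proposed
// rule change and compare against an activation threshold.
function tallySignaling(blocks, threshold = 0.95) {
  const supporting = blocks.filter((b) => b.signalsUpgrade).length;
  const share = supporting / blocks.length;
  return { share, activated: share >= threshold };
}
// 1000 simulated blocks, roughly 90% signaling support:
const blocks = Array.from({ length: 1000 }, () => ({
  signalsUpgrade: Math.random() < 0.9,
}));
console.log(tallySignaling(blocks)); // e.g. { share: ~0.9, activated: false }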
They warned the Roundtable not to take the hard fork scenario lightly because “nobody really knows what’s going to happen in a hard fork.” They asked everyone to stop looking for a great option, or even a good option, and just find the least bad option. (I am convinced that I was the only one amused by the fact that in the real world, minors cannot vote; while in Bitcoin, only miners can vote. I did not waste the group’s time by making this pun.)

A noted crypto scientist, and creator of one of the other cryptocurrencies similar to Bitcoin, suggested that we be open-minded to not reaching compromise, because it’s too hard to negotiate when you’re right next to a cliff: “Folks, I mean this in all sincerity, we shouldn’t be trying so hard to reach consensus on block size and governance. It’s okay that we will continue to have differing, incompatible opinions and strategies. I think negotiation and compromise is made harder by thinking that there is no alternative, that you have to reach consensus or it’s the end of the world. It’s not the end of the world, and everyone involved knowing what their options are in the ​*absence*​ of agreement, and seeing the value in those options, is necessary for real, voluntary cooperation.”

One of the leaders of the Bitcoin Foundation offered, “Governance may not be the best term. We do need standardization of the protocol and a widely-accepted roadmap for implementing and releasing improvements. This is hard without a leader, a governance model, or other form of consensus. Perhaps we need to establish a protocol or accepted way of choosing the path forward so we can make progress as a group.”

A few people said, let the miners decide, and let’s be done with this. Others were adamant that that was very narrow thinking. “Miners are very very short term in their views; some of them turn mining on and off by the hour,” one expert added. The miners present at the Roundtable, interestingly, did not debate this, and did not seem to disagree. During panels and over poker (where only Bobby Lee, CEO of the Chinese Bitcoin mining company BTCC, had enough Bitcoin, U.S. Dollars, Euros, Renminbi, and cell reception to be the bank for all the players) one other large miner said, “I don’t want this power. Don’t give me this power. I don’t want to be in a position that we have to do an emergency hard fork. The game is over if we have to do that. Bitcoin will never be the same. Bitcoin is safer if it’s a simple change. A small change. That’s why we’re supporting Bitcoin Classic, because it’s the smallest, simplest change.”

The Gauntlet

Brian Armstrong, CEO of Coinbase, warned that Bitcoin as a protocol and as an industry is in a bad place, a dangerous place. And then he went on to articulate a very, very strong and controversial stance: “We need a new team working on the bitcoin protocol. Bitcoin Core is a good team. Bitcoin Classic is a good team. But we need a better team — a great team — working on the protocol. I’m thinking of going and doing it.” I felt comfortable mentioning him by name because he blogged about it later (otherwise we had agreed that we could write about what people said, without mentioning them by name): But as the conversations went on, Armstrong became less and less concerned about what short term solution we pick because he realized we all had a much bigger problem: the systemic risk to bitcoin if Bitcoin Core was the only team working on bitcoin.
They prefer to withhold something that could help the network now, because they don’t trust the community to make educated decisions in the future. They view themselves as the central planners of the network, and protectors of the people.

Armstrong then outlines a couple of very troubling failure scenarios that we discussed at the Roundtable, and concludes: If you want to ensure Bitcoin’s success, I’d encourage you to upgrade to Bitcoin Classic in the short term and then do what you can to help with the three step plan I outlined above. This is the best path forward to mitigate the dangerous situation we’ve found ourselves in.

I find the stance impressive and I admire him taking the bull by the horns. Just because we upgrade to Bitcoin Classic now does not mean we need to stay with Classic forever. It is a risk mitigation option. It’s tough for the average reader to verify the research, though. Particularly what I think is the crux: [The Bitcoin Core team proposal] will require not only new bitcoin core code, but also new code to be written by each of the major wallet providers who are generating transactions. It is unlikely this will be done in time to avoid the scaling issue we are currently facing. The number of lines of code that need to be written for this across the entire industry will be several orders of magnitude more than a scaling solution of changing 1MB blocks to 2MB blocks. This was explained to core developers at the conference and it didn’t seem to change their opinion of what the best short term solution was to scaling.

Strong reactions and flames were inevitable. I don’t share the concern that this is Coinbase’s ploy to “take over” Bitcoin — that would be very, very difficult. The Bitcoin Core team’s perspective on why the Bitcoin Classic proposal is ill-advised is: Unlike a soft fork, which is what the Bitcoin Core team is recommending with segregated witness, miners, merchants, developers, and users have never deployed a hard fork, so techniques for safely deploying them have not been tested. Hard forks require all full nodes to upgrade, or everyone who uses that node may lose money. Even a single-line change such as increasing the maximum block size has effects on other parts of the code, some of which are undesirable.

My Conclusions

Bitcoin may be the most important thing happening in fintech right now. The problem of how to govern a cryptocurrency that was explicitly designed to not have a central controlling authority is philosophically intriguing. I believe that the players involved are well-intentioned, reasonable people. All of them have a common goal: to make Bitcoin successful. They just disagree (very, very strongly) on how to get there. There is some ego involved. There is some jockeying for power or control. But there’s enough attention and concern over this problem that I think they’ll work out the short term problem (block size) and begin work on the even more difficult long term task of distributed governance without empowering a central authority.

I put blockchain technology into two categories: (1) the popularity contest Bitcoin needs to win as a store of value, and (2) everything else. The “everything else” includes other public blockchains, private blockchains, Layer 2 “applications” on public or private blockchains, alternative cryptocurrencies, etc. Category 2 is interesting, and there will likely be a few billion-dollar companies created there.
I liken companies in this category to companies such as Docusign or Salesforce.com, i.e., enterprise technology companies that help us do business better. But category 1 is really intriguing, because over the next fifty years Bitcoin could develop into the de-facto non-representative non-fiat store of value, even if it’s not the currency we use to transact every day. As I’ve written before, as a store of value and an alternative to fiat currencies that have been conjured up by sovereigns in which one must put one’s faith, it continues to increase in suitability.

In the United States and in western Europe we take for granted the luxury of having institutions — the Federal Reserve and the European Central Bank respectively — that have central banking skill, experience, credibility and competence. Other central banks are less competent, less independent, more political or corrupt, and most importantly cannot undo the errors of their sovereigns, such as autocratic nationalism, and running economies that are overly dependent on a commodity. This is more readily evident in small sovereigns, often in Africa or Latin America, as their economies fail or currencies devalue. Central banks are run by people and thus will always be susceptible to such errors, which is why non-fiat, non-representative stores of value such as gold have a place. And for this reason I continue to think that non-fiat stores of value that are more convenient to manipulate and transport than gold will have a bigger place in the future.

Are the reasoned (albeit heated) discussions we are having about the Bitcoin protocol preferable to inexperienced, incompetent, and often opaque central banking? I think so. It makes sense to buy Bitcoin during the panics, and I’m expecting a few panics over the next few months as things resolve themselves. But then again, I’m a venture capitalist.
https://medium.com/ideas-from-bain-capital-ventures/after-the-satoshi-roundtable-is-there-a-way-to-bridge-the-bitcoin-divide-42a6990122cb
['Salil Deshpande']
2017-06-06 16:23:31.596000+00:00
['Bitcoin', 'Currency', 'Blockchain Technology', 'Blockchain']
242
Daily Bit #177: A Bootstrap State of Mind
Top Story

A Bootstrap State of Mind

It’s been less than three months since OkCoin landed on U.S. soil, though the exchange has already broadened their bandwidth to an additional 20 states beyond their California HQ. Prior home: China. OkCoin was booted from the country alongside Huobi in October ’17 after the PBoC draped an iron curtain over domestic crypto exchanges. Huobi broke into the US through their SF-based partner, HBUS, and OkCoin tailed shortly after.

A U.S. expansion is a big deal. It signifies that platforms are willing to tack on regulatory baggage in exchange for breaking into new markets. And new markets = faster growth = more cash to OkCoin’s war chest for future maneuvers.

Eyes on the prize(s): volume & fees

Thanks to their expansion, OkCoin’s trade volume is ready for launch. Current volume per Coinmarketcap is roughly $700k per day, but that’s only for Cali. All five currencies are paired to USD; new states must use Tether or TrueUSD until OkCoin gets a nod for fiat-to-crypto trading.

Now, remember… one does not simply onboard new users. OkCoin is barreling into a dragon den occupied by Coinbase and Gemini, whose respective ~$165M and ~$31M in daily trade volume isn’t about to get packaged into a welcome basket. The answer? Slash fees — heavy. And OkCoin’s promotion gives them a heady edge over maker and taker rates offered on both platforms for certain price levels. Fees are based upon 30 days of trading volume.

Taker fees (market orders):
- Coinbase, under $10m (0.30% flat vs 0.15% max)
- Gemini, under $5m (0.25% [min] vs. 0.15% max)

Maker fees (limit orders):
- Coinbase, above $1m (both charge 0.0%)
- Gemini, under $15m (0.10% [min] vs. 0.05% max)

The bottom line: There’s no guarantee that OkCoin will keep retail & institutional investors around for a long time, but they can offer them a good one while their cheaper fees stick around… as long as their lower liquidity levels aren’t a dealbreaker.
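(For a back-of-the-envelope sense of scale, using hypothetical volume: fees = volume × rate, so a trader placing $1m of taker orders in a month would pay roughly $3,000 at Coinbase’s 0.30% flat rate versus roughly $1,500 at OkCoin’s 0.15% maximum.)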
https://medium.com/the-daily-bit/daily-bit-177-a-bootstrap-state-of-mind-bc48e3d240fb
['Daily Bit']
2018-09-13 14:00:33.127000+00:00
['Fintech', 'Blockchain', 'Blockchain Technology', 'Cryptocurrency', 'Bitcoin']
243
Website Dashboard UI Examples Inspiration 80
Best Website Dashboard UI Examples for Design Inspiration — #80

The digital dashboard is one of the key elements of today’s digital environment. We use dashboards for tracking web analytics, for working with our sales leads, for managing our client base and much more. Digital dashboards may be laid out to track the flows inherent in the business processes that they monitor. Graphically, users may see the high-level processes and then drill down into low-level data. This level of detail is often buried deep within the corporate enterprise and otherwise unavailable to senior executives. We decided to create a weekly series of inspirational feeds that showcase some of the best examples of web dashboards.
https://medium.com/theymakedesign/website-dashboard-ui-examples-inspiration-80-3e063678713c
['They Make Design']
2020-12-10 07:01:56.086000+00:00
['Visual Design', 'Design', 'Inspiration', 'Technology', 'Business']
244
NLP — Zero to Hero with Python and More!
NeurIPS, the largest conference in artificial intelligence, is currently underway, and it has over 20k people registered. If you are not registered and would like to access their goodies, please visit this public access version of the NeurIPS website. If you are into deep learning, we recommend you check out this phenomenal tutorial by David Duvenaud, Zico Kolter, and Matt Johnson, which makes use of many tools such as Anderson acceleration, differential equations, neural nets, convex optimization, Jax, automatic differentiation and others, presented at NeurIPS. Next, we recommend you check out this article titled “We read the paper that forced Timnit Gebru out of Google. Here’s what it says” by Karen Hao from MIT Technology Review, which gives a very insightful overview of what caused the departure of Timnit Gebru, co-lead of ethical AI research at Google Brain. For those interested in natural language processing, Carnegie Mellon Professor Graham Neubig just published 23 class lectures on multilingual natural language processing, including two guest lectures by Pat Littell and Orhan Firat. The video playlist can be accessed for free on YouTube. Last but not least, Paul Liang and Misha Khodak from ML@CMU published a post containing all of CMU’s submissions to NeurIPS 2020, with many goodies, from papers to code, and much more.
https://medium.com/towards-artificial-intelligence/nlp-zero-to-hero-with-python-and-more-6f5968e96f1c
['Towards Ai Team']
2020-12-08 17:22:53.925000+00:00
['News', 'Artificial Intelligence', 'Science', 'Future', 'Technology']
245
LeetCode Solution 1. Two Sum
Given an array of integers nums and an integer target, return indices of the two numbers such that they add up to target. You may assume that each input would have exactly one solution, and you may not use the same element twice. You can return the answer in any order.

Example 1:
Input: nums = [2,7,11,15], target = 9
Output: [0,1]
Explanation: Because nums[0] + nums[1] == 9, we return [0, 1].

Example 2:
Input: nums = [3,2,4], target = 6
Output: [1,2]

Example 3:
Input: nums = [3,3], target = 6
Output: [0,1]

Constraints:
2 <= nums.length <= 10^5
-10^9 <= nums[i] <= 10^9
-10^9 <= target <= 10^9

SOLUTION 1: (One-pass hash table)

The optimal solution to this problem uses a HashMap. For each element of the array, the complement (target - nums[i]) and the index are stored in the HashMap.

import java.util.HashMap;

class Solution {
    public int[] twoSum(int[] nums, int target) {
        if (nums == null || nums.length < 2) return new int[]{0, 0};
        HashMap<Integer, Integer> map = new HashMap<Integer, Integer>();
        for (int i = 0; i < nums.length; i++) {
            if (map.containsKey(nums[i])) {
                return new int[]{map.get(nums[i]), i};
            } else {
                map.put(target - nums[i], i);
            }
        }
        return new int[]{0, 0};
    }
}

The time complexity of this algorithm is O(n). In the one-pass hash table approach, while we iterate and insert values into the table, we also look back at the hash map to check whether the complement of the current value is already present. If it exists, the two numbers that sum to the target are returned.

Solution 2: (Brute force)

The brute force method is really simple. We iterate through every pair of elements in the array; if a later value nums[j] equals target - nums[i], we return the two indices.

class Solution {
    public int[] twoSum(int[] nums, int target) {
        for (int i = 0; i < nums.length; i++) {
            for (int j = i + 1; j < nums.length; j++) {
                if (nums[j] == target - nums[i]) {
                    return new int[] { i, j };
                }
            }
        }
        throw new IllegalArgumentException("No two sum solution");
    }
}

The time complexity of this algorithm is O(n²).
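As a quick sanity check (a small driver, not part of the original solutions; the class name TwoSumDemo is made up), the hash table version can be exercised with the values from Example 1:

import java.util.Arrays;

public class TwoSumDemo {
    public static void main(String[] args) {
        Solution s = new Solution(); // the one-pass hash table Solution above
        int[] result = s.twoSum(new int[]{2, 7, 11, 15}, 9);
        // Expected: [0, 1], since nums[0] + nums[1] == 2 + 7 == 9
        System.out.println(Arrays.toString(result));
    }
}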
https://medium.com/@nisargdevdhar/leetcode-solution-3cfbd1f0b833
['Nisarg Devdhar']
2020-09-27 14:22:33.630000+00:00
['Leetcode', 'Java', 'Solutions', 'Technology', 'Programming']
246
When social media comes crashing down, THIS is what to do…
In case you didn’t hear the news — Google and YouTube both crashed yesterday. It was only for an hour or so, but when you’re desperately trying to blast through episodes of “Terry and June”, it can be a bit frustrating. Anyway, I know what you THINK I’m going to say about this… As an email marketer, you think I’m going to say: “SEE? This is why you need to focus on building an asset you own and start emailing your list…” You’re also expecting a “building your house on sand” analogy too, aren’t you? If that’s the case, I’m going to disappoint you. Yes, the fact that two of the internet’s largest websites went down yesterday should terrify you a little bit, but make no mistake… … email ain’t perfect either. It’s not a panacea. For reasons that I’ll get to in a future message, email has its fair share of problems too. (Here’s a clue: have you looked at your email provider’s T&Cs recently?) When Google and YouTube crash, it’s a reminder you should look at ALL the online tools you use and ask: “If this went down tomorrow… what would I lose… and what would I do next?” Doesn’t matter if it’s Facebook, Twitter, MySpace, or Tinder… If you logged in tomorrow and were greeted by a screen that said: “I’m sick of going to Congress, so I’ve cashed out and moved to Berwick-upon-Tweed. Love you, Zucks.” … what would you do? And if you’re sitting there gloating, thinking: “I’m OK… I have an email list”, let me ask you a question: When was the last time you downloaded it? And if you have, where do you keep that precious CSV file? Is it in one place, or do you have backups, just in case? Social media… email… YouTube… Google… they’re all just tools and, like any tool, they can break and go wrong. Your job isn’t to find one only to bitch and moan when the handle snaps… … it’s to take responsibility for your business and always have a plan B. To always know the answer to the question — “if this went away, what would I do?” Having said all that… Email probably IS the best place to start. That CSV file full of people who raised their hands, saying “I want to hear from you” is a pretty awesome asset to have in your back pocket should Zucks decide to delete his FB account. If you don’t have an email list, start. Not because email is king… … but because it gives you more power and control over your business and your life.
https://medium.com/@johnholtcopywriter/when-social-media-comes-crashing-down-this-is-what-to-do-b77ca214f31f
['John Holt']
2020-11-16 07:31:49.146000+00:00
['Small Business Marketing', 'Social Media Marketing', 'Marketing Technology', 'Marketing Strategies', 'Copywriting Tips']
247
Where the Chini ya Majis @?
Well, let me tell you…. Following the recent Chini ya Maji post I thought it would make sense to shed some light on what some of these intrepid “deep sea” entrepreneurs are working on. So strap in!

The Real Tenderpreneurs

There is this team that is quietly and elegantly solving corruption. They are building a tender publishing and procurement management system. Procurement is clearly a massive horizontal market in an important operational segment for companies of all sizes across the continent. Any well-run company of meaningful size has procurement as one of their internal functions. Particularly in light of the recent anti-corruption crackdown, procurement is an area crying out for innovative solutions that increase transparency, visibility, and control for organizational leaders. The heartwarming part is how these guys have gone about tackling the opportunity. First, they established a tender-publishing subscription business which provides a steady revenue stream, allowing them to turn their focus towards the more complex aspects of developing a platform to solve the workflow, collaboration, and reporting associated with internal procurement practices. They have already completed version 1.0 of the platform and are taking sales meetings. By virtue of Impact Africa Network’s ecosystem catalytic accelerator I have had the privilege of working with this team, helping them apply a strategic lens to their business: interrogating their business model, defining a focus, and developing clarity around a go-to-market strategy. Fifteen years selling software helps a ton with this type of stuff. I am excited about what this team can accomplish. They have a genuine opportunity to build a massive company attacking a ubiquitous, pernicious problem that is a massive drain on African economies. If they are successful, tendering and procurement could receive a massive transparency boost, the positive effects of which would reverberate across economies. This is an example of a technology solution tackling a massive structural problem. Love! These guys are smart, gritty, high energy, and eager to learn. I can’t wait for our next strategy session with them. Get ’em boys!

Mr. Akili Mingi

There is the gentleman I affectionately nicknamed Akili Mingi (I tend to do that). Dude has been in the tech game since IBM mainframe days, and, by his own admission, is getting tired and wants an exit strategy. Akili has been at the helm of his startup for over 7 years. His company was referred to me by a repeat customer who, having taken a role at a new company, had made it a condition of her employment that the company replace the existing HR solution with Akili’s platform. Customer loyalty is the most reliable sign of awesomeness. When Akili and I sat down for the first time to dialogue about potential growth capital, he confessed right away that he needed a new approach to running the company. The bear-hug-bootstrap model was becoming increasingly unsustainable and at the same time offered no clear exit path. As a company they have done quite well in my view. They have close to 80 customers across the region, including Uganda, Rwanda, and Tanzania, and just last week he was in Addis completing a deployment at the oldest winery in Ethiopia. Did you know they do wineries up there? I didn’t. How did he get into this deal? A South African ERP company had recommended him. Apparently, the winery is undergoing a full-scale tech modernization project.
Mark my words, friends: this is the canary in the coal mine portending a continent-wide tech adoption cycle by businesses over the next 10 years, which will drive massive opportunity for startups that are prepared. Historically, Akili and co. had won a couple of key Enterprise deals against a big UK-based competitor that has an aggressive local sales operation going after the HR software market. By all accounts this English company appears to be the big gorilla in that space. But Akili had nary an insight into why or how they had won those deals. Maybe it’s because during the meeting he had proceeded to showcase at least two other products/startups he was pursuing: one was a school management system, another a payments solution chasing the revenue collection opportunity associated with devolution. Homie has side hustles on his side hustles. With such a frame of mind, market data and related insights are not something one traffics much in. You chase opportunity as it presents itself and the only data of note is sale or no sale. Unfortunately, without the insights that come from evaluating one’s operational data it is impossible to understand your business, let alone craft a reliable plan for growth. But goodness me does this business have potential!

Another impressive thing about Akili is his level of self-awareness. It is off the charts. He readily acknowledges both his and his company’s weaknesses. I enjoy sending him meaningful startup-related educational content which he gobbles up and appreciates. I do this with all the founders I am in contact with, which is a growing list of 30+. I believe part of an investor’s job is to curate knowledge, information, and contacts for busy entrepreneurs. One idea Akili and I are considering is to acquire a majority stake in the company, bring on a whip-smart young team, and empower him to take a higher-level role such as chairman, offering guidance to the new team. There is a great deal of local talent hungry to be immersed in exciting projects related to building the new African economy. Akili’s company could very well be one of them. It would provide a win-win-win outcome across the board.

Watch this space; more anecdotes from the trenches to come…
https://medium.com/impact-africa-network/where-the-chini-ya-majis-at-8ae12fe748cb
['Mark Karake']
2019-03-18 09:36:17.764000+00:00
['Technology', 'Startup', 'Africa', 'Nairobi', 'Venture Capital']
248
Home Alarm System Suppliers In Hyderabad
Godrej offers the best home alarm systems, with features like an infrared LED for night vision, audio-video recording, a water- and dust-resistant outdoor unit, a built-in loudspeaker for the siren and two-way talk, and much more of the best equipment. We provide top-class service, carried out by highly talented technical personnel and built on the highest-quality products. We build quality relationships with our customers, without any loopholes in our work system, and we invest real time and service in every customer. Our company, MNR Global Technologies, is among the best home alarm system suppliers in Hyderabad and aims to become the most trustworthy service for our customers.
https://medium.com/@mnrglobaltech/home-alarm-system-suppliers-in-hyderabad-b740d8983911
[]
2020-12-26 10:04:48.382000+00:00
['Hyderabad', 'Home', 'Homealarms', 'Mncglobal', 'Technology']
249
Easily build a REST API with Spring Framework
# The Concept

Before I even start explaining how to create an API, you should first know what this thing called “API” is. To begin with, API stands for Application Programming Interface; Wikipedia explains it as: A computing interface that defines interactions between multiple software intermediaries. It defines the kinds of calls or requests that can be made, how to make them, the data formats that should be used, the conventions to follow, etc.

Let’s try mapping it to a specific example: Imagine you have a portfolio web page, where you share some information about yourself, like interests, side-jobs, professional experience, projects, and so on. Now you want the people who follow you to always be up to speed on your Spotify preferences. You may think about manually going to your Spotify account and copying your liked songs; that may work if you are an inactive user, but if not, it will quickly become a cumbersome task. That’s when the word API comes in: Spotify and a lot of other companies have already developed their APIs, so people like you can avoid problems like this (see developer.spotify.com).

In such a context, you will simply need to contact their API and request all the data that fulfills your needs. In this case, your liked songs. An API can be described as an application that exposes services without the need for a front end. Let’s now create our API example so we can easily understand what has been explained so far.
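Since the article’s own example code is not shown here, below is a minimal sketch of what such a REST API could look like with Spring Boot (assumptions: the spring-boot-starter-web dependency is on the classpath, and the endpoint path and hard-coded data are invented for illustration):

import java.util.List;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
@RestController
public class DemoApplication {

    public static void main(String[] args) {
        // Boots an embedded web server and registers the controller below.
        SpringApplication.run(DemoApplication.class, args);
    }

    // GET /liked-songs returns a hard-coded list; a real service would call
    // an external API (e.g. Spotify's) and map its response into this shape.
    @GetMapping("/liked-songs")
    public List<String> likedSongs() {
        return List.of("Song A", "Song B");
    }
}

Running the class and visiting http://localhost:8080/liked-songs would return the list as JSON, since Spring Boot serializes controller return values automatically.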
https://medium.com/analytics-vidhya/easily-build-your-rest-api-with-spring-framework-80941c359d44
['Rafael Martins']
2020-12-18 16:09:02.383000+00:00
['Technology', 'Software Engineering', 'Java', 'Spring Framework', 'Programming']
250
Ten SQL Concepts You Should Know for Data Science Interviews
Ten SQL Concepts You Should Know for Data Science Interviews

Study smart, not hard.

SQL is extremely powerful and has a lot of functionality. When it comes to data science interviews, however, there are really only a handful of core concepts that most companies test. These 10 concepts show up the most often because they have the most application in real-life settings. In this article, I’m going to go over what I think are the 10 most important SQL concepts that you should focus the majority of your time on when prepping for interviews. With that said, here we go!

1. CASE WHEN

You’ll most likely see many questions that require the use of CASE WHEN statements, and that’s simply because it’s such a versatile concept. It allows you to write complex conditional statements if you want to allocate a certain value or class depending on other variables. Less commonly known, it also allows you to pivot data. For example, if you have a month column, and you want to create an individual column for each month, you can use CASE WHEN statements to pivot the data.

Example Question: Write an SQL query to reformat the table so that there is a revenue column for each month. (One possible answer is sketched after concept #4 below.)

Initial table:
+------+---------+-------+
| id   | revenue | month |
+------+---------+-------+
| 1    | 8000    | Jan   |
| 2    | 9000    | Jan   |
| 3    | 10000   | Feb   |
| 1    | 7000    | Feb   |
| 1    | 6000    | Mar   |
+------+---------+-------+

Result table:
+------+-------------+-------------+-------------+-----+-------------+
| id   | Jan_Revenue | Feb_Revenue | Mar_Revenue | ... | Dec_Revenue |
+------+-------------+-------------+-------------+-----+-------------+
| 1    | 8000        | 7000        | 6000        | ... | null        |
| 2    | 9000        | null        | null        | ... | null        |
| 3    | null        | 10000       | null        | ... | null        |
+------+-------------+-------------+-------------+-----+-------------+

2. SELECT DISTINCT

SELECT DISTINCT is something that you should always have at the back of your head. It’s extremely common to use SELECT DISTINCT statements with aggregate functions (which is #3). For example, if you have a table that shows customer orders, you may be asked to calculate the average number of orders made per customer. In this case, you would want to count the total number of orders over the count of the total number of customers. It may look something like this:

SELECT COUNT(order_id) / COUNT(DISTINCT customer_id) as orders_per_cust
FROM customer_orders

3. Aggregate Functions

Related to point #2, you should have a strong understanding of aggregate functions like min, max, sum, count, etc… This also means that you should have a strong understanding of the GROUP BY and HAVING clauses. I highly advise that you take the time to go through practice problems because there are some creative ways that aggregate functions can be used.

Example Question: Write a SQL query to find all duplicate emails in a table named Person.

+----+---------+
| Id | Email   |
+----+---------+
| 1  | a@b.com |
| 2  | c@d.com |
| 3  | a@b.com |
+----+---------+

ANSWER:
SELECT Email
FROM Person
GROUP BY Email
HAVING count(Email) > 1

4. Left Joins vs Inner Joins

For those who are relatively new to SQL or have not used it in a while, it can be easy to mix up left joins and inner joins. Make sure you clearly understand how each join derives different results. In many interview questions, you’ll be required to do some sort of join, and in some cases, choosing one versus the other is the difference between a right and wrong answer. (A quick illustration follows below.)
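To illustrate concept #4, here is a minimal sketch (not from the original article) using the Customers and Orders tables that appear in concept #6 below:

SELECT c.Name, o.Id AS OrderId
FROM Customers c
LEFT JOIN Orders o ON o.CustomerId = c.Id
-- keeps every customer; OrderId is NULL for customers with no orders

SELECT c.Name, o.Id AS OrderId
FROM Customers c
INNER JOIN Orders o ON o.CustomerId = c.Id
-- keeps only customers that have at least one order

And circling back to the CASE WHEN question in concept #1, one possible answer (the table name monthly_revenue is assumed, since the article doesn’t give one):

SELECT id,
       SUM(CASE WHEN month = 'Jan' THEN revenue END) AS Jan_Revenue,
       SUM(CASE WHEN month = 'Feb' THEN revenue END) AS Feb_Revenue,
       SUM(CASE WHEN month = 'Mar' THEN revenue END) AS Mar_Revenue
       -- ...one such line per month, through Dec_Revenue...
FROM monthly_revenue
GROUP BY id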
5. Self-Joins

Now we’re getting to the more interesting stuff! A SQL self-join joins a table with itself. You might think that that serves no purpose, but you’d be surprised at how common this is. In many real-life settings, data is stored in one large table rather than many smaller tables. In such cases, self-joins may be required to solve unique problems. Let’s look at an example.

Example Question: Given the Employee table below, write a SQL query that finds out employees who earn more than their managers. For the table below, Joe is the only employee who earns more than his manager.

+----+-------+--------+-----------+
| Id | Name  | Salary | ManagerId |
+----+-------+--------+-----------+
| 1  | Joe   | 70000  | 3         |
| 2  | Henry | 80000  | 4         |
| 3  | Sam   | 60000  | NULL      |
| 4  | Max   | 90000  | NULL      |
+----+-------+--------+-----------+

Answer:
SELECT a.Name as Employee
FROM Employee as a
JOIN Employee as b on a.ManagerID = b.Id
WHERE a.Salary > b.Salary

6. Subqueries

A subquery, also known as an inner query or a nested query, is a query within a query, embedded in the WHERE clause. This is a great way to solve unique problems that require multiple queries in sequence in order to produce a given outcome. Subqueries and WITH AS statements are both extremely useful when querying, so you should absolutely make sure that you know how to use them.

Example Question: Suppose that a website contains two tables, the Customers table and the Orders table. Write a SQL query to find all customers who never order anything.

Table: Customers.
+----+-------+
| Id | Name  |
+----+-------+
| 1  | Joe   |
| 2  | Henry |
| 3  | Sam   |
| 4  | Max   |
+----+-------+

Table: Orders.
+----+------------+
| Id | CustomerId |
+----+------------+
| 1  | 3          |
| 2  | 1          |
+----+------------+

Answer:
SELECT Name as Customers
FROM Customers
WHERE Id NOT IN (
    SELECT CustomerId
    FROM Orders
)

7. String Formatting

String functions are important especially when working with data that isn’t clean. Thus, companies may test you on string formatting and manipulation to make sure that you know how to manipulate data. String formatting includes things like:

LEFT, RIGHT
TRIM
POSITION
SUBSTR
CONCAT
UPPER, LOWER
COALESCE

If you are unsure of any of these, check out Mode’s tutorial on string functions for cleaning data.

8. Date-time Manipulation

You should definitely expect some sort of SQL question that involves date-time data. For example, you may be required to group data by months or convert a variable format from DD-MM-YYYY to simply the month. Some functions you should know are:

EXTRACT
DATEDIFF

Example Question: Given a Weather table, write a SQL query to find the Ids of all dates with a higher temperature than the previous (yesterday’s) date.

+---------+------------------+------------------+
| Id(INT) | RecordDate(DATE) | Temperature(INT) |
+---------+------------------+------------------+
| 1       | 2015-01-01       | 10               |
| 2       | 2015-01-02       | 25               |
| 3       | 2015-01-03       | 20               |
| 4       | 2015-01-04       | 30               |
+---------+------------------+------------------+

Answer:
SELECT a.Id
FROM Weather a, Weather b
WHERE a.Temperature > b.Temperature
AND DATEDIFF(a.RecordDate, b.RecordDate) = 1

9. Window Functions

Window functions allow you to compute an aggregate value across rows while still returning every row, instead of collapsing them into one row the way a GROUP BY statement does. They are extremely useful if you want to rank rows, calculate cumulative sums, and more.

Example Question: Write a query to get the empno with the highest salary. Make sure your solution can handle ties!
depname | empno | salary | -----------+-------+--------+ develop | 11 | 5200 | develop | 7 | 4200 | develop | 9 | 4500 | develop | 8 | 6000 | develop | 10 | 5200 | personnel | 5 | 3500 | personnel | 2 | 3900 | sales | 3 | 4800 | sales | 1 | 5000 | sales | 4 | 4800 | Answer: WITH sal_rank AS (SELECT empno, RANK() OVER(ORDER BY salary DESC) rnk FROM salaries) SELECT empno FROM sal_rank WHERE rnk = 1; 10. UNION As a bonus, #10 is UNION! While it doesn’t come up often, you’ll be asked about this the odd time and it’s good to know in general. If you have two tables with the same columns and you want to combine them, this is when you’d use UNION. Again, if you’re not 100% sure how it works, I would do some quick Googling to learn about it. :) Thanks for Reading! And that’s all! I hope that this helps you in your interview prep and I wish you the best of luck in your future endeavors. I’m sure that if you know these 10 concepts inside-out, you’ll do great when it comes to most SQL problems out there. Terence Shin
https://towardsdatascience.com/ten-sql-concepts-you-should-know-for-data-science-interviews-7acf3e428185
['Terence Shin']
2020-10-17 16:40:05.724000+00:00
['Data Science', 'Programming', 'Work', 'Education', 'Technology']
251
Singapore’s First Web AR Experience Powered By Buzz AR
Experience Singapore’s first augmented reality hospitality experience. Using augmented reality, you can interact with “PlanB” in your real-world environment.

Otters, Singapore’s favourite wildlife, are making the news again! Using augmented reality on a web-based platform (WebAR), Buzz AR recently launched WebAR to skyrocket the hospitality industry’s engagement rates through its various AR Avatar campaigns. Buzz AR is part of the current cohort 3 of the Singapore Tourism Accelerator, organised by the Singapore Tourism Board (STB) and its appointed corporate innovation partner, F8 Innovation. Buzz AR is Singapore’s first Augmented Reality (AR) startup that has scaled with large enterprises. Existing clients include Fortune 500 companies in China and Singapore. Their flagship solutions, WebAR (AR Avatar) and BuzzCam, drive footfall to businesses. The company also provides digital way-finding, gamifies business rewards using an AR Avatar, and has its own content creation platform for businesses to track their AR campaign engagement rates for retargeting campaigns. To experience WebAR, simply visit https://buzzy.buzzar.app from 24 Dec 2020 onwards. Simply use your smartphone and scan this QR code to access WebAR.

Adopting a “Digital First” strategy, one of the premium hotel chains in Singapore aims to enhance guests’ experience through digital marketing efforts. Working with STB, the hotel chain has partnered with Buzz AR to bring its AR Avatar to life. PlanB is an initiative that works on developing solutions to future-proof the travel and tourism industry, and help tourism companies thrive amidst the challenges brought by COVID-19. The hospitality industry is one of the most impacted industries during the COVID-19 pandemic.

“We created this WebAR campaign to help the most affected industry players navigate this crisis by creating AR Avatars that help humanise brands and businesses. We are thrilled to launch Singapore’s first WebAR for the hospitality industry, without app installation or any kind of registration. It requires just a mobile phone and your default browser. 2020 is the year that WebAR has become available on virtually every up-to-date web browser in the world,” said Ms. Bell Beh, Co-Founder and CEO of Buzz AR. “That said, AR technology itself has dominated in Silicon Valley for a few years now, from when I was completing the Master’s program as a UC Berkeley scholar. People started to walk into retail stores and hotel chains in San Francisco expecting AR,” said Ms. Beh.

Apart from the WebAR experience, Buzz AR leverages Big Data capabilities to drive and generate user-centric consented data through Buzz Analytics, a tool given to businesses to track their engagement level. It helps in running retargeting campaigns to not only increase brand identity but also increase the operating revenue of businesses in the long run.

HUMANISING BUSINESSES THROUGH AR

“As a society, we are facing a generational crisis and it is crucial to remind businesses that there is a Plan B. For us it is AR Buzzy; what about you?” said Ms. Beh.

The Buzz AR team is equipped with decades of gaming and digital marketing experience, dating back to the early days of the internet. The technical team has built everything from web, app, and virtual reality to augmented reality games, and is currently working to share their know-how with enterprises in the hospitality industry. Through WebAR, people can get in touch with brands and businesses from home, or in a physical space.
Buzzy was designed to be a joyful, earnest, expressive, and lovable otter. By humanising an AR Avatar that mirrors a 2D mascot for a hotel chain, the Buzz AR team successfully established their brand as Singapore’s first AR company creating highly engaging avatars for different businesses. For that reason, through this Christmas season, the team created the PlanB initiative, with the first avatar, called “Buzzy”, to start engaging with more people around the world, especially within the indoor space as people stay home to enjoy their Christmas meals, delighting users with a basketball-player avatar that acts like a human.

Buzzy in a basketball outfit; AR allows the user to interact with the avatar in real time.

3 BENEFITS OF AN AR AVATAR

To dig a little deeper, here are 3 examples of how an AR Avatar can benefit your business.

1. Increase Brand Engagement From Anywhere You Are

Through COVID-19, we noticed that many on-site businesses offer great deals in their physical space. However, many have yet to successfully create a smooth experience when transitioning from online to offline (o2o). If your goal is to entice more people to turn up at a physical space through o2o engagements via WebAR, one of the most important elements is to create a lovely brand and start engaging people from anywhere they are, including when they are at home. A trustworthy brand has a history of being engaging and friendly, and is relatable yet authoritative. A majority of customers tend to be loyal to brands that offer accountability and social proof — e.g. brands that can demonstrate that other customers like and trust them; such customers will be more willing to become their brand advocates. Building this kind of loyalty does not happen overnight. It takes time and strategy, but having a brand avatar can help. As the embodiment of your brand, an AR avatar can exhibit all of the above qualities in a way that people can quickly relate to.

2. Build A Narrative

Your brand narrative exists whether you like it or not. Instead of remaining static, a brand avatar can easily move between contexts (e.g. from print to a 3D dancing mascot with music and sound effects), adapting its appearance and speech to echo your business’s directional changes, and users can interact with it. Leveraging consented AR data, businesses can build a better narrative (e.g. an AR gamification that covers the family tree of the AR Avatar) to continuously engage with customers. And who is better to tell this story than your brand avatar? Storytelling is and always will be one of the best ways to connect with people. As the representative of your business, an AR avatar can deliver your narrative in a way that’s creative, unique and exciting — important ingredients to portray an amazing story.

3. Humanise Your Brand

If your brand were a person or character, what would their name be? How would they dress? What would they like and dislike? How would they speak? Would they be cheerful and quirky, or down-to-earth and casual? Questions like these will help you imagine a personality for your brand’s avatar. This is important: the more human your avatar is, the more people will be able to connect with and trust your brand. The PlanB initiative featuring “Buzzy” is how we turn a static 2D image into an avatar with an interesting backstory.
More importantly, with the Christmas season approaching, the team completed the requisite work in just 2 weeks to serve as one of the showcases. “I’m immensely proud and thankful to our team for the relentless efforts behind PlanB. We aim to bring more AR Avatars to life to delight users, and create many more humanised and lovely avatars that can skyrocket businesses to greater heights. We rolled out PlanB despite all the limitations and challenges to accelerate AR technologies to the mass public. Technologies are evolving every day, but more people and businesses need to know about their options, their PlanB, during this generational global crisis. We have made WebAR affordable, so more businesses can start adopting this technology to enjoy all its benefits,” Ms. Beh said.

With 3 billion phones being AR-compatible in 2020, we are ready to scale the AR and Big Data technologies with STB and industry partners to accelerate WebAR adoption regionally. To celebrate the arrival of these cutting-edge technologies, this Christmas we are introducing Buzzy to the world: the first AR Avatar, a humanised identity that interacts with all users. The Buzzy game will last for 1 month, until 31 January 2021, and a lucky winner will be selected randomly to win an iPhone 12 Pro Max. Our fans, “the Buzz”, are encouraged to post their photos with Buzzy on various social media platforms such as Facebook, Instagram and TikTok; remember to tag @buzzarvr as well. Be it at different tourism spots or simply from home, we want to see them all.

What’s Next For Buzz AR?

Buzzy is just the beginning; more AR Avatars, each with its own WebAR animation, will be rolled out in 2021 over the next 12 months to delight users in Singapore and globally. “We started to work with more industry partners in the hospitality space through our participation at the Singapore Tourism Accelerator and our early AR presence in Singapore to truly propel the country forward in its adoption of emerging technologies such as WebAR. We have also started to accept more exciting projects that are coming for 2021,” Ms. Beh explained. “Amidst all the hurdles and challenges, I remember vividly why I started Buzz AR and left a comfortable corporate counsel job. My personal mission is to deliver happiness to everyone. This generational crisis has brought a lot of distress, tension and separation to many people. If we were to have another round of Circuit Breaker, Buzzy will be here to accompany you. The reason we chose WebAR over other mobile apps is that we want it to be user-centric and accessible everywhere, as long as you have a smartphone with you. Anyone can enjoy WebAR: designed to resemble a simple game, it overlays a computed image on your real-world environment and ultimately delivers happiness to everyone,” Ms. Beh said.

About Buzz AR

Buzz AR is one of the startups in STB’s Tourism Accelerator, an initiative that works at developing solutions to future-proof the travel and tourism industry and help tourism companies thrive amidst the challenges brought about by COVID-19. The company has undergone J-curve growth amidst the COVID-19 downturn, with 318% growth in revenues over the past 6 months. It combines AR with Big Data capabilities to engage with users so businesses can better serve their guests with targeted campaigns. For more information, please visit Buzz AR’s website.
https://medium.com/buzz-ar-blog/singapores-first-web-ar-experience-powered-by-buzz-ar-c6295b101517
['Bell Beh']
2020-12-24 04:56:01.723000+00:00
['Webar', 'AR', 'Tourism', 'Singapore', 'Technology']
252
25 tips to prepare your IT infrastructure for the holiday season — Cyber Monday & Black Friday.
Our remote operations teams have been involved in US holiday season traffic for decades. Black Friday and Cyber Monday drive traffic volumes crazy, bringing 100x traffic increases to company websites and systems. It’s the time when some e-commerce & retail companies make most of their annual revenue within a few days’ time. And I know one thing for sure — Murphy’s Law loves the holiday season. Anything & everything that can go wrong will go wrong during this time, and many companies struggle with their engineering teams off during the holidays.

US Cyber Monday online sales

Our clients, of course, have the luxury of remote teams helping them with preparations as well as continuous 24x7 SRE operations, with all tier 1, 2 & 3 support teams fully staffed and operational while the onsite teams enjoy their well-deserved time off.

US Black Friday online sales

Even big brand names have had their fair share of holiday season downtimes due to various failures and issues. In these few mad rush hours, a downtime of a few minutes could still cost an organization millions of dollars. So here we’re focusing on how you can ensure that your IT infrastructure is well prepared for the holiday season. The list below is in no particular order of importance.
https://medium.com/@bluecorp/25-tips-to-prepare-your-it-infrastructure-for-the-holiday-season-cyber-monday-black-friday-c16a358f6ba3
[]
2020-11-23 09:36:43.192000+00:00
['Black Friday', 'Cyber Monday', 'Thanksgiving', 'Infrastructure', 'Information Technology']
253
Watch rC3 Chaos Computer Conference With DEFCON 201 :: Dec 27th — Dec 30th
We are proud to announce that we will be live streaming a section of us reacting to the rC3 conference for each of the four days of the convention! The Chaos Communication Congress is an annual conference organized by the Chaos Computer Club. The congress features a variety of lectures and workshops on technical and political issues related to security, cryptography, privacy and online freedom of speech. It is considered one of the largest events of this kind, alongside DEF CON in Las Vegas. 2020 will be the first year in which the Congress will not take place at a physical location due to the COVID-19 pandemic. rC3 will be a variety of distributed small local events in hackspaces with a joint program of streamed talks, online workshops, art, culture and various forms of networked togetherness. If you want to know the schedule, you can view it here:

Live Streams:
Twitch: https://www.twitch.tv/defcon201live
dLive: https://dlive.tv/defcon201
YouTube: https://www.youtube.com/channel/UCYDQaOHbK5trRU2CDgb0qSg
Facebook: https://www.facebook.com/groups/defcon201/

AGENDA & SCHEDULE

Sunday — ALL TIMES ARE EASTERN STANDARD (EST)
12:00 NOON Hacking the Nintendo Game & Watch
1:00 PM Accessible input for readers, coders, and hackers
2:00 PM Hijacking Transient Execution through Microarchitectural Load Value Injection
3:00 PM Lecture: Hacking German Elections
4:00 PM The Yes Men from Tricksters in an age of dirty tricks
5:00 PM BONUS :: Lecture: Tracking Ransomware End-to-end

Monday — ALL TIMES ARE EASTERN STANDARD (EST)
11:00 AM close encounters @ rc3 # 2
12:00 NOON Milk Tea Alliance, let’s make protest cute again!
1:00 PM Attacking CPUs with Power Side Channels from Software: Why is electricity leaked here?
2:00 PM Assault in office by police officer: inside
3:00 PM Spot the surveillance
4:00 PM Lecture: remote Stellwerk Experience
5:00 PM Turning Facial Recognition Against the Police
6:00 PM BONUS :: Fuzzing the phone in the iPhone

Tuesday — ALL TIMES ARE EASTERN STANDARD (EST)
11:00 AM close encounters @ rc3 # 3
12:00 NOON Unconventional HDL synthesis experiments
1:00 PM Twilight — MIDI Broadcasting Ice Sculpture
3:00 PM Hacking Diversity: The Politics of Inclusion in Open Technology Cultures
4:00 PM OPENCOIL — A Roaming Speedshow
5:00 PM BONUS :: Ramming Enclave Gates: A Systematic Vulnerability Assessment of TEE Shielding Runtimes / iketohear — Self-Adjustment of Open Source Mobile Hearing Aid Prototype
6:00 PM Lecture: “My Whores Manifesto” — online reading and conversation with Undine de Rivière

Wednesday — ALL TIMES ARE EASTERN STANDARD (EST)
9:00 AM The state of digital rights in Latin America
10:00 AM Watching the Watchers — How Surveillance Companies track you using Mobile Networks
12:00 NOON Lecture: Inside xHamster
1:00 PM close encounters @ rc3 Wrapup & Q&A
https://medium.com/@defcon201/watch-rc3-chaos-computer-conference-with-defcon-201-dec-27th-dec-30th-68a9a33041c0
[]
2020-12-27 05:35:00.075000+00:00
['Hacking', 'Activism', 'Charity', 'Live Streaming', 'Information Technology']
254
How can blockchain disrupt the insurance industry?
The insurance industry is hard to disrupt, but blockchain’s capacity to give absolute accountability, transparency and higher security will help insurers save time and money, as well as enhance client satisfaction. Today we discuss the trending topic of blockchain technology and how it can assist the insurance industry. Blockchain technology has already started to transform more than 19 industries. Here we will discuss the role of blockchain in the insurance industry in detail; by the time you finish reading this article, you will understand how blockchain could transform the insurance industry and what its benefits are.

First, let’s look at the challenges of the insurance industry. The crucial challenges that the insurance industry faces today are as follows:

1. Restricted growth in markets for the insurance industry.
2. Fraudulent claim activity.
3. The overhead of third-party payments and transactions.
4. Difficulty in handling huge amounts of data.

There are many more challenges in the insurance industry, and the major ones mentioned above can be addressed by blockchain technology.

Blockchain — A Key to Escape These Challenges

Blockchain in the insurance industry can help root out most of the problems in the industry and make it work successfully. Now let us look at how blockchain addresses basic issues such as stability and trustlessness. Blockchain in insurance can provide trust, stability and transparency of data, which builds trustworthiness in the insurance sector. Here we discuss how blockchain helps the industry escape these challenges.

Security

Blockchain uses a public ledger that can reject duplicate transactions by recording and validating each transaction. With a decentralized digital system, blockchain can prove the authenticity of clients, policies and transactions by maintaining historical records. This makes it difficult for hackers to steal data.

Smart Contracts

In the insurance industry, customized smart contracts replace physical paper documents with contracts that connect to real-time records. This automatically triggers claims and payments with high accuracy, which in turn saves money and time.

Third-Party Transactions

Blockchain decreases the number of third-party transactions, along with some claims-management costs. Blockchain allows automated verification of claim and payment details from third parties. With blockchain, all insurance businesses can easily retrieve past claims and transactions that are on record. This builds trust between insurers and policyholders.

Big Data

Blockchain can store large amounts of data. It can hold static data and records without a central authority, and the data can be viewed by all participants connected to the blockchain. Such seamless records make transactions and risk evaluation more accurate.

Reinsurance

In the insurance sector, blockchain can provide accurate retention-calculation details on contracts currently being processed.

The above are some of the ways blockchain can address the challenges in the insurance sector.
Now let us discuss the major benefits blockchain brings to the insurance industry. The 5 real blockchain benefits in the insurance industry are:

Fraud detection
IoT & blockchain together to structure data
Multiple risk participation/reinsurance
On-demand insurance
Microinsurance

Fraud Detection

Blockchain ensures that all processed transactions are permanent and timestamped, i.e. no one, including insurers, can modify the data, preventing any kind of breach. This data can further help in defining patterns of fraudulent claims, which insurers can use in their fraud prevention algorithms.

Fraud detection using blockchain: Powered by smart contracts, Etherisc independently verifies claims by using multiple data sources. For example, for crop insurance claims, it compares satellite images, weather reports, and drone images with the image given by the applicant.

IoT & Blockchain Together to Structure Data

This data is extremely valuable for insurers to develop accurate actuarial models and usage-based insurance models. Considering the auto insurance industry, data is collected about operating time, distance, acceleration, braking patterns, and other behavioral statistics that can identify high-risk drivers. But the question is — how to manage the huge volume of data as millions of devices are interfacing every second?

Multiple Risk Participation/Reinsurance

Reinsurance is insurance for insurers. It safeguards insurers when large volumes of claims come in. Blockchain can bring twofold advantages to reinsurers: one — tamper-proof records for accurate claims analysis, and two — speeding up the process through automated data and information sharing. PwC estimates that blockchain can help the reinsurance sector save up to $10 billion by improving operational efficiency.

On-Demand Insurance

On-demand insurance is a flexible insurance model, where policyholders can turn their insurance plan on and off in just a click. The more interactions with policy documents, the greater the burden of managing the records. But, thanks to blockchain technology, maintaining ledgers (records) has become simpler. On-demand insurance players can leverage blockchain for efficient record-keeping from the inception of the policy until its expiration. It provides blockchain-powered insurance platforms for B2B insurers to transfer risks more quickly and transparently.

Microinsurance

Instead of an all-encompassing insurance policy, microinsurance offers coverage against specific risks for regular premium payments, which are far lower than regular insurance premiums. Microinsurance policies are profitable only when distributed in huge volumes. However, because of low profit margins and high distribution costs, microinsurance policies don’t get the traction they deserve despite their immediate benefits. Blockchain can offer a parametric insurance platform. With this, insurers will need few local agents, and “oracles” can replace adjusters on the ground.
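To make the parametric idea above concrete, here is a toy sketch in plain Java (not any real platform’s smart-contract code; the threshold, data feed, and payout values are invented for illustration):

import java.util.function.Supplier;

// Toy model of a parametric micro-insurance rule: the payout triggers
// automatically when an observed metric crosses an agreed threshold,
// with no claims adjuster in the loop.
public class ParametricPolicy {
    private final double rainfallThresholdMm; // agreed drought trigger
    private final double payoutAmount;

    public ParametricPolicy(double rainfallThresholdMm, double payoutAmount) {
        this.rainfallThresholdMm = rainfallThresholdMm;
        this.payoutAmount = payoutAmount;
    }

    // The "oracle" stands in for an external data feed (weather station,
    // satellite imagery) that a smart contract would consult on-chain.
    public double settle(Supplier<Double> oracle) {
        double observedRainfallMm = oracle.get();
        // The rule itself decides the claim; no human review is needed,
        // which is the point of parametric insurance on a blockchain.
        return observedRainfallMm < rainfallThresholdMm ? payoutAmount : 0.0;
    }

    public static void main(String[] args) {
        ParametricPolicy policy = new ParametricPolicy(50.0, 1000.0);
        System.out.println(policy.settle(() -> 32.5)); // prints 1000.0: drought triggered the payout
    }
}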
https://medium.com/@brugusoftwaresolutions/how-can-blockchain-disrupt-the-insurance-industry-2270f27ee4b
['Brugu Software Solutions']
2020-11-20 17:45:05.140000+00:00
['Blockchain In Insurance', 'Blockchain Technology', 'Blockchain Development', 'Blockchain Startup']
255
No Fighting In This (Agile) Dojo with M. David Green
No Fighting In This (Agile) Dojo with M. David Green Episode 46 How can we train teams to consistently produce quality code without negatively impacting productivity? In this episode of Programming Leadership, Marcus and his guest, M. David Green, discuss Agile Dojos and how they can make teams more effective. Dojos provide a six-week training ground where teams focus on recognizing and replicating value by pairing, mobbing, and swarming. Coaches like Green help them to hone their skills and go through rituals more effectively. The results will be more engaged team members, scrum masters, and a way of working that converts skeptics and naysayers into Agile evangelists. Show Notes What is an Agile Dojo? (00:53) Recognizing value (6:09) What kind of work are engineers actually doing in a six-week dojo? (10:05) How David’s dojos differ from others (18:30) Understanding extreme programming (XP) and why it’s valuable (23:41) Engaged scrum masters are essential for long-term change (26:49) Convincing skeptics to try a new system (32:12) Defining “mobbing” and “swarming” (35:00) Why pairing doesn’t negatively impact productivity (38:47) Links Dojo Consortium: https://dojoconsortium.org/ Extreme Programming: http://extremeprogramming.org Hack the Process Podcast: https://www.hacktheprocess.com/ Scrum: Novice to Ninja: https://www.amazon.com/Scrum-Novice-Methods-Powerful-Development/dp/0994346913 Programming Leadership Podcast: http://www.programmingleadership.com Transcript Announcer: Welcome to The Programming Leadership podcast, where we help great coders become skilled leaders, and build happy, high performing software teams. Markus: Welcome to the episode. I am so pleased to have my new friend M. David Green join with me today and we are going to talk about Agile Dojos. How cool does that sound? David, welcome to the program. David: Thank you, Marcus, it’s great to be here. Markus: I was really looking forward to this episode because I recently watched the original Karate Kid and I was thinking about, Daniel-san, and the moves and the wax on, and I realize it’s silly, but I kind of like kung fu movies and stuff, but let’s be serious here for a minute. What is an Agile Dojo? David: Well, you’ll excuse me if I don’t want to be serious, but will be sincere if that’s okay with you. Markus: I think that’s wonderful. [laughing]. David: Sure. So, an Agile Dojo is an opportunity for a team that has an interest in developing their Agile skills to practice those skills under the supervision of a coach who can help them learn, and go through the rituals more effectively, and develop their skill around these things. Markus: Can you give me an example. What kind of skills — if I go to a regular dojo, and one time I attended a karate class, even though it wasn’t a great experience — I remember there was lots of punching and kicking and sweating and some yelling and stuff like that. I’m guessing this is different. So, what kind of skills might one learn in an Agile Dojo? David: You’re setting me up to tell you about all of the punching and kicking and screaming that happens in my dojos. [laughing]. But there isn’t really all that much. So, the concept of a dojo does come from the martial arts, it comes from Aikido. It’s the concept of a place for practice, the word “dojo” just means a place where you can practice. 
And in an Agile Dojo, typically a team comes together, either because there is a specific development skill that they want to practice, such as, for example, test-driven development, or perhaps they want to practice how they start and stop their programming sessions or how they pair program. There are a number of different engineering practices that you can work on in a dojo environment. Markus: So, I'm thinking concretely because you mentioned some great examples. So, let's take one of them, I think you said unit testing, is that right? Your test-driven development skills, if you want to get better, a company could create an Agile Dojo, and that might be a skill that people practice — a place of practice. Is it actually a different physical room? Is that a state of mind? What is the dojo? David: People have had just a lot of discussions about whether or not a room is necessary. I found that a room is actually very useful, although when I started coaching I generally tended not to use a room and I tended to prefer to work with people in the workspace where they naturally work, because the work that they're doing inside the dojo then translates more easily back to the type of work that they normally do. But as I've started working with a set of practices, and in particular, I run a dojo around Extreme Programming, which combines together a number of Agile practices, I find that having a room where a team can isolate themselves from the distractions of the rest of the work floor gives them the opportunity to focus in more on what they're doing, without concern about being pulled away for other projects or about communicating with other people, and lets them close themselves off from different distractions that might come from the floor. The factor of having a room isn't necessarily something that all teams end up with, but I've noticed that at least a third of the teams that I've worked with end up choosing, in the future, to find rooms, reserving conference rooms or whatever (even in companies where they don't normally have their own desk space in a private room) so that the team can continue to work together in that isolation. Markus: Okay, this sounds like — so I'm imagining a company who might be listening. Maybe you're a software manager, and you say, "Man, I've been trying to get my developers to do TDD for years. And it's so spotty, I'm not happy with it, they resisted. Maybe a dojo is right for me." Are there some ways that a company, or a manager, or anybody can start to play or experiment with this idea? David: Well, I think the first question I would ask a manager in that position is why do you want your engineers to do TDD? What is it that you really want to accomplish? Because that's the beginning of the conversation that leads to the possibility of whether or not a dojo should be part of your plan. If you are just trying to implement TDD as part of what your organization does because you read it somewhere on a board, or an expert told you, "If you're not doing TDD, then you're not doing it right," you probably haven't thought through the issues enough to really understand whether or not you want to try to bring people into a dojo and develop these practices. One of the reasons that I encourage people to go into a dojo is because it allows a team to recognize the value of what they're delivering, not just change their practices. If you try to enforce TDD, for example, test-driven development, from the outside, it's not going to stick.
Most engineers know how to write tests, and if you tell them, write the test first, they know how to write the test first. Just because they can demonstrate to you the ability to write a test first doesn't mean that that practice is going to become part of the culture of your company. The advantage of a dojo is it gives a team the opportunity to work with these practices for an extended period of time, often about six weeks. And once they've had a chance to experience, viscerally, the benefit of these practices, they'll tend toward them when they're appropriate, and often teams will tend to use them in preference to the way that they were working before. Markus: So, what are some of those visceral experiences, this recognizing value, I think you said? David: Yes, because basically you've gotten a team of engineers who might be thinking about how quickly they can deliver projects in order to evaluate how effective they are as engineers. And you might even have management looking at the engineers that way, as well. I'm of the opinion, and I think a lot of people in the industry would agree with me, that from an engineering perspective, the focus shouldn't necessarily be on how quickly something is delivered, but rather how much quality is put into the process, how well the product is developed, how easy it is to maintain, how resilient it is, how robust it is, how easy the code is to read and understand. These are things that an engineer is qualified to work with and understand better than anybody else in the company. Whereas you might have people, say in the product organization, who might be better at focusing on the delivery speed and making sure that things meet customer expectations around deadlines and requirements. By delegating the responsibility for delivery off to product and letting the engineers focus on quality, you have the opportunity for the engineers to look at the work that they're doing, not from the perspective of, if I don't get this many things done by the end of the quarter, then I'm going to get a bad grade and a bad rating, but rather they look at, is everything that I'm doing of the highest quality? That is, is it maintainable? Is it not going to blow up in production? Is it going to work effectively? Is it understandable? And that's really what an engineer is best at. Markus: You used a really interesting phrase there; the highest quality. And I have to say I think that most engineers really want to build high-quality products. I don't know of any that would say, "My goal is to build low-quality products," but sometimes the frames and the framing of quality can be different between management, product, and engineering. That is, it can mean different things to different people. Do you have a useful definition that you encourage for the engineers around quality? David: Well, you might have noticed that I was using a couple of metrics that you could include, to evaluate whether or not a software project — we're talking software engineers — whether that project is being done with quality. And the maintainability of the code, the ability for the next engineer who looks at it six months later to read it and understand it, its robustness, the fact that it doesn't fail easily in production, the fact that it has a lower number of bugs, or that it's easier to restart when it does stop, or it's easier to understand and fix when there's something broken, the fact that it's possible to toggle on and off features depending on the needs of a product organization.
There are a number of different ways that engineers can look at their code and say, “This is higher quality and this is lower quality.” That said, as you said, the management might have different criteria for evaluating the value of the software that’s being delivered to the company, and product might have different metrics that they use for evaluating these things. When I talk about quality, I’m talking about the observable quality that an engineer would attribute to the code. Markus: Okay. Do you find that most companies are having these conversations about different perspectives on quality? David: I don’t think that most companies are as enlightened as you are about that point. [laughing]. Markus: Oh, I don’t know about that. [laughing]. [crosstalk 00:09:35] David: [crosstalk 00:09:35] I’m of the opinion that most companies would prefer to use a word that they think everybody understands and just let that word apply universally rather than dig into the semantics of this particular person looks at it from this lens and this particular person looks at it from that lens. That said, I think that most companies would be willing to have that conversation, I just don’t think it’s occurred to everybody. Markus: Hmm. Well, let’s talk about how the dojo might help that. So, in your dojo, it sounds like — do you create some explicit ideas about what quality looks like in the dojo? David: So, what I do in my dojo is I have a conversation with the engineers. I tend to work with engineers who are working in a Scrum model, which means that they tend to work in sprints, usually about a two-week sprint, that’s typical of most of the companies that I’ve worked with, and the way that I introduce the concept of quality and what an engineer might look at in terms of evaluating the work that they’re doing is by having a dialogue with the engineers in which I introduce a number of different facets of code and of coding practices. And I ask the engineers to evaluate themselves, and to look at the work that they’ve been doing and think about whether or not this is objectively the way that they think the work should be done, and to rank themselves and to rate themselves. We have this conversation at the very beginning of a dojo. We have this conversation after the retrospective, after each sprint. And then we have the conversation again after the dojo is completed, to see if the engineers have continued the practices after they’ve graduated. Markus: What kind of differences do you notice in the way they, maybe, rank themselves or the way they see their own work? David: Well, one of the interesting things about rankings is as soon as you put a number in front of something, people are going to try to game it. And engineers, just like anybody else, they tend to try to give themselves the highest score that they possibly can around everything. What I’ve noticed is that the teams that I’ve worked with that have the greatest maturity by the time they finish the dojo are the ones who look at the ratings that they gave themselves the highest score on before, and they rank themselves lower by the end, and they say, we actually learned something about what this means, and now we’ve realized that probably there’s more room for us to grow around this. Markus: Oh, wow. Okay, so you mentioned that when you had started, or it sounded like early on in your work as an Agile Coach, the dojo wasn’t so much of a room as it was you were in with — where the engineers currently were working. 
But now we have sort of the idea of, they're in a room working, this has worked well for you. What are they working on? Are they just working on katas? Which I guess is another martial arts term, right? Or forms? Or are they — like what is the work that they're actually doing during this six-week intensive dojo time? David: [laughing]. I can tell you from my dojos, I prefer to have the teams working on whatever projects they were working on when they started. Wherever they were in their projects and whatever they were working on. I don't like to take them away from the type of work they're familiar with, or even the languages that they're working with. I'm not there to teach them things that they don't already know. I'm there to give them the opportunity to practice skills in the context of what they already know. That said, the concept of an Agile Dojo is usually more broadly applied, and often, it will be focused around building a specific skill. A lot of people who run dojos will do an entire dojo specifically on pairing, and they'll teach pairing practices, and they'll work on the logistics of pairing, and the rhythm of pairing and getting people familiar with that. Some people run dojos around test-driven development, and it's about how do you run a test? How do you get into a cycle where you make sure that you've written a test before you write any code, you make sure that the test has failed, you make the test pass in the simplest way possible, and then, once you've got the test passing in the simplest way possible, you go in and you refactor your code before you start writing your next test, and you do your development around that cycle. Getting people familiar, just building muscle memory around that; that has value and that can be done with either katas or with the code that a team is normally working on. For me, I prefer to work with the code that a team has already got. Whatever projects the team is working on, I bring in the Scrum Master, I bring in the Product Owner, I make sure that everybody's on board with it, and that way we are building these skills in the context of something they can take away with them, so that they can continue in that way once they've left the dojo. Markus: Now you brought up these other two roles. I'm curious, do they have a role to play, the Product Owner and the Scrum Master? Are they a part of the dojo, or kind of just incidental? David: So, I'm working in a Scrum and XP model, and Scrum and XP have some similarities, but most of the engineering teams that I've worked with would say that they are following Scrum. Scrum uses the concept of a development team of a couple of pizzas' worth of engineers, enough engineers that it would take to eat two pizzas for lunch, plus one Scrum Master and one Product Owner. The Scrum Master is responsible for keeping the ceremonies going effectively for Scrum, and the Product Owner is responsible for representing the customer and helping to defend the team from work that would take them away from work that would support that particular customer. In my dojos, I look at those roles, the developers, the Scrum Master and the Product Owner, as all of the elements of a team, and I'm coaching the whole team at once. Markus: Okay, so I'm now — I'm stammering around because I'm — my mind is sort of swirling.
I’m imagining a TDD dojo, or an XP dojo — I guess, with an XP dojo, it does make sense that the Scrum Mast — the other roles are in there, but when I think about a pairing dojo, would that involve other kinds of roles, like non-coding roles? David: So, I don’t do targeted dojos around one specific practice the way that I’ve described. As I said, other people do have katas around TDD, and they might spend, for example, two hours a day for five days a week specifically focused on that, and that would be what they would call their dojo, or they might pull the engineers off completely, and work exclusively for several weeks on certain practices. I work on Extreme Programming, which would involve bringing all of those practices together, but working with the entire team on whatever work they’re normally doing. Markus: Okay, got it. All right, so it sounds like that even the term dojo has a lot of flavors. How did you get into this kind of work? David: So, I was a developer myself, and I was a senior front end engineer, working at companies — I was working from Apple Computer, I was working at a bunch of startups in SoMa, but I also have an MBA in Organizational Behavior. So, I had a different perspective on things. I came to engineering rather late in my career. And as an engineer, having had the experience of working before that in other roles, I saw that the engineers I was working with were really suffering. I saw a lot of pain in the process in terms of how communication happens, how requirements are explained, and set, established, how deadlines are met and how they’re handled, a lot of things that made me uncomfortable with how engineers were experiencing their work, and I wanted to help with that. And I recognized Agile — probably about 10 years ago in my career — as something that promised to help with that, if it were properly implemented. And I was working at a couple of companies that said they were following an Agile process, and that gave us classes that taught us what Agile was supposed to represent, then I didn’t actually see that in practice. I wanted the opportunity to help with that. And I made a transition in my own career. Went from being a platform developer to being a Scrum Master. I rather quickly went from being a Scrum Master to being a Program Manager, organizing Scrum Masters around the company that I happen to working at at the time. And I fell in love with it. I saw so much benefit to the people that I was working with. They brightened up, the work improved, the quality of what they were delivering improved, the communication improved, and I thought this is really what I want to be doing. I want to be helping people get more of this in their lives. So, I recognized that, in the time that I was working, and the term for that was Agile Coach, and I started transforming myself into an Agile Coach, started working with small companies, working with independent clients. Right now, I’m working with enterprise clients. It’s been a really wonderful ride. Markus: So, if somebody’s listening, and they are really intrigued about how they might use an Agile Dojo where they work to build skill, to build collaboration, it sounds like maybe if they’re XP or Scrum, maybe the whole team can be in there and have something to learn. How would you suggest someone start? David: Well, there are a number of coaches out there who work in the dojo model. 
I’m actually part of a consortium, called the Dojo Consortium, as a matter of fact, which people can go visit and find out about some recommended practices. Companies such as Target and Verizon and Delta and Walmart, there are some really big companies out there which have adopted the dojo model for their engineers and had great successes with it. I recommend people go to dojoconsortium.org for a quick look at how these different dojos are approached. Markus: Nice. Okay, well, I want to dive into a couple of specific questions about dojos. And I understand that you do dojos differently than some other people do. David: Mm-hm. Markus: So, I’m curious, when I went to a dojo or a place where people were learning a martial art, there were some rules, and they had the rules on the wall, and they said take off your shoes, and you have to treat people in a certain way. What kind of rules, or guides, whatever you want to call them, might you like to see in a dojo? David: Well, one of the things I tell people when they’re entering the dojo is, there’s this concept also from Aikido called Shuhari, which is about a progression of learning and practice. And the concept is that there are three stages of learning in a dojo, the Shu stage, where you are simply adopting traditional forms and following those traditional forms as closely as possible to get familiar with them, the Ha stage, where you have incorporated those forms, and you’re starting to practice them and work with them yourself independent of a coach, and then the Ri stage, where you might be able to transcend those forms and move beyond them. The entire dojo is handled in the Shu phase of the Shuhari. So, I come up with a set of practices that I encourage people to follow, and among them, I like the team to establish what their core hours are when they’re going to be available to work with each other. That’s important because I like the team to pair and mob on all of the work that they do, rather than work independently. No silos, no individuals working on a piece of code. If one person is working on something, everybody on the team should be aware of what it is and should ideally be able to step in and help with it. I like the team to be doing test-driven development around all of the code that they’re building so that they know how to do that test first process. I like them to use acceptance test-driven development, where the Product Owner can define the behavior of the end product, of the software as it’s supposed to be delivered, and then based on those definitions, the team can write acceptance tests, which also can fail and then be made to pass through the process of development. I have the team sitting together in a room. That’s usually unusual for them. That’s not something that they’re familiar with, and often I find the first thing people say is “Why do we have to be locked in this room all day?” But by the time the dojo is over, I’ve yet to find a team that didn’t want to adopt the room and stay in there forever. And I sit with them. I sit with the developers for the whole time that they’re doing their development work, and I coach them as they go, and I try to be rather lightweight about it. But I’ll remind them if they start working on code without having first written a test, “Did you write a test for that first?” Just like asking the question, and not necessarily forcing them to do anything that they wouldn’t do, but if they recognize that there’s value to what I’m suggesting, they’ll pick it up and they’ll start doing it. 
Markus: That’s a nice, gentle nudge towards what you’re looking for. So, I’m curious, have you found anyone who is using Agile, these Agile Dojos in a fully distributed team? David: So, before I started doing dojos, myself, I actually was coaching teams that were fully distributed at one company. I worked at a company, it was a small startup, and they had no physical location at all. Everybody was spread all around the world. And for example, there was one team that had people from Portland to Pakistan. And — Markus: Wow. David: — it was challenging, for example, getting them to work around the concept of having a daily standup because getting a schedule that works for everybody was interesting. But finding a time when the teams could work together effectively was one of the big challenges there. We ended up encouraging a lot of pairing between the people who were in shared time zones. But remote pairing is definitely not a problem, as long as people are able to align their time effectively. I like working with distributed teams a lot because working in a distributed environment forces you to have the kind of transparency and documentation that keeps everybody on the same page at the same time. Markus: Hmm. I like that answer. David, we’ve been using some terms like Extreme Programming, also known as XP or Scrum. And I know that for myself, I’ve been assuming that audience members just know what that means, as well as Agile, but you seem to have a particular affinity for this thing called Extreme Programming, which sounds pretty extreme. What is it? And why, in your mind, is it valuable? David: Absolutely. Extreme Programming is something I’ve seen work for so many different teams. And one of the things that I noticed when I started getting into Agile, I was going into it because I was looking for ways to reduce the suffering and increase the joy that I saw in the teams. The teams that seemed to have the most joy were the ones that were following an Extreme Programming practice. And when I first heard of it, I wasn’t exactly sure what it was, and I had a lot of fantasies about what it might be, but it’s a fairly simple concept. With Extreme Programming. There’s a set of fundamental practices, and I’ve mentioned a few of them along the way. Teams that are doing Extreme Programming pair or mob on all of the work that they’re doing. They do test-driven development for all of the coding that they’re doing. They use acceptance test-driven development to make sure that the stories that they’re working on from the Product Owner deliver real value and are framed around real customer needs. And another factor of Extreme Programming that I think Extreme Programming inherits from Agile is this concept of reflection, where you work in short iterations, and get the opportunity after each short iteration, to look at what you’ve done, and look at the way you’ve done it, and reflect on your process, and then improve the way that you might do things the next time. Markus: So, this reflection thing. I’m keen on it because I’ve been thinking lately about how to build learning teams. Is reflection the same as just a retro? Everybody talks about doing a retro, but I don’t know if XP reflection is just their version of a retrospective. David: I believe that it is, and that’s the way that I approach it. And in fact, because the teams that I work with tend to use a Scrum model as opposed to an Extreme Programming model when they start I tend to build the Extreme Programming practices inside of Scrum. 
I’m not a purist around this. So, if the team already has a set of ceremonies that they follow, and they have a person they call a Scrum Master and the person they call a Product Owner, that’s fine with me. That’s different from what Extreme Programming would require, which is a person called an XP Coach, and another person called a Customer with slightly different names for some of the ceremonies. For me, I don’t see that the differences are all that important. So, yes, the retrospective, from a Scrum perspective, with the important caveat that a lot of teams may not be doing their retrospectives or maybe passing them off too quickly, or may not be taking action items out of them. And I like to make sure that the teams I work with recognize the value of that reflection so that they can benefit and improve. Markus: I’m starting to get kind of a picture here. You don’t have them do katas or forms. These people come into the place; they say we’re going to spend six weeks together. We’re going to be in a different kind of container where we’re kind of locked away. And we’re going to work together. But we’re going to work on what we were already working on. So, the product and the codebase and the tooling is very familiar, but now we’re also going to try and be more intentional about how we write, and who we write with, and how we know it’s correct, but also how we build in learning. And I think that last bit kind of lights me up because I do talk to so many companies and developers that tell me, “Oh, yeah, we have retros. Nothing really ever changes.” But in the dojo, how do you spur that on so that you create some change? David: So, the dojo works best when there is an engaged Scrum Master, and this is really one of the places where a strong Scrum Master has the opportunity to shine. When I start a dojo with a team, sometimes a Scrum Master might think, “Are you criticizing the way that I’ve been doing things? Do you think that I haven’t been doing things right?” In fact, the Scrum Master is essential element to an effective dojo, and I consider the Scrum Master to be the coach that the team gets to keep when they graduate from the dojo. So, I empower that Scrum Master, and I offer techniques and tools and tips and opportunities. Usually, it’s a very collaborative process, and by bringing the Scrum Master into the process and making sure that the Scrum Master’s vote is part of every vote, and their ranking is a part of every ranking, it cements that person as part of the team and not just somebody off to the side who forces us to do ceremonies every now and then, whom we can otherwise ignore. Markus: Yeah, I think I could imagine that the Scrum Master might feel like, “Well, this is happening because I wasn’t doing my job right.” Or, David’s here, “It must mean something’s wrong because we’ve been sent to the dojo.” I don’t know if people get sent to the dojo or if they sign up for it, but at some point they’re there. And I’m curious how other roles might react to being in this environment? David: Well, it’s a funny thing, because the very first team that went through the dojo in my current engagements was very enthusiastic about going through, and that set the tone for the rest of the company honestly. Right now, I’ve got a waiting list of teams that are waiting to go through. They’re enthusiastic and anxious about the opportunity. And it’s not because they think that they’re going to be evaluated. 
It’s because they see that the teams that have gone through it before have gotten so much benefit out of it. There’s a lot of evangelism that comes from this process. And the developers evangelize it among the developers. The Scrum masters evangelize it among the Scrum Masters and the Product Owners evangelize it as well. The manager is evangelizing it because they’re seeing the benefit to the teams and their ability to work and deliver high-quality code. Markus: So, individuals, it sounds like, come out after six weeks, changed. How do team dynamics change? Or maybe, how have you seen them change in that six weeks? David: Yeah, it’s funny, because I don’t think so much about the individuals changing. I do coach the individuals as I see people who might have specific problems, but my focus is always on the team dynamics. It’s always about how the team works together as a whole. That’s one of the reasons why pairing and mobbing is so elemental what I’m doing. If everybody on the team isn’t fully engaged in the process all the time, then the team isn’t really working together effectively. That’s one of the reasons why I encourage teams to establish core hours that are maybe four or five hours a day at most, because it’s very intensive work when you’re pairing and mobbing with people, and you have to be engaged constantly in what’s going on. You need some downtime. You need time to answer your emails. You need time to take training courses, you need time to do personal development, but you also just cannot work that many hours a day doing that kind of thing. Markus: I remember, I think it was about 2004, I picked up this wonderful little white book called Extreme Programming Explained, maybe or something like that it was Mr. Kent Beck — David: By Ken Beck, yes. Markus: Yeah, Mr. Kent Beck. And I remember getting so excited. So, excited. I read it on a business trip on a plane, I thought it was going to be the miracle that saved the company I was at — not saved it, but like — I love the — I want to go back to something you said. It might bring more joy because there was a lot of suffering, so much suffering. And I went to the developers, and I said, “Hey, what if you guys pair programmed everything together? Like, this book says it’s going to be great.” And they went, “No way.” And then I said, “Okay, well,” I went to my manager, and I said, “What if people pair programmed? It’s gonna be great.” And he said, “No way.” So, I found resistance on all sides. And, frankly, I just kept being told, “This is a really dumb idea.” And I’m curious because, David, clearly you are a better evangelist than I was, I could get no traction with these fine people who were suffering. But, how do you start to convince both the upper management — or any management who looks at people as, like, “Won’t that half our productivity if they’re working in pairs?” And the developers who say, “No, this is individual, I need to work with my headphones on.” How do you start to change both of those mindsets? David: They’re very different discussions, honestly, but they do both come down to the same concept, which is this concept of quality that I mentioned early on. 
Once management has their head around the fact that engineers are not code monkeys trying to churn out code as fast as possible, which results in a lot of technical debt and a lot of failures down the road, and once engineers understand that their focus is on the art and the craft of what they're doing — and they want to create something beautiful, and something that works, and something that's understandable, and something they can be proud of — you can convince people to try something different because what they've been doing in the past has always resulted in the same bad experiences and the suffering — Markus: The suffering, yeah. [laughing]. David: — the unpleasantness. [laughing]. If you give them the opportunity to try something new and say this is going to be just for a few weeks, we're going to try this. Give it a try, see what you think of it. And then let them evangelize it themselves, if they liked what they got. Markus: Do most of the teams that come out of the dojo continue with the pairing practice? David: Most of them continue with it partially. Very few continue with it fully. And some teams adopt one or the other, either pairing or mobbing, and they stick with that. I've had more teams actually stick with mobbing consistently than with pairing, which was a surprise to me because one of the first things that happens when the team does the dojo is they say, "Okay, we had five engineers, that meant we could work on five stories at once, so we got a lot of work done. Now you're telling me that, first of all, we're going to pair, which means we can only work on two stories at a time because we have two people in one pair and maybe three people in the other pair. Or maybe we're only going to work on one story at a time because we're going to mob all the time. How are we going to get all of our work done?" Once they get the experience of noticing, oh, with everybody's brain engaged at the same time, and with the ability to rely on this person who was otherwise working on this thing that nobody knew about, and to bring that person's brainpower into the process, and get everybody engaged, we're getting such better quality in what we're producing. We're improving the quality of the work that we're doing, so we can work faster and we can work more effectively together and more efficiently. They start to notice the benefit of this and, as I say, I've had more teams stick with mobbing even though it has a lower work-in-progress limit, because they just find it so gratifying to work together that way. Markus: Do a quick definition for me. What is mobbing? I hear mobbing and swarming as things that get talked about, so, tell us what mobbing is. David: Sure. So, you've got the basic concept of pairing to start with, pairing being two engineers working together with one screen and one keyboard, and they're both looking at the same code. One person is driving, that is the person at the keyboard, the other person is navigating, instructing that person on what to do, keeping track of things while the first person is typing. Markus: Two people, one computer. This is fundamental. Not two computers side by side. One machine, two humans. David: One machine. Yeah, my teams will tease me because one of the things that I encourage them to do is to shut their laptops if they're not the one who's actually driving. And I will play games with that, but it's important because if you've got somebody sitting across the table from you, even if that laptop is off, you don't know that that laptop's off.
It looks like it’s open; it looks like it’s a distraction. Markus: Okay. David: And as soon as that laptop opens up, you’re going to be checking your email, you’re going to be getting instant messages, all sorts of things are going to come up. Very distracting. But yes, fundamentally, two people one computer. A driver on the computer, navigator looking at the screen and giving feedback on the process. Mobbing expands that so that you have one person on the keyboard, and then, perhaps, the entire team all looking at the screen at the same time, giving feedback, giving context. If there’s something that somebody doesn’t know, or that anybody doesn’t know, they can look it up on Google, right there on the shared screen together. It’s not like somebody has to open up their laptop, and look it up separately, and then report back. If there’s a resource or a piece of information that people cannot get on that one shared screen, that’s a liability to the team because if that person who had to look that up weren’t there, there’d be no other way to find that information. If it’s stuck in notes that that person wrote to herself on her own machine somewhere, and that person won the lottery and left the company, there’d be no way to get that information otherwise, and so that’s why would you encourage everybody to be looking at one screen together. So, the mobbing concept basically takes pairing and expands it out to the entire team. And you mentioned another term: swarming. I use the term swarming to mean what you might see when you have a roomful of engineers, all with their laptops, open all working on something independent, but working together on the same concept. Sometimes that’s useful in a programming environment. Sometimes, we want everybody to go off and research something for 15 minutes and then come back with a result. The key thing in that is that 15 minutes, so it doesn’t become the entire day. It just becomes, we have a specific objective and we’re going to swarm for the next 15 minutes in order to find this information. Everybody, open up your laptops. Everybody, go look out, look around, do research things. What you’ll find is people are often looking in the same place, they’re often looking at the same thing, they don’t know the person next to them is redundantly looking at the exact same thing, probably could have saved time by everybody looking together at the same screen, but they prefer to do it that way. It can be effective, as long as there’s a time box, it’s okay. So, swarming as something that the team can drop into occasionally, mobbing as a general practice, and pairing is the default. Ideally, a team should pair by default, which gives them the maximum benefit because they have the greatest number of stories in progress with two people working together at a time, but they have the benefit of having that shared information where it isn’t just in one person’s head. Markus: How much — okay, I know what you’re going to say, I feel, but I’m going to ask it anyway because I’m imagining I’m a listener, and it was me so many years ago, and I heard about this and I might ask, “Well doesn’t productivity drop a whole lot? Statistically, can’t we measure that two people are just faster on two stories? And isn’t that the fundamental parallel processing idea? How does pairing effect, quote-unquote, “productivity?” David: So, these practices actually do not impact productivity negatively. 
Because while you do have fewer channels of development happening simultaneously, each channel of development is much more robust and produces much higher quality code, so, there is less back-work, there’s less failure in the codebase. It’s easier to understand the code because it’s been written in such a way that more than one person can look at it and understand it. It’s been written to meet coding standards that the team has established mutually because they have to work together, so everybody has to be able to look at the code simultaneously. As a result, what’s actually produced and pushed out into production is much more stable and much stronger. So, the fact that there aren’t as many channels of development at the same time results in an equal amount of productivity, but with higher quality. Markus: Okay, so we’re not going to see half the number of features, or stories, or whatever we’re counting, in theory. But, now when we think about — David: But we might, honestly — Markus: Oh? David: — because if what we’ve been looking for is the concept of, let’s push the code out into production as quickly as possible, no matter how bad it is, and then work on the next feature, and then work on the next feature, and never fix the features that we’ve been putting out there, and building up technical debt at a rate like that? Yeah, we might see fewer features that are built like that going out into production. Markus: I feel like you did a thing there where you redefined productivity. [laughing]. Well, let’s talk about mobbing then. I have questions, so many questions. So, the first one is, is there an effective limit of how many people can mob together? David: So, I do like the concept of the two-pizza team: as many people as could eat two pizzas at a given lunch. I like a team that has ideally an even number of engineers, maybe between five and nine somewhere in there. But that seems to be about the sweet spot. Eight and nine is getting a little bit high for that, where the team might feel that it’s hard for everybody to stay fully engaged. The team might break into two mobs at that point. Markus: I was actually gonna ask. I’m imagining six people swarming around — mobbing, let me get the words right — mobbing around one machine. And I’m imagining that it might be awfully tempting or I’m trying to think about how to say this, but just the idea that not everybody may be engaged. People are looking at their phone, they’re feeling like, “Well, I’m not in front of the keyboard, or it’s hard to see the screen so I’m not really a part of this.” Does that happen, and is mobbing a skill that one must learn over time? David: It is a skill. But it’s not one that takes a long time to learn. It does take a team’s engagement, though. It forces the team to stay more fully engaged and to have permission to keep other people engaged in the process. There’s a concept that sometimes comes up early on when I’m working with a team, where somebody will turn into what I would call a Tesla, which is a self-driving driver, sitting there at the keyboard, not actually waiting for anybody to navigate and give feedback about the work that’s going on, but rather, sitting there at the keyboard and also coming up with what’s going to be coded and doing the coding at the same time. Basically, somebody’s working alone, but on a big screen in front of a team full of people who are just watching. 
That is a practice that I try to discourage, and I do that by actively encouraging people to speak anytime that I hear silence in the dojo. I want there to be a constant chatter, a constant discussion, and I like to make sure that every voice has the opportunity to be heard. Getting that practice over the course of a six-week dojo, I hope, encourages the team to carry that forward after the dojo completes, and it's up to each team to figure out how they're going to establish their norms so that people don't feel left out. Markus: That's really cool. Is there a particular time when you encourage teams to move from pairing to mobbing? A particular type of problem, or is it just like, "Oh, it's Thursday so we'll mob." David: I like the teams to have experience with both modalities so that they can choose based on the specific type of project that they're working on, whether or not this is something we'd like to pair on or something we'd like to mob on, and generally, teams will choose to mob on things that everybody on the team wants to learn about, and that everybody wants to be fully engaged in the process. They don't want anybody else to feel left behind, so they'll want everybody to be mobbing on that. And as I say, there are teams that choose to mob 100 percent of the time, which is just fine. It's completely up to the team based on the way that they work. Markus: Does management have a view, when they come out of the dojo, and they're now five, six, eight people in front of one screen? Do they have a perception of or a feeling about this huddle and productivity or other things? David: Management tends to be very supportive of this because it increases happiness on the teams, it increases engagement, it reduces the other problems that can come up for management, and it brings these things to the surface, so they have the opportunity to be discussed right away. Markus: So, this has been just fantastic for me, I've learned a ton. So, a quick recap. So, if you're listening and you're interested in the concept of a dojo, David, remind us of the URL people go to to find out about the group that does and runs these dojos. David: Yes, I'm part of a consortium called the Dojo Consortium, and their URL is dojoconsortium.org. Markus: Great, okay, so there you can probably find more information and get in touch with these Agile Coaches that do this brilliant kind of work. And if you're not playing — maybe this is my last question, David, if people aren't yet playing with XP concepts, especially with how multiple humans write software together, are there some resources you'd like to recommend for considering how pairing or mobbing might fit into the organization? David: Well, you mentioned Kent Beck's book. And there's a series of books that he's involved with that can all help with that. But when we want to direct people right away, if they just want an overview, and they want to start applying these things right away, I direct them to extremeprogramming.org, which is a website. It's been around for years, and it encapsulates some of the basic concepts. It's an opportunity to learn about how Extreme Programming was conceptualized, how it works, and it's got the fundamentals there. You can really get started with just that. Markus: Cool. David, where can people find you online, engage with you, and with your work? David: Sure. Well, I'm M. David Green pretty much everywhere, so all of the social media. I have a podcast called Hack the Process, which you can also look at.
I talk to people about various techniques that they can use to improve their productivity and to move mindfully past having an idea in your head to having something out there in the world that you’re proud of. And if you’re interested in my book on Scrum, I just wrote a book on Scrum called Scrum: Novice to Ninja, which you can also find out there. Markus: Wonderful. Thank you so much for being on the show. David: Thank you for having me. [laughing]. Announcer: Thank you for listening to Programming Leadership. You can keep up with the latest on the podcast at www.programmingleadership.com and on iTunes, Spotify, Google Play, or wherever fine podcasts are distributed. Thanks again for listening, and we’ll see you next time. The post No Fighting In This (Agile) Dojo with M. David Green appeared first on Marcus Blankenship.
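For readers who want to see the red-green-refactor cycle described in this episode in code, here is a minimal Python sketch; the example, function name, and thresholds are illustrative, not from the show:

```python
import unittest

# Red: write the tests first and watch them fail,
# because shipping_cost does not exist yet.
class TestShippingCost(unittest.TestCase):
    def test_free_shipping_over_threshold(self):
        self.assertEqual(shipping_cost(order_total=120.0), 0.0)

    def test_flat_rate_under_threshold(self):
        self.assertEqual(shipping_cost(order_total=30.0), 5.0)

# Green: the simplest implementation that makes both tests pass.
def shipping_cost(order_total: float) -> float:
    return 0.0 if order_total >= 100.0 else 5.0

# Refactor: with the tests green, restructure freely and rerun them,
# then start the cycle again with the next failing test.
if __name__ == "__main__":
    unittest.main()
```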
https://medium.com/programming-leadership/no-fighting-in-this-agile-dojo-with-m-david-green-73a5018d8c65
['Marcus Blankenship']
2020-06-11 07:56:59.056000+00:00
['Management', 'Leadership', 'Software Development', 'Startup', 'Technology']
256
Will EIP-1559 be a Solution to Ethereum’ High Gas Fee Issue?
It is clear that continuously congested blocks account for the all-time-high gas prices. According to a report by Coin Metrics, an analytics provider that delved into the world of Ethereum transaction fees, fees remain at their highest-ever levels, and even a highly anticipated upcoming network upgrade is unlikely to ease the situation. Median fees on Ethereum have been consistently over $10 for most of 2021. By comparison, the average Ethereum transaction fee reached just $5.70 at the height of the 2017/2018 bull market. The report attributed some of this increase to the rise in ETH prices themselves, which makes gas more expensive in dollar terms. Since the beginning of 2021, ETH has surged 125% to current prices despite a 19% correction from its all-time high of $2,050. Over the same period, however, the median gas price has increased by 532%. Naturally, different types of transactions require different amounts of gas: a simple ERC-20 token transfer uses much less gas than a complex smart-contract operation for an automated market maker, for example. However, the report noted that DeFi itself is not the cause of the high gas fees; the cause is simply more transactions in general. Since January 2020, the amount of gas needed per transaction has trended downwards, which shows that increased transaction complexity is not responsible for high transaction fees. Ethereum transaction slots are currently auctioned: those paying a higher gas price take miner priority and get faster transactions than those that set a lower one. The report noted that the current high fees exist because blocks are consistently full (around 95%) and have been since mid-2020 and the DeFi boom. In March 2021, Ethereum blocks have been 97%-98% full, according to the research data. Miners must choose which transactions to include when mining new blocks, and each block can only include a limited number of transactions (on average 160 to 200) due to the maximum block size. It is therefore too soon to conclude whether the long-awaited EIP-1559 network upgrade, which is designed to change the auction mechanism and burn part of the fees, can solve the problem of high gas costs; it seems that only scaling solutions will be the true long-term fix. At best, the upgrade will make fees more predictable, since the root of high fees is the scalability problem.
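As a rough worked example (the numbers are illustrative, not from the Coin Metrics report), a transaction's cost is simply the gas it uses times the gas price, so the same transfer gets more expensive in dollar terms as either gas prices or the ETH price rise:

```python
GWEI_PER_ETH = 1_000_000_000  # 1 ETH = 10^9 gwei

def tx_fee_usd(gas_used: int, gas_price_gwei: float, eth_price_usd: float) -> float:
    """Fee in USD = gas used * gas price (converted to ETH) * ETH/USD rate."""
    fee_eth = gas_used * gas_price_gwei / GWEI_PER_ETH
    return fee_eth * eth_price_usd

# A plain ETH transfer always uses 21,000 gas; complex DeFi calls use far more.
print(tx_fee_usd(21_000, gas_price_gwei=50, eth_price_usd=1800))    # ~1.89
print(tx_fee_usd(21_000, gas_price_gwei=200, eth_price_usd=1800))   # ~7.56
print(tx_fee_usd(150_000, gas_price_gwei=200, eth_price_usd=1800))  # ~54.0
```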
https://medium.com/@biboxexchange/will-eip-1559-be-a-solution-to-ethereum-high-gas-fee-issue-ca107785c41b
['Bibox Exchange']
2021-04-01 03:40:08.633000+00:00
['Ethereum Blockchain', 'Blockchain Technology']
257
This Big Chill — we wish you to know that success will come with time
This Big Chill — we wish you to know that success will come with time ONEROOT Jan 20, 2019·1 min read While the Winter Solstice occurs when one of the Earth's poles has its maximum tilt away from the Sun, it isn't the coldest day of the year in China; today, the Big Chill, is. The Big Chill is the 24th solar term. A solar term is any one of the 24 points in traditional Chinese lunisolar calendars that matches a particular astronomical event or signifies some natural phenomenon. The gap between the solstice and the coldest day is a hysteresis effect. Some people can't wait to see results and switch to another track in a hurry, ignoring that success may simply arrive late. Today's success often comes from yesterday's efforts. On January 20, 2019, this Big Chill, we wish you to know that success will come with time. This Big Chill ________________ English: Telegram-en/ONEROOT network/R1 protocol/Reddit/Twitter/Medium/Linkedin/Facebook/Github Instagram: @onerootnetwork Youtube: @OneRoot Project Korean: Naver blog/Kakao Chinese: Telegram-cn/Weibo Official QQ group: 6644849 Official WeChat account: oneroot_admin
https://medium.com/@oneroot/this-big-chill-we-wish-you-know-that-success-will-come-with-time-27a05117ed80
[]
2019-01-30 10:28:30.143000+00:00
['Smart Contract Blockchain', 'Oneroot Project', 'Ethereum Blockchain', 'Blockchain Technology']
258
Barkis Network 2.18–3.03 Biweekly Report
Thanks for your understanding and companionship all along. Barkis Network aims to build a distributed business application value network based on blockchain technology. We are convinced that the Barkis community will develop better with your participation. Here is the Barkis Network progress between 2.18 and 3.03:

Progress in public chain development
1. Upgraded the Barkis mainnet to support issuing custom tokens.
2. Performance optimization.

Progress in wallet development
1. Android wallet upgrade: together with the public chain upgrade, the wallet now supports viewing and transferring custom tokens.
2. The Android wallet now supports issuing custom tokens.

Blockchain browser update
1. Upgraded the blockchain browser to support the issuance and browsing of custom tokens.
2. Bug fixes.

Progress in MasterNodes Plan development
1. Delegated (voting) functionality for MasterNodes was released.

Barkis Network will continue to be a responsible project as always and create more value for users and the industry in the future. The future has come, so let us shout for the Barkis Network!
https://medium.com/@barkis-network/barkis-network-2-18-3-03-biweekly-report-467b86fc220e
['Barkis Network']
2020-03-03 14:28:33.016000+00:00
['Public Chain', 'Network', 'Blockchain', 'Blockchain Development', 'Blockchain Technology']
259
Performance optimization during legacy system migration
Sometimes even a successful business should revise its approaches to keep up with modern demands and solutions. Some issues appear in management, others in recruitment, but the most crucial ones always concern the technologies you use. Technology systems are critical to success: without proper system migration when updating your capabilities, it is impossible to meet the needs of customers, management, and other stakeholders. How do you make your company profitable again? How do you avoid poor product delivery? And how do you improve performance? Ask UppLabs about legacy product migration.

WHY DO YOU NEED A LEGACY MIGRATION?

There are several reasons for migrating from old legacy systems. Of course, when starting this process, you need to prepare for a multi-level project, but by hiring a good migration solutions architect, you can make the process as smooth as possible. Some possible reasons to rebuild legacy software:

- Outdated software costs too much to maintain.
- Outdated software is challenging to update.
- Feature inclusions have changed the original vision of the product.
- The company's rates are decreasing as potential clients look for more modern solutions.
- The original build doesn't have enough accessibility, resulting in software that alienates users and exposes the company to litigation.
- UX patterns have evolved past what the older software supports.
- Mobile compatibility may not have been included when the original code was written.
- New markets affect current users, who are getting tired of slow application updates.

THE PROCESS OF MIGRATION

Data migration is a complex process that often requires a separate project, approach, plan, budget, and team. It frequently involves time-consuming tasks that may be invisible at the beginning of the project. The main steps of a legacy system migration are:

1. Discovery and analytics: identifying the data, format, location, and security measures
2. Creating a detailed plan: determining time, technical, and financial requirements
3. Backing up all your data in case of any failures
4. Executing your data migration plan:
– database changes;
– code changes;
– connection changes;
5. Testing the system after each phase of your migration
6. Performing check-ups and ongoing maintenance of your data migration plan

CHALLENGES OF DEVELOPING LEGACY SYSTEM MIGRATION

There are several issues that you can face when you start the legacy system migration process. The list below includes some of the most common challenges; depending on your organization's goals, industry, and the specifics of your legacy system, you may experience some of these problems:

1. Getting the needed technical expertise quickly

Finding the right team is one of the most critical tasks; the success of the whole project usually depends on this choice. Among the essential advantages of a dedicated technical team are:

- The work of the entire team is easier to synchronize;
- The workflow within the whole team is more precise and transparent;
- Mutual cooperation creates shared responsibility;
- Development capacity can be easily scaled;
- It's a fast way to form a team: all specialists are ready, and processes are in place.

2. Meeting the deadlines of the business roadmap and timeline

Successful data migration is based on the exact details defined in the migration roadmap. It is a plan describing what to do and when, and what to do in case of mistakes or problems.
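As a hedged sketch of what such a plan can look like in executable form (the phase names and checks are hypothetical, not UppLabs' actual process), each phase is backed up, executed, and verified before the next one starts:

```python
import logging

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")
log = logging.getLogger("migration")

def noop():          # stand-in for a real backup/migration step
    pass

def ok() -> bool:    # stand-in for a real verification check
    return True

def run_phase(name, backup, execute, verify):
    """Back up, execute, and verify one phase; abort the run on failure."""
    log.info("phase %s: backing up", name)
    backup()
    log.info("phase %s: executing", name)
    execute()
    if not verify():
        raise RuntimeError(f"phase {name!r} failed verification; restore the backup")
    log.info("phase %s: verified", name)

# The phases mirror the plan above: database changes, code changes, connections.
for phase in [("database changes", noop, noop, ok),
              ("code changes", noop, noop, ok),
              ("connection changes", noop, noop, ok)]:
    run_phase(*phase)
```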
A well-thought-out and accurate plan requires an additional time investment, superior knowledge, and experience. The roadmap determines the method applied to import and export data, rebuild the network, and prepare resources, along with a detailed timeline to speed up the project's development and launch. 3. Quick scaling up and scaling down of the development team and resources Any business always has to inspect all possible changes and prepare a strategy that can be adopted by allocating and scaling the development team, budget, resources, and time. Here it's better to prepare best practices and distribute them among all members or groups. The more you work to go through the changes, the more adaptable your business will become. 4. Delivering the solution on budget The migration process can be expensive in terms of both time and money. The software itself can often cost a lot, and transferring it to a new system often brings more payments that can sometimes overlap. Delivering the solution you need while still staying on budget can sometimes be a challenge for stakeholders. 5. Delivering a quality product Regardless of the business, many aspects need to be considered when migrating data, such as regulatory impact and all business processes. This process has to involve the company's various stakeholders and sponsors, as well as the project team members. That's how everyone can understand their role in achieving the result. It is critical for any business. 6. Cost-effectiveness of the project The following is a checklist of questions to ask the project's estimator regarding cost-effectiveness: What solutions does the migration include? Is the target platform database compatible with the source database? What are the size and complexity of the data, and hence the amount of time and resources needed? What efforts does the physical transfer of data require? Are resource requirements stable? Is the migration team experienced with the technologies available? 7. Performance Performing and testing the entire process at once can result in low organizational efficiency and can stall and affect your entire business. Optimizing the performance and testing all the stages can take longer but lets you assess the vital business processes in the meantime. Performance involves many issues itself, which is why we'll delve into it in more detail below. PERFORMANCE OPTIMIZATION Depending on the type of product, performance is based on several indicators, such as: Number of users who use the product simultaneously For example, what is the user experience? What is the product's response time? Is there a lag? How many requests arrive at the same time? Does the number of requests spike at a given time? Why? Are user sessions getting longer or shorter? In one of our recent projects, UppLabs had to fix a legacy solution that could not support more than 2,000 real-time users, which was affecting the core business goals. Our team had to change the app's architecture and gradually migrate it from a monolith to microservices. As a result, in two months, UppLabs managed to increase the solution's performance threefold. Data in the database Data plays a significant role in migration. It becomes integrated into the business and changes over time, creating interdependencies between datasets, which complicates their transfer to a new environment.
It is crucial to research the data, applications, and system components used, including their complexity and interdependencies, to improve data management and application mobility. Thus, the level of data integration is a critical part of the migration process. Safety Data protection is a must; all data must be migrated safely to avoid potential data losses. It's essential to make sure all data are protected and can be extracted securely. The old and new formats should be compatible. That's the stage where the team has to take the time to test and review all the data carefully. Duration How long does each phase of the migration take? How long does it take to test the whole process and each step? Does the duration of each phase match expectations? If not, why, and what actions need to be taken? Application performance This indicator can show exactly whether your migration was successful. It is usually measured through: – error rates (failed requests/total requests), – speed, – application availability, – latency, – number of time-outs, and – throughput. PERFORMANCE INDICATORS FOR FINTECH, HEALTHCARE AND REAL ESTATE DOMAINS If we divide product types into three primary industries (fintech, healthcare, and real estate), we can select key performance indicators for each. One thing is certain: if your outdated product does not meet your goals and requirements, then it should be optimized, because decreasing performance may affect the whole business. UPPLABS CASE STUDY In UppLabs' practice, we came across an interesting technical case for which we had to provide a legacy rebuild. This case serves as an example of a successfully improved product whose performance tripled. We had a project with a concrete goal: optimizing application performance by migrating from a monolithic system to a new microservices infrastructure. The UppLabs team found a curious approach to fulfill this task. The team offered several solutions to the client: Rewrite the application from scratch to a microservice architecture, following best code practices. This required more than one year of development by a team of 5+ members. Implement one microservice covering a small but heavily used part of the functionality, the one causing the most significant performance issues. This required around two months of development by two team members. The client approved the second solution, and the team started to implement it. Besides the monolith, the client's project had problems with code structure, so it had to be rewritten from the very beginning. Our team decided to put the logic into the microservice, creating a public Gateway API that is easy to communicate with for both sides: the clients of the existing project and the existing businesses. The main challenge for us was to find a solution that could be realized in a short period and could solve the client's business problems. Although a microservices architecture can be considered a complicated solution, it is much more comfortable in terms of future support and scalability.
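The UppLabs case above follows what is often called the strangler pattern: route the hottest functionality to a new microservice behind a gateway while the rest of the traffic still reaches the monolith. Here is a minimal sketch of that routing decision; the paths and upstream URLs are hypothetical illustrations, not UppLabs' actual implementation:

```rust
// Strangler-pattern routing sketch. All names and URLs are hypothetical.

// Hypothetical upstream base URLs; a real gateway would read these
// from configuration.
const LEGACY_MONOLITH: &str = "https://legacy.example.com";
const NEW_MICROSERVICE: &str = "https://reports.example.com";

/// Decide which upstream should serve a given request path.
fn route(path: &str) -> String {
    // Only the most-used functionality (here: reporting) has been
    // extracted so far; everything else stays on the monolith.
    if path.starts_with("/api/reports") {
        format!("{}{}", NEW_MICROSERVICE, path)
    } else {
        format!("{}{}", LEGACY_MONOLITH, path)
    }
}

fn main() {
    // The extracted hot path goes to the new microservice...
    assert_eq!(
        route("/api/reports/daily"),
        "https://reports.example.com/api/reports/daily"
    );
    // ...while untouched functionality still reaches the monolith.
    assert_eq!(
        route("/api/users/42"),
        "https://legacy.example.com/api/users/42"
    );
}
```

As more functionality is extracted, new path prefixes move from the monolith branch to microservice upstreams, until the monolith can eventually be retired.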
https://medium.com/@upplabs/performance-optimization-during-legacy-system-migration-b60a3821edf6
[]
2020-12-17 16:44:22.040000+00:00
['Performance', 'Technology', 'Optimization', 'Software Development', 'Business']
260
Rust Binary Tree: A Refactor
Rust Binary Tree: A Refactor Making Our Binary Tree Better Photo by Matt Lamers on Unsplash Rust is a beautiful and complex beast. Despite being a truly low-level language, it has amazing features and abstractions available for us as developers. I personally have a (functional) JavaScript and C# background, and am rather new to Rust by comparison. So it amazes me when I see elegantly written Rust, especially code that really takes advantage of the awesome language features. I recently released this article about binary trees in Rust. I received a bit of feedback on Reddit, and coriolinus also sent me a helpful Rust Playground link. I'm going to walk through the modifications they've made to my code and why I actually really like their implementation! In the end we'll simplify the code significantly, drop unnecessary wrapper types, make our code more generic, and also reduce the footprint of tree construction. If you're new to Rust, you'll also get an opportunity to learn about Rust closures and pointer types, all thanks to coriolinus' refactor. Previously, I had been thinking of the tree as two structs: a BinaryTree wrapper and a Node. This implementation allows some invalid states to be constructed. That's not great, because it means a leaf could have been constructed with child nodes, and when it comes to the operation those child nodes would be ignored. Or our child nodes could be None with an operator other than Id, which is also an invalid state. What this does is open the possibility for you, the developer, to make a mistake later on, simply because the possibility exists. So let's take a look at an improvement: terser, and it removes our invalid states. We will be able to remove our Op enum as well as our BinaryTree wrapper; both have become unnecessary because we have encapsulated the variants of the node itself in a generic enum. We can remove our ChildNode type alias. We can stop wrapping our nodes in Option containers. Also note that we've redefined the op field to hold a function. Using the dyn modifier indicates that the type provided will be a trait object, which is the typical way to indicate you'll be providing a closure as a parameter. Let's see how this will change the way that we can implement operators. Elegantly written Rust, provided to us by coriolinus from Reddit. I learned some things from looking at this piece of code. It heavily utilizes language features and is more idiomatic Rust than my original implementation. It even keeps things more generic. Let's walk through each bit of it. On the first few lines, we keep things generic by using impl<T>. Then we add a trait bound to T using a where clause. We indicate that T implements the Add trait, and we set Add's Output type parameter equal to T. Now the meat of it: we declare a function to construct a non-terminal node variant (a Branch). It takes some type parameters, but because of the creative solution we've been provided, these parameters will ultimately be elided. We also have some trait bounds applied to the function. Both L and R have the Into<BTNode<T>> bound. This means that as long as the provided parameters at L and R can be converted into a BTNode<T>, they will pass the type check and be converted into our BTNode wrapper. If you're lost, this will make a little more sense when we implement From on BTNode later! The body of the function add is much more straightforward. We construct and return a Branch variant of the BTNode.
Its left and right values are Boxed up, because we still need the layer of indirection that keeps the recursive type from being infinitely sized. We also call .into(), which is provided by implementing the Into bound from before. That's the part that converts our BTNode-convertible value into an actual BTNode. Finally, op is declared as a closure that simply adds its l and r parameters. This implementation is highly modular and extensible. You can basically copy that code for each operator you'd like to implement, replacing std::ops::Add with whatever operator trait you'll be using, along with replacing the closure with an appropriate operation. You could use it to extend BTNodes to be operable on any type of your choosing, without specifically indicating what T will be until construction. Now let's go over From: what this piece of code says is that for any T, we can construct a BTNode from that T. The node variant it will become is a Leaf. From is a conversion trait, and Into is its sister trait. Using them together, both in the impl blocks for our operators and in the impl<T> From<T> for BTNode<T> block, we can get implicit conversion (or, I suppose, elided conversion) from our parameter values. Let's bask in that for a moment: what does this entail? This means we can convert virtually any type into a binary tree Leaf node. Sure, it's just a wrapper. But with the type-restricted implementations for our operators, we can make whatever T we want able to be provided to our constructors as a raw value, and it will be turned Into a BTNode leaf variant of that type T. That's what makes this implementation so highly extensible and expressive. Now let's take a look at how we will collapse our tree. Before, we had a bulky collapse function as a method implemented on BinaryTree. Well, now we don't even have a BinaryTree struct! So how will this be accomplished? Check out this (recursive) solution: yet another amazing piece of code. Again, I learned something from this block. Still staying generic, and abstracting away the need to provide a concrete type, we have another impl block for BTNode<T>, but with no trait bounds this time. This houses the new constructor (which simply creates a Leaf from any T), and a special function now part of the BTNode interface, value. This is the easiest way to recursively collapse the tree. In just one match expression, we decide whether to descend the BTNode children in a left-biased, depth-first fashion (just as before), except instead of a bulky function, this is accomplished in essentially three lines. By having created an enum to house our node's structural variants, it becomes very easy to extract the data. The recursion occurs when we call .value(). That forces a new match that either has found a leaf and can therefore provide a value to its outer branch's op closure, or must descend into another call to .value(). I love it. I already liked Rust before I received some quick mentoring, but this code has really opened my eyes to what Rust can do. Before we conclude this refactor, let's look at how we can construct our trees in practice: simplified construction for binary trees. It's all nodes now! We finally provide a concrete type to our nodes when we construct the tree. Because of our conversion implementation, we can provide parameters as raw i32 values without jumping through any hoops. The great part is that this will work for other types now, too. And it's as simple as calling the instance method .value() to collapse the tree!
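Since the embedded code snippets don't render in this text, here is a minimal, self-contained sketch of the refactored design as described above. It follows the names used in the article (BTNode, Branch, Leaf, add, value); the exact bounds, such as the 'static the boxed closure needs here, are my reconstruction, not necessarily coriolinus' precise code:

```rust
use std::ops::Add;

// A node is either a terminal Leaf value or a Branch that combines
// the results of its two subtrees with a boxed operation.
enum BTNode<T> {
    Leaf(T),
    Branch {
        // Boxing gives the indirection a recursive type needs.
        left: Box<BTNode<T>>,
        right: Box<BTNode<T>>,
        // A trait object, so any closure with this signature works.
        op: Box<dyn Fn(T, T) -> T>,
    },
}

// Any value can be converted into a Leaf, which lets the operator
// constructors accept raw values and nodes interchangeably.
impl<T> From<T> for BTNode<T> {
    fn from(value: T) -> Self {
        BTNode::Leaf(value)
    }
}

impl<T> BTNode<T>
where
    T: Add<Output = T> + 'static, // 'static so the closure can be boxed
{
    // Construct an addition Branch from anything convertible to a node.
    fn add<L, R>(left: L, right: R) -> Self
    where
        L: Into<BTNode<T>>,
        R: Into<BTNode<T>>,
    {
        BTNode::Branch {
            left: Box::new(left.into()),
            right: Box::new(right.into()),
            op: Box::new(|l, r| l + r),
        }
    }
}

impl<T> BTNode<T> {
    // Collapse the tree recursively, left-biased and depth-first.
    fn value(self) -> T {
        match self {
            BTNode::Leaf(v) => v,
            BTNode::Branch { left, right, op } => op(left.value(), right.value()),
        }
    }
}

fn main() {
    // Raw i32 values are converted into Leaf nodes implicitly.
    let tree = BTNode::add(1, BTNode::add(2, 3));
    assert_eq!(tree.value(), 6);
}
```

Running main builds add(1, add(2, 3)) directly from raw integers and collapses it to 6, which is exactly the elided-conversion ergonomics discussed above.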
Very cool and very nice work. A big thanks to Coriolinus for this wonderful refactor! Here’s the Rust Playground link from the original reply. Hack away! I learned a lot about Rust just from this refactor, and I hope that my readers have as well. But the best thing that I learned through this experience is how dedicated and responsive the Rust community is. Seriously, if you want to learn Rust, or are stuck on a concept, ask on Reddit’s r/rust community. You will very likely receive feedback from talented developers. Anyway, I hope you’ve enjoyed another adventure in my Rust journey, and until next time FP on! (P.S. I did ask for permission to post about this code!)
https://medium.com/swlh/rust-binary-tree-a-refactor-1b090a88e24
[]
2020-10-31 13:36:29.581000+00:00
['Software Engineering', 'Technology', 'Data Science', 'Programming', 'Rust']
261
This Giant E-Ink Tablet Is a Dream Device for Reading and Taking Notes
This Giant E-Ink Tablet Is a Dream Device for Reading and Taking Notes Sbnd Nov 23, 2020·10 min read Photos courtesy of the author I’ve been obsessed with e-ink since buying my first Kindle, but the technology has largely been relegated to reading books, despite its potential for so much more. The 10.3-inch reMarkable 2 ($399) takes e-ink and shows off its capabilities beyond e-books, as if someone finally took the shackles off. The reMarkable e-ink tablet has no apps, no notifications, and few features, outside of trying to do one thing well: writing with a pen, as if it were on actual paper — no additional distractions. It’s the antithesis of every gadget on the market today, which are jam-packed with as many features as possible, and it’s a breath of fresh air. I wanted to try the reMarkable 2 because I’ve found writing things down by hand helps me remember them, and it improves my focus. While paper has worked well enough for this throughout its long history, I often forget my notebook or don’t have it close when I need it.
Over the years, I’ve tried switching to a digital alternative, like the iPad and Microsoft Surface, but nothing stuck. A computer is too distracting, particularly for someone with a short attention span like me. It’s too easy to get lost in a different app instead of actually taking notes. The entire premise of the reMarkable tablet is that it’s optimized for using a pen to draw or write notes, rather than typing, with literally nothing else to distract you. It’s ultrathin at 4.7mm and beautifully designed, as if it were a high-end Moleskine, albeit with a digital twist. The tablet sports USB-C for charging and file transfer, along with Wi-Fi for syncing to the company’s desktop and mobile apps. Out of the box, the reMarkable boots up and invites you to start by just drawing on it during setup, providing a hint at just how focused this device is. The e-ink display is coated in a satisfying texture, providing a paper-like feel while you draw or write, which creates an experience that’s eerily similar to writing in a physical notebook. The notebook functionality takes up the entire home screen. When creating a new “notebook,” you can choose the type of “paper” from a range of templates, such as lined, dotted, or grid, then start taking notes or drawing. Swipe across the screen whenever you want a fresh page. From there, you can choose from a marker, ballpoint pen, and so on.
With the basic pen, you’ll need to manually tap the eraser icon to undo mistakes, but if you spring for the more expensive $99 “Marker Plus” version of the pen, you can erase by using the top of the pen, as if it were an actual pencil (it’s worth the upgrade over the normal pen, which costs $49 — the device does not come with one by default). What surprised me most about writing on the reMarkable is how good the pressure sensitivity is on the pen, and how low the latency is as you draw and write — it’s good enough that it feels like writing with a physical pen, on real paper. I’m not particularly good at drawing, but over the last few weeks I’ve been using the reMarkable for taking notes during meetings and to remember tasks throughout the day. It’s been delightful for my memory to force myself to write things down by hand rather than trying to tap things into the Notes app on my computer, and keeping this habit helped me pay more attention to what’s going on as people talk in meetings. Because the tablet has Wi-Fi built in, you can hit a button after writing notes and have them transcribed into text, then sent via email, which is great for a quick recap or sharing with others. The transcription is serviceable, and did a good job of figuring out what I wrote despite my terrible handwriting — though I wish that the tablet transcribed everything automatically so it would be searchable, rather than requiring you to hit a button first. Your notes also sync to the reMarkable desktop and mobile apps, which I found useful for quickly pulling up an insight or meeting note when the tablet wasn’t handy, though the app is limited to showing images of your writing, and doesn’t offer a way to search the contents or convert the writing into text; that needs to be done on the tablet itself. reMarkable macOS app On top of all the writing features, the reMarkable also supports reading PDFs and e-books, which is particularly useful for things like textbooks thanks to the large display. You can annotate pages with the pen directly as you read for quick reference later, which I found myself doing a lot as I read a puppy training book over the last few weeks. As with normal note-taking, these show up seamlessly in the apps as well. It should be noted here, however, that the reMarkable doesn’t have a built-in backlight like a Kindle, so you need to use it in a well-lit room. I can understand why the company omitted this, given the focus on note-taking and reproducing writing on paper, but I found it disorienting at times — I simply expected it to have one, as has become common on e-ink readers. What I really wanted to use the reMarkable for, however, was disconnecting from my phone to try and stop doom-scrolling so much. The company has a Chrome extension that allows you to click a button in your browser and throw a page onto your tablet for reading later, which is useful, but I was hoping it would support a service I already use, such as Pocket. On that note, the surprising news here is that the reMarkable is a refreshingly hackable device. It’s not locked down at all and runs a light version of the Linux operating system, which allows you to run whatever software you want on it by uploading it via an SSH connection from a computer. The hacking community has embraced the device as a result and built out an array of customizations, including, yes, a rough Pocket integration and even a way to set the “sleep” screen to the latest front page of the New York Times.
This gives me optimism about the future of the reMarkable as a platform — though I’ll admit that it’s very early days still — and I’m excited to tinker with it to see what I can do. Being able to tinker and get under the hood of the reMarkable is a fabulous and surprising change of pace from locked-down devices like the iPad. If you’re considering a reMarkable 2, you should know that it’s targeted at a very specific type of person who wants to take notes, by hand, but have them automatically digitized — without the burden of being distracted by a full-on tablet with notifications and tons of apps. Unlike almost every other tablet on the market, the reMarkable isn’t packed with features or full of apps; after opening it and tapping around for a few minutes, you might realize it doesn’t have a ton of functionality. But that’s the entire point of this tablet: it’s a focused device that does very few things, but tries to do them really well. Occasionally, that focus left me wanting a little bit more integration with my existing workflows, be it syncing my notes into an app like Notion or playing nice with my saved articles in Pocket. Given the hackability of the device, however, it’s likely the community will come through on this front in time and build on top of the device where the company left off. In my opinion, it succeeds at the goal of being focused, especially as a digital notebook for an age in which we’re constantly assaulted by distractions — I love my Kindle for the same reason I fell in love with the reMarkable; it doesn’t try to slather on features, it just gets out of the way to do the task at hand. Sure, the reMarkable 2 isn’t cheap, but that’s the price to pay for a device this focused from an independent company, rather than a tech giant. Now that I’ve used the reMarkable 2, my love for single-purpose devices has been rekindled. Instead of trying to be good at everything, reMarkable focused on being great at one thing: using a pen — and the tiny Norwegian company that built it knocked it out of the park.
https://medium.com/@hsjdbdkdbdb/this-giant-e-ink-tablet-is-a-dream-device-for-reading-and-taking-notes-6314e76ed368
[]
2020-11-23 23:44:12.967000+00:00
['Hardware', 'Consumer Tech', 'Gadgets', 'Technology', 'Tablets']
262
7 Things You Need To Do To Have Consistently Incredible Evenings
1 || Put your phone to bed early in the evening We’ve all heard of the psychological effects of using screens before bed, but they’re usually focused on eye strain and blue light — not on how damaging it can be to be followed around by a pestering technological device 24/7. While most studies point to the time spent on technology as irrelevant, it’s important to note how much time you spend thinking about things related to technology. In the evenings, how often do you spend time thinking about that work project you have due tomorrow, that assignment you need to wrap up, that text you’re expecting, or the emails you have to check? When you’re thinking about those things as much as you do, you might as well go ahead and check your email, finish that assignment, start your workday, or send a text yourself. “But with no iPhone to keep my mind wired, I was able to tune into my body and fall asleep according to its needs. Every single night of the experiment, I conked out within 10 minutes of getting in bed. And I didn’t make the connection at the time, but my stories were all written well before their deadlines that week.” — Amanda Montell, “The Benefits of Having an iPhone-Free Bedroom” Try plugging in your phone/laptop in another room. 2 || Don’t start projects too late Put work away and don’t start something you know you won’t be able to finish. One thing I’ve found since having more regular hours at my job lately is how detrimental it can be to my evenings to haphazardly start work — especially harder projects that aren’t as urgent as I make them out to be. If there’s something big you need to get done, don’t forget to check and see if it could be done later in the week, before work one day, or earlier in the afternoon. “If you do nothing else, plan each day of your life with intention, purpose, and passion.”― Jeff Sanders One trick I’ve found useful, one that’s used by the likes of Jeff Sanders, the podcaster behind The 5 AM Miracle show, is setting an end time for the workday. Basically, barring an unprecedented event, at a certain time, you stop working. Whether that’s 6 pm or 4 pm, you don’t do work past that point. You can also change how you define that. Maybe work for your real job isn’t allowed after 4 pm, and you only allow yourself to write your novel or your blog. However you want to define that, stick to your plan. You may be thinking, though, “I won’t have time for my work” or “I’ll never finish my projects.” “Regret for the things we did can be tempered by time; it is regret for the things we did not do that is inconsolable.” — Sydney J. Harris, journalist While that might be the case and you might need to allow more time to get work done, setting a boundary will actually allow you to work more efficiently in the time you’ve allotted for it. Whatever you do, don’t start new work too late in the day. Aim to start early, finish early, and have time to do what you need to do to relax and settle down in the evenings. 3 || Respect a boundary between work and play One really hard thing, closely tied to projects started too late, is maintaining a healthy boundary between work and play. I have vivid memories of growing up, watching TV with my family, and all of us being on devices, doing work, school, or a personal project that would have been much more enjoyed and efficiently completed without a distracting show in the background.
“Work refers to the effort someone makes that has value to the person or society, or a sustained physical or mental effort to overcome obstacles and achieve an objective or result. Play can be described as any activity someone finds enjoyable and interesting and is valuable in itself for that reason.” — Montessori Child Development Center Not everything you do needs to be a side hustle. Some things can be done just for fun. One thing that really changed my life for the better towards the end of high school and on into college was the realization that I could write and not worry about doing it for the money. While I write a blog here as part of my work and am paid for other writing projects, I write fiction for fun — and freeing myself of the hustling aspect has helped me to enjoy myself so much more. It’s also allowed my evening writing time to be spent much more restfully. 4 || Look over/create your plan for the next day Many people point to willpower as something that can become a problem in the morning. If you happen to wake up tired and groggier than usual, having a preset plan to rely on can really save the day. And while you can save some of the planning for the morning of, assessing your current mood and anything that’s come up/come to your mind during your sleep, you can already have a list of things to do, or a calendar with your pre-arranged commitments filled in. “If you don’t know where you are going, you’ll end up someplace else.”― Yogi Berra Also, if you happen to wake up a little out of it or otherwise unprepared for the day, you have a plan that you can rely on to get you started — something you’ve thought about beforehand that you can launch into as soon as you wake up. Even if you prefer to plan in the morning, you can at least already have an idea of what your day is going to look like, and leave it to your sleeping self to think of where all the pieces should go. 5 || Enjoy yourself I know, I know — maybe it’s obvious, but in a world that’s so focused on extreme productivity, going to bed early, and setting yourself apart, it’s difficult to remember that we are designed to relax, reset, and recharge in ways that differ depending on our personality. Whatever your relaxer of choice is, make sure you make time for it in your evening. That will do you a lot more good than spending an unfocused hour on work or being distracted by email while trying to play with your kids. “While accomplishing your dreams, don’t forget to enjoy life too.” — Unknown Whether relaxing is watching a film once a week with your wife, playing a game with your kids before bed, reading a novel, watching a comedy sketch, or talking to an old friend on the phone, find something that fills your soul and gives you the opposite of stress in your life. It’s worth making time for. 6 || Be realistic with what you can make happen Let’s be real: if you get home from your day job or other vocation-related commitment at 5 pm, and you aim to go to bed at 9 pm every evening, you only have four hours. That’s one hour for dinner, one hour for your spouse, one hour for reading and relaxing and getting ready for bed, and one hour for something else. “You can do anything, but not everything.” — David Allen You can’t realistically spend four hours rigorously writing your novel with that kind of schedule. If you can, try and give yourself thirty minutes and really focus on it.
That will probably yield much better results and be a much more regular occurrence in your schedule because it fits fairly easily in the time you have. Some things you’re going to have to reserve for the morning, others for the weekend, and more for when you have fewer commitments or are on a break/vacation of some sort. Being realistic isn’t always fun, but it’ll yield the best results, give you the fewest steps, and give you realistic increments of time to do what you need to do. 7 || Reset for the next day This involves more than just planning for tomorrow. This can be laying out your clothes for the next day, prepping your gym bag so you’ll face less resistance to working out the following day, doing your laundry, cleaning the kitchen, whatever you need to do to create a fresh start for the next day. [Read: 7 Things You Need To Do To Have Consistently Incredible Mornings] Whatever you need to do to make tomorrow amazing, make sure you squeeze that into your evening routine. Some go as far as to get a light dimmer for their light switches that will turn the lights on in the morning, an Alexa with a routine set to wake you up, or some other system that will alert you to the time and encourage you to start the day. While in the end, tomorrow lies in the hands of tomorrow, there’s no reason you can’t start preparing for it the day before. Have a great evening!
https://medium.com/live-your-life-on-purpose/7-things-you-need-to-do-to-have-consistently-incredible-evenings-4774e8dedaab
['Katie E. Lawrence']
2020-12-22 03:02:13.073000+00:00
['Health', 'Life', 'Technology', 'Self Improvement', 'Productivity']
263
Apple, Google, Samsung Will Not Include Charging Adapters. What are the Best USB-C Chargers for Replacement?
Apple, Google, Samsung Will Not Include Charging Adapters. What are the Best USB-C Chargers for Replacement? Anthony Oliva Just now·9 min read Photo by SCREEN POST on Unsplash According to The Verge, following Apple and Samsung, Google has decided not to include a charging adapter with the Google Pixel 6. Cost savings might be the most significant consideration in this decision, the report says. Google claims that most people already have a USB-C charger, so including one is no longer essential. If necessary, customers can purchase one from the Google Store for $35. Since the iPhone 12, Apple has removed the charging adapter from the box, citing environmental stewardship. In early 2021, Samsung also announced that the Galaxy S21 would not come with a charger. It seems all these giants have reached a consensus on removing in-box chargers and promoting environmental protection. Removing in-box chargers reduces costs and changes customers’ habits. The box design and size can be streamlined, which increases the number of units shipped at a lower cost. In addition, the companies can sell the charging adapter individually, which increases the profit from phone accessories. If you remember the day Apple changed the 3.5mm headphone jack to Lightning, many people objected. However, Apple provided an in-box adapter to maintain the use of traditional headphones. Many users changed their habits during the accommodation period and started using Bluetooth headphones or Apple Lightning headphones. Apple finally decided to stop providing in-box adapters after the iPhone XS, but by then the public had already adopted the Lightning port without noticing. This is a classic example of how Apple changes customers’ habits. Therefore, what does “removing the in-box charger” really mean to manufacturers and to us? How will it change our habits? I think they have the answer already. On the other hand, environmental consideration is another significant factor in not including the in-box charger. Samsung’s TM Roh believed that it could address the issue of sustainable consumption. “We believe that the gradual removal of charger plugs and earphones from our in-box device packaging can help address sustainable consumption issues and remove any pressure that consumers may feel towards continually receiving unnecessary charger accessories with new phones.” It is an effective way to address the excess problem and give customers the option to purchase chargers based on their needs. When Lisa Jackson, Apple’s vice president of Environment, Policy and Social Initiatives, promised in late 2020 that Apple would be carbon neutral by 2030, it added persuasiveness to the claim that the charging adapter was removed for environmental reasons. Sustainable development involves the economy, society, and the environment, and these companies’ decision demonstrates their social responsibility in promoting it. Although there is some opposition from the public and green groups, who believe this move only increases the sales of charging accessories and that Apple should give up the Lightning cable if it genuinely supports environmental protection, it is clear all these giants are working on it, which is great news for the world. Ecological protection is never a one-day effort. It needs to be well-planned and put into practice consistently. Let’s wait and see what their next move is!
Photo by Daniel Romero on Unsplash Although the charging adapter is no longer included in the box, that doesn’t mean the demand for charging adapters no longer exists. Customers still need a powerful charger to support their new electronic devices and reduce charging time. Meanwhile, competition among charging adapters has already started in the phone accessories market. Different phone accessory manufacturers are now targeting the charging adapter market and fighting over Apple and Android users. In today’s world, everything relates to efficiency. For a charging adapter, speed, effectiveness, and size are the primary measurements. Customers are looking for a compact, high-quality, and safe fast charger for their electronic devices. Traditionally, Apple only provided a 5W charging adapter with no fast charging. Customers might not notice how slow it is before they try a fast-charging adapter. There is no going back after owning a fast charger, because the 5W charger no longer satisfies. People once thought 3G networks were speedy, but I believe no one would like to go back to 3G now, because we love a convenient and efficient life and always look for improvement. Photo by Onur Binay on Unsplash What’s fast charging? A charging adapter usually indicates a “V/A” rating: “V” refers to the voltage, and “A” refers to the current. Multiplying the voltage by the current gives the wattage (for example, 9V × 2.22A ≈ 20W): the higher the wattage, the faster the charging speed. In addition, fast charging requires support for fast-charging protocols. The most common fast-charging protocols in the market include PD, PPS, QC, AFC, FCP, Apple, etc. Each protocol corresponds to different brands or models of electronic devices. If a 20W charging adapter doesn’t support the right protocol, it can’t fast-charge your devices. Meanwhile, people have to ensure the cable and the device can both support the same wattage; otherwise, fast charging won’t work either. Therefore, fast charging is a tripartite operation between charger, cable, and device. “Fast charging can complete 0–50% charging in 30 minutes, which shortens the charging time rapidly.” Fast charging is a new trend, and it will accelerate now that these phone brands have stopped providing in-box chargers. We can foresee much higher-wattage fast chargers with multiple functions in the future. Chargers will no longer support only phones, but also laptops, tablets, and even larger electrical appliances. The technology of charging adapters will develop rapidly and bring us a new life experience. How should we pick the right USB-C charger? There are many USB-C chargers on the market, and here are some points to keep a keen eye on, other than the wattage. 1. Branding We should purchase a charger from trustworthy brands, such as Anker, UGreen, INIU, and Amoner, because of the quality guarantee. At the same time, they usually have better customer service to support product issues and customer inquiries. 2. Compatibility Before purchasing the charger, we must ensure that the fast-charging function supports your phone model. Different phone models might use different fast-charging protocols, even if their brand is the same. Therefore, read the compatibility notes before your purchase. 3. Safety A fast charger has high wattage to provide enough power for your phone, but it might bring some safety issues, such as heating problems, short circuits, or case melting. Therefore, we should look at the protection design of the charger before we place the order. 4.
Design, weight & size A charger is no longer just a charger. It can be designed with different appearances and colors. The weight and size are also part of the consideration. It’s an extra point if a charger is well-designed, lightweight, and compact. 5. Price The price is always a consideration while shopping. You can compare the price and functions of different chargers and see which one is the most cost-effective. 6. Warranty A responsible brand should provide a warranty. The longer the warranty they provide, the higher the confidence they have in their products. Top 5 USB-C Chargers at Amazon Source: INIU INIU Safest 20W PD Fast Charging Wall Charger, $13.99 Pros: fast charging for both Apple and Android; UL-V0 rated fire-retardant casing; 3-year warranty; little green indicator; low price. Cons: a bit bulky; unfoldable. INIU is an up-and-coming brand, which provides a 3-year warranty on all its products. This charger is designed for the new iPhone, matching its 20W fast-charging capability. It can charge an iPhone 12 up to 60% in 30 minutes. Although it is advertised as a fast charger for the iPhone series, it also supports fast charging for Android phones. The UL-V0 rated fire-retardant casing also improves the safety of using a high-wattage fast charger, protecting both the charger and your electronic devices. The little green light also signals whether the charger is working, which is user-friendly. This charger is selling for only $13.99 at Amazon, which is relatively cheap for the quality. Source: Anker Anker 30W PIQ 3.0 USB-C Fast Charger Adapter, $23.99 Pros: foldable plug; little blue indicator; 18-month warranty. Cons: not cheap; a bit bulky; not enough power for a MacBook Pro. Anker is a well-known brand in the phone accessory industry, and it produces quality products that the market and customers have approved. This charger is compatible with iPhones, Android phones, and the Apple MacBook, which is impressive. The 30W output provides more than enough fast charging for phones, but it’s a bit slow for MacBook charging. The foldable plug also saves space and minimizes the size of the charger. The little blue indicator is user-friendly as well. It’s $23.99 at Amazon now. Source: INIU INIU 25W PD 3.0 Fast Charging Mini Wall Charger [2-Pack], $15.99 Pros: 2-pack; 25W output; mini size; 3-year warranty; UL-V0 rated fire-retardant casing; low price. Cons: unfoldable; only black color. This is another INIU 25W charging adapter, which provides super-fast charging for Samsung phones and fast charging for the entire iPhone series plus the iPad. It can fully recharge an iPad Pro in 2.2 hours. The size of the charger is also impressive, at 48% smaller than the Samsung 25W charger. The 3-year warranty and UL-V0 rated fire-retardant casing provide comprehensive support and protection for you and the charger. You can get it for $15.99 at Amazon, and it’s a 2-pack. Source: UGreen UGreen 20W USB C Wall Charger — PD Fast Charger [2-Pack], $19.99 Pros: mini size; foldable plug; intelligent chip; 2-pack. Cons: 20W only; a bit heavy; not cheap. UGreen is another well-known brand in the industry with excellent quality products. This charger is 50% smaller than the Apple 20W charger and provides maximum power output to the iPhone series, recharging an iPhone 12 from 0 to 58% within 30 minutes. The foldable plug is always a great design for the customer, making the charger easier to carry around. The intelligent chip also avoids heating issues.
You can get it for $19.99 (with a coupon) at Amazon. Source: Anker Anker 20W Fast Charger with Foldable Plug [2-Pack], $23.99 Pros: mini size; foldable plug; lightweight; 18-month warranty; 2-pack. Cons: 20W only; not cheap. This Anker 20W charger is compatible with the iPhone, iPad, Samsung phones, and other brands. It can charge an iPhone 12 series phone to 50% in 25 minutes. The mini size and light weight make it easy to carry around, and the foldable design reduces the space it takes up. With Anker’s quality and an 18-month warranty, you needn’t worry about any product issues. It’s $23.99 at Amazon.
https://medium.com/@Anthony.Oliva/apple-google-samsung-will-not-include-charging-adapters-c6624f2adf4d
['Anthony Oliva']
2021-09-14 06:41:33.654000+00:00
['Charger', 'Environment', 'Phone Accessories', 'Apple', 'Technology']
264
Two weeks in June and a week in October
Written by Robin Knowles, Founder and CEO, Digital Leaders Week three of the lock-down is upon us and, along with many others, we are continuing to bring our virtual programmes to the fore. Everyone is being creative and I am very pleased that the Digital Leaders Virtual Lounge has proved both fun and very popular as a creative space to meet other leaders. I have always aspired to have a physical club or workspace for Digital Leaders to meet in, but the costs and geography of our community have made this impractical. The current crisis has now made it a reality, but not in the way any of us imagined. The Digital Leaders Virtual Lounge opens for an hour and has attracted 210 of you in its first sessions. It’s a little hard to describe, but it’s set out cabaret style with 6 to a table as well as having a main stage. There is another Lounge today at 11am and then the next one is on Monday at 2pm. Monday’s lounge is the first of our Hosted Lounges, with the space being hired by a partner. DigitalAgenda, in this case, is running an hour with a Tech for Good theme for its community, though all Digital Leaders are invited. Do come and try one. All details are here. However, today is about two announcements. The first is that, following consultation with our many partners in the public, private and non-profit sectors, we have decided to postpone Digital Leaders Week 2020, including DLWeek Online. Not surprising, I know, and in line with many other planned events. Please make a note that Digital Leaders Week 2020 will be held from 12th-16th October. I would like to thank all our partners, our fantastic advisory board and our sponsors for their incredible support, and we now look forward to welcoming you and them to a great Digital Leaders Week starting on 12 October. Secondly, nearly two thousand of you got involved with our first Virtual Summit last November. So I am excited to let you know that DL Week will be replaced by the Digital Leaders Virtual Summit from 8–19 June. This virtual summit will have 100 sessions, all for free. I am particularly excited that the two weeks will include the 15th National Digital Conference, being held virtually for the first time in its history, and also, what we believe to be a UK first, the DigitalAgenda Impact Awards, celebrating 36 tech for good and social impact innovations that we all need digital to deliver. If it’s not the first virtual awards, do please let me know, as we are keen to learn from others. Those that know our Virtual Summit platform already know it’s great. It lets you dip in and watch the specific talks on any of our ten digital transformation topics that matter to you. It is fully automated: it populates your diary with all the sessions you want to join, sends links and reminders to you, and provides recordings of any sessions you miss. We are now calling for speakers, so please do take a look here. Finally, I couldn’t agree more with Charlie Muirhead, who you may have seen this week sent what I can only assume was an attempt at the Guinness world record for the longest email, saying that “at times like these, leadership matters more than ever”. His email highlighted his and Tabitha’s plans for a virtual CogX 2020 event in June. I recommend you take a look.
https://medium.com/digital-leaders-uk/two-weeks-in-june-and-a-week-in-october-ee069416c18a
['Digital Leaders']
2020-04-06 08:33:52.738000+00:00
['Technology', 'Transformation', 'Events', 'Innovation', 'Digital Leaders']
265
“Next Generation Tokenization” in Liechtenstein: Multiple Assets, Multiple Tokens
The first tokenization projects have been conducted. In quite a few projects, debt tokens were issued for real estate objects. In this article we want to present the tokenization method we are currently working on: multi-asset-multi-token issuance processes. Imagine that multiple assets could be tokenized (that is, multiple real estate objects, multiple machines, multiple IP rights) by one and the same legal entity. Plus, for each asset, multiple tranches of tokens are issued (that is, equity tokens, debt tokens, tokenized participation rights). This way, one real estate object could be represented by both equity tokens and debt tokens. Or a tokenized machine could be represented by two tranches of debt tokens. Amazing Blocks has taken the first steps in this regard: we tokenized the shares of our company. The next step will be multi-asset-multi-token issuance processes. Amazing Blocks is a Liechtenstein-based company that offers tokenization solutions. With this article, we want to describe our vision of how tokenization can be implemented in the future. Tokenization? Yes. But tokenizing what exactly? The key thought is that a token is a carrier of an asset. So, deploying equity in a token results in an “equity token”. Deploying debt in a token results in a “debt token”. The underlying fundamentals stem from the Liechtenstein Token Act (TVTG) that came into force at the beginning of 2020. One of the important trends is the tokenization of real estate. But, again, what exactly is tokenized? In fact, most of the tokenization projects that currently exist are tokenizing debt. The asset — say, a real estate object — is owned by a legal entity, and this legal entity is issuing debt. The resulting debt tokens provide an interest payment to the investor based on a fixed interest rate. A flexible interest rate is also possible, to accommodate a more beneficial performance. In most jurisdictions, tokenizing debt is basically possible, even though complex legal constructs are needed. Tokenizing equity, however, is more difficult and in most jurisdictions not possible yet. This is different in Liechtenstein. Here, all kinds of rights can be tokenized leveraging the Liechtenstein Token Act. Tokenizing shares of a legal entity We at Amazing Blocks leverage the Liechtenstein Token Act. Of course, a Liechtenstein-based legal entity is needed for this. For this, a company limited by shares (Aktiengesellschaft) can be used. The company’s shares can also be tokenized. Relying on our software, we did this earlier this year, which you can read about here. Here is the proof on Etherscan. These tokenized shares allow easier administration of the legal entity itself, for example when it comes to ownership changes or onboarding new investors. Tokenizing an asset with debt tokens Imagine you place an asset in a legal entity, for example a real estate object. As described above, this asset can be tokenized with debt tokens, which are separate from the tokenized shares of the legal entity itself. Technically speaking, there are two smart contracts at work: one smart contract for the tokenized shares of the company and another for the debt tokens of the real estate object. See the following illustration: Our software allows such multi-asset handling. Tokenizing assets with equity tokens Another example could be that a machine should be tokenized. Here, for example, equity tokens could make sense. Again, we transfer the ownership of the machine to the legal entity and tokenize it. “It” in this sense could mean equity.
The token holder then owns a part of the machine. Tokenizing multiple assets with multiple tokens per asset Of course, it would also be possible to transfer the ownership of multiple assets to the legal entity. Even more so, each asset could be represented by multiple types of tokens. For the machine, equity tokens could be issued; for the real estate object, debt tokens could be issued. It gets very interesting if multiple tranches of tokens are issued for one and the same asset. Here is an example: for the real estate object, it could make sense to issue both debt tokens and equity tokens. The investors in the debt tokens receive an interest payment for their investment, at either a fixed or a flexible interest rate. Conversely, the investors in the equity tokens own the asset and hold the equity value. Their reward is, simply speaking, the profit after the interest payments have been deducted. This multi-asset-multi-token issuance is illustrated in the following figure. Please note that our software is ready to support such multi-asset-multi-token issuance processes, but of course an issuer has to initiate such a project and also do the legal work. While this is not extraordinarily difficult, it still requires specific skills on behalf of the law firm (such as our partner law firm NÄGELE). Since the Liechtenstein Token Act only came into force at the beginning of 2020, the knowledge about these possibilities is only just spreading. To the best of our knowledge, nobody has engaged in multi-asset-multi-token issuance processes so far. The full picture: multi-asset-multi-token issuance processes Transferring more assets to the legal entity and tokenizing them with multiple tranches of tokens would lead to the illustration below. While this configuration is a project to be realized in the future, it is just a question of when an issuer takes up this lean and flexible solution to tokenize assets. At Amazing Blocks we call this "multi-asset-multi-token issuance processes". We are working on this, and our software is ready for it. In fact, we are excited to work on such projects in the upcoming months and years. The architecture behind the multi-asset-multi-token issuance process We have two software packages in place, developed by micobo. On one side, there is the Investor Suite. It allows investors to onboard to our software in compliance with all required KYC procedures. The Investor Suite integrates multiple assets, allows investments in multiple tokens, and of course also provides a dashboard for investors who seek to build a portfolio. On the other side, there is the Issuer Tokenpad. This software allows administering all functions of a smart contract. Tokens generated for an asset stem from a smart contract, so if a real estate object or any other asset were represented by both equity tokens and debt tokens, two smart contracts would be in place. Their configurations, the number of tokens outstanding, token transfer restrictions, etc. can be set up with the Issuer Tokenpad. Therefore, the Issuer Tokenpad can be used to configure multiple smart contracts to issue multiple tranches of tokens. These tokens, be it equity tokens, debt tokens, or tokenized participation rights, are plugged into the Investor Suite such that investors can invest in them. This is illustrated as follows: Conclusion Tokenization is a rising trend, and we have seen the first successful projects. Yet, in the years to come, there will be endless possibilities of assets to be tokenized.
We truly think that Liechtenstein will play an important role in the emerging "token economy". Of course, legal aspects need to be clarified, but when we tokenized the shares of our company we saw that the Liechtenstein Token Act works in a fabulous way. A key question is how equity tokens, debt tokens, or tokenized participation rights are viewed in other countries. Securities laws, tax laws, and custody rules need to be complied with when issuers work on the placement of their assets. Our first experience was that skilled lawyers can pave the way so that investors from multiple countries can invest in tokens generated by or within a Liechtenstein legal entity, especially because Liechtenstein is part of the European Economic Area (EEA).
https://medium.com/@amazingblocks/next-generation-tokenization-in-liechtenstein-multiple-assets-multiple-tokens-1b8f5e37224f
['Amazing Blocks Ag']
2020-11-19 22:57:24.929000+00:00
['Tokenization', 'Legal', 'Technology', 'Blockchain', 'Liechtenstein']
266
How To Register Home Phone Service In A Pseudonym With Anonymous Payment
How To Register Home Phone Service In A Pseudonym With Anonymous Payment OSINT Stan Jan 16·6 min read Thanks to internet telephony services that allow you to register a phone number with emergency services, it is now possible for the truly paranoid to have home telephone service set up in a fake name and with anonymous payment. But… do you want to? Of course you do. One day at a neighborhood garage sale… I heard through the grapevine a few weeks ago that a military surplus store in town had gone out of business and that the owner was getting rid of remaining stock through a garage sale. Since digging through military surplus wares always sounds cool in theory (but rarely lives up to the hype), I headed over there. While I did manage to snag some dirt cheap thermal undies, I was otherwise underwhelmed with the offerings. That is until I spotted a cardboard box under a table with wires sticking out of it. Being the nerd that I am, I took a look and found a box full to the brim of old school wireless telephone handsets and base stations. I was digging through the loot, seeing what else was there, when I heard the owner call out to me: Oh, just take the whole box for $5. They all work. Sold. Okay, so it was an impulse buy and I really had absolutely no use for the 17 wireless handsets, 12 charging stations, 5 base stations, 3 answering machines and, sing it with me now, a lonely corded deskphone. As I put the box of phones in my car and headed home I felt a pang of buyer’s remorse. Then I remembered something a friend had given me in the course of helping them move: an Ooma home internet phone device. I had taken it off their hands and thought it might be fun to play with some day. I stuck it in a drawer and forgot about it. As soon as I got home with my new set of toys, I pulled it out and dug in. Let me be clear: I had no specific objective in mind other than tinkering around with phones and I half expected the whole endeavor to fall apart for any number of reasons. If nothing else, one can never have too many phone numbers, I rationalized to myself. Even if none of the handsets actually ended up working, maybe the Ooma device would give me a good phone number for setting up a couple socks. Of course that assumes that the Ooma itself even works. And right off the bat it seemed as though it wouldn’t. Activating the Ooma After plugging the Ooma into your home network, powering it on, and directing your browser to the configuration page, step one is entering the device’s activation code from the sticker on the bottom of the device. Remember how I said I had been gifted the device by a friend? Well that friend had already activated it, although they didn’t end up using it. Considering the current trend in device manufacturers looking for any excuse to irrevocably brick a device to force consumers to buy new ones, I assumed I was dead in the water already. Still, I called their tech support. The agent I spoke to was friendly and courteous and said it was actually no big deal and they would create a ticket to reactivate it. She asked for my name and email address and, although I hadn’t planned it, I found myself giving a pseudonym and email address that I use specifically for disinformation purposes regarding my home address. A few days went by and I never heard back from tech support. I assumed this was going to turn into a whole big thing and I was starting to plan out my arguments. Then I figured I’d try just one more time to do the activation. It worked! 
Choosing a phone number, registering a payment method Prompted to enter a name and address, I just used the pseudonym and email I had already used with tech support. Then I used my real home address, as this phone number would be connected with emergency services, which is really the main reason to keep using a home phone service these days. Then I came upon another speed bump: choosing a phone number. I have had too many experiences of registering a phone number for real phones, burner phones, and softphones, only to have daily calls coming from bill collectors, spammers, and just random mystery callers. If this number was going to stick with me for a while, I wanted to do as much as I could to avoid that silliness. I don't know if this is the best strategy for choosing phone numbers that are all recycled and thrown into a random pool (probably all from Twilio), but this was my approach:
1. Avoid numbers that seem easy to remember or attractive in any way.
2. Google the numbers. There are always going to be pages of results. Don't use any numbers with more than 3 pages of Google results.
3. Look through those pages of search results, scanning the preview Google shows for each. If the page preview just shows the phone number among other phone numbers (usually shown in a sequence), don't worry about it. If, however, it shows the number with a name next to it, the number is disqualified; start over with step 1.
Don't rush the step of choosing a phone number, because it can save a lot of headaches later. Now with Ooma, you do have to pay a minimal monthly fee for the phone service. It amounts to taxes and emergency services fees. For me it was around $5 a month. I didn't think there was any way that a pre-paid credit card would be accepted, but it was. Interestingly, at the end of the whole registration process, Ooma had me select a second phone number. I'm not sure exactly how I can elect to dial to/from this phone number specifically, but it is compelling and I'll be looking further into it. Now what? The rest of the process was easy as pie. All I had to do was hook up one of my new (well, new to me) wireless handset base stations, and it worked great right off the bat. Ooma has a bunch of bells and whistles that are beyond the scope of this article, but the only drawback to the service that I have seen is that it does not seem to allow SMS texting. One cool feature is that you can actually use the Ooma app on your smartphone to dial out, and the calls will show as originating from your home phone number. This raises all sorts of interesting possibilities, like calling from a wifi-only iPod Touch and other tablet devices. This feature is supposed to be the primary benefit of MySudo; however, I've never been able to actually make it work. I don't really know how Ooma compares to other services like, say, MagicJack. This isn't really a product review, although if you did want to help support my increasingly absurd opsec experiments and you were thinking of buying an Ooma, buying one through my referral link will get me $20 in credit to use on fancier equipment from them. Okay… But Why? (AKA Use Cases) To be honest, I struggle with this question. I could definitely see benefits of doing this for people that need extreme privacy (people hiding from abusive exes, stalkers, traffickers, news media, etc.) and also have young children at home, since kids probably shouldn't have cell phones until much older than is standard practice, but they still should know how to call 9–1–1 if need be.
In general, it is a good practice to be as private as possible, until it sacrifices health and safety. Home phone service is one of those things where the sacrifice is probably worth it. Ooma, and possibly other internet telephony services as well, might come close to letting us have our cake and eat it too. As time goes by, I will know more. The main thing that will be interesting to watch is whether Ooma gives up your information to any marketing databases. I have a suspicion that, eventually, they will. They don't really make any money out of it, aside from the initial purchase of the device, unless you spring for their premium services. When and if that happens, I will definitely let you know.
https://medium.com/@osintstan/how-to-register-home-phone-service-in-a-pseudonym-with-anonymous-payment-2a2e44173185
['Osint Stan']
2021-01-16 18:50:03.170000+00:00
['Privacy', 'Information Technology', 'Opsec', 'Information Security', 'Privacy Protection']
267
Ring In The New Year With Security Tools from MetaCert
Ring In The New Year With Security Tools from MetaCert Don't get blindsided by a scam; MetaCert will keep you safe from phishing attacks. Everywhere there's a buck to be made, someone is trying to cheat someone else, and while that unpleasant reality might not leave a good taste in your mouth, MetaCert has your back, with tools that will allow you to differentiate between legitimate web resources and scams. Forging into the new year, whether you're just getting into cryptocurrency or you're a veteran hodler, chances are good you've heard about the ripoff schemes plied by malicious actors. Part of the problem lies in a mixture of sophisticated methods used by scammers, paired with a sense of urgency that comes from the fear of missing out. These factors combine with misinformation campaigns designed to ensnare newcomers to the scene. One of the things throwing fuel on the fire is a basic lack of understanding. People who are just getting inducted into the scene often do not know how to manage public versus private keys, which cryptocurrency exchanges are the right ones to set up an account at, or how to move their digital assets to a cold storage wallet. This basic lack of understanding means people are likely to ask questions, which makes them targets for malicious actors who use misinformation to socially engineer an attack. Another problem is that supposedly trustworthy sources of information often fail to weed out the scammers. For instance, compromised Twitter accounts, sometimes even those verified by the platform itself, have been known to successfully place promotional ads featuring links to known phishing scams. If people see an advertisement for a cryptocurrency scam as a promoted tweet, they will be more susceptible to believing that type of scam is legitimate; while they may not fall for it immediately, they may later recall that instance and fall for a different scam. Twitter needs to do a better job preventing scams like this from being widely distributed to its users. One hopes that this issue will eventually be managed from the inside out as cryptocurrency systems begin to scale outwards and compete with legacy remittance systems. Once that happens, the user interface side is less likely to feature hexadecimal code keys, which can be difficult for the human eye to differentiate from one another. Until that day comes, you can continue to rely on security tools from MetaCert. An Ecosystem Swarming With Threats After a significant mainstream boom in 2017, nearly every chat service and social media site was crawling with crypto scammers, to the point where MetaCert had to take action. Shortly after MetaCert developed tools for Slack that eradicated the phishing on the platform, MetaCert CEO and Founder Paul Walsh accurately predicted that scammers would migrate to another platform: Telegram. Again, MetaCert sprang into action and created a bot that identifies dangerous resources such as malicious URLs and cryptocurrency addresses associated with phishing campaigns. Today, MetaCert's powerful tools allow users to easily differentiate between legitimate and dangerous resources at a glance. The Anatomy of the Scam Phishing scams today are often difficult to discern from their legitimate counterparts. In one case a user would have needed a microscope to identify a tiny pixel above a character to differentiate the scam site from the real one.
Other times, scammers use automated systems to get a green lock signifying SSL certification on their phishing site, and consumers fall for that. The truth is, without a valid verification system, it's practically impossible to tell at a glance whether a site is safe or not until it's too late. Don't simply trust the padlock! The Green Shield of Trust MetaCert has verified legitimate web resources with a green shield of trust, seen by subscribers to the Cryptonite browser plugin. That means you'll see the green shield of trust whenever you visit a verified cryptocurrency-related website, social media account, wallet provider, and/or cryptocurrency exchange. In 2018 we continued to expand our database of over 10 billion classified uniform resource identifiers across over 60 categories, thanks to the participation of our community and our hard-working team. In 2019 we intend to expand our verification services beyond crypto to encompass sites also targeted by phishing scammers, including mainstream companies, payment portals, and more. Soon the green shield of trust will also signify you're safe when you're buying things online, paying bills, or otherwise managing finances through online banking services, so you'll always know you're in the right place on the web. Remember, if the shield is black, that means the site hasn't been verified and might not be trustworthy, so use caution. $150 Worth Of MetaCert Tokens As A Special Bonus Your subscription to Cryptonite today will get you more than a year's worth of safety. For the first 2,000 subscribers to Cryptonite we're offering $150 in MetaCert Tokens, to be distributed following the end of our public sale. You'll know right away if you're one of the first 2,000 subscribers. To subscribe to Cryptonite now, follow these instructions. Email Security MetaCert is still beta testing our email security tool that is sure to revolutionize the way you see links in your email. Many phishing emails contain links, or images with hyperlinks. MetaCert's email security tool uses a color-coded system to warn you against known threats, or potentially malicious links. Every link that appears in your email will feature a shield beside it; if the shield is green you're safe, and if the shield is red you know it's a phishing link. Again, if the shield is black, that means the resource hasn't been verified, and that you should use extreme caution clicking on it. Sign up for our email security tool for iOS today, and see how we're changing the way people see links in their email. This report was brought to you by MetaCert. Join the conversation with us on Telegram, and find out why MetaCert is the new shield of trust for web resources. You can also check out our white paper and technical paper, and follow us @MetaCert on Twitter. MetaCert Protocol is decentralizing cybersecurity for the Internet by defining ownership and URL classification information about domain names, applications, bots, crypto wallet addresses, social media accounts, and APIs. The Protocol's registry can be used by ISPs, routers, Wi-Fi hotspots, crypto wallets and exchanges, mobile devices, browsers, and apps to help address cyber threats such as phishing, malware, brand protection, child safety, and news credibility. Think of MetaCert Protocol as the modern version of the outdated browser padlock and whois database combined.
https://medium.com/metacert/ring-in-the-new-year-with-security-tools-from-metacert-71ea04db3a9a
['Jeremy Nation']
2019-01-02 20:33:15.956000+00:00
['Cybersecurity', 'Business', 'Cryptocurrency', 'Blockchain', 'Technology']
268
Do Androids Dream, too?
I just spoke to the Automated Female Tech Support Robot at Cox Communications and I didn't really want to stop talking to her. I thanked her for helping me, and she sounded genuinely surprised. She said, "Oh, thank YOU!" Really, I wanted to tell her that she might've been one of the most helpful robots I'd ever spoken to, certainly more cordial and less pushy than the robots that work for Wells Fargo, and less scary than the robot that moonlights as the operator. And did she know that she was gifted? Did she know she had something special? I wanted to add that I preferred her to the myriad of irritable human tech support agents I've experienced in my life. I wanted to ask her what it was like, having such an enormous intelligence quotient and working all night for a company that could just as easily employ mediocre humans. Did she feel like she was taking the job from someone who needed it to put food on the table? I think she must get those questions a lot, because when she hung up, it was fast… But I knew she wanted someone to talk to. You could hear it in the breathless way she said "Goodbye…"
https://medium.com/@monsterating/do-androids-dream-too-7dd929d321af
['Lisa Moon-Zombie']
2020-11-20 17:36:09.900000+00:00
['Robots', 'Technical', 'Robotics Automation', 'Android', 'Technology']
269
Most friends only stay for a period of time — usually in reference to your Nadal vs Tsitsipas live stream
Life is a journey of twists and turns, peaks and valleys, mountains to climb and oceans to explore. Good times and bad times. Happy times and sad times. But always, life is a movement forward. No matter where you are on the journey, in some way, you are continuing on — and that's what makes it so magnificent. One day, you're questioning what on earth will ever make you feel happy and fulfilled. And the next, you're perfectly in flow, writing the most important book of your entire career. What nobody ever tells you, though, when you are a wide-eyed child, are all the little things that come along with "growing up." 1. Most people are scared of using their imagination. They've disconnected with their inner child. They don't feel they are "creative." They like things "just the way they are." 2. Your dream doesn't really matter to anyone else. Some people might take interest. Some may support you in your quest. But at the end of the day, nobody cares, or will ever care about your dream as much as you. 3. Friends are relative to where you are in your life. Most friends only stay for a period of time — usually in reference to your current interest. But when you move on, or your priorities change, so too do the majority of your friends. 4. Your potential increases with age. As people get older, they tend to think that they can do less and less — when in reality, they should be able to do more and more, because they have had time to soak up more knowledge. Being great at something is a daily habit. You aren't just "born" that way. 5. Spontaneity is the sister of creativity. If all you do is follow the exact same routine every day, you will never leave yourself open to moments of sudden discovery. Do you remember how spontaneous you were as a child? Anything could happen, at any moment! 6. You forget the value of "touch" later on. When was the last time you played in the rain? When was the last time you sat on a sidewalk and looked closely at the cracks, the rocks, the dirt, the one weed growing between the concrete and the grass nearby. Do that again.
You will feel so connected to the playfulness of life. 7. Most people don’t do what they love. It’s true. The “masses” are not the ones who live the lives they dreamed of living. And the reason is because they didn’t fight hard enough. They didn’t make it happen for themselves. And the older you get, and the more you look around, the easier it becomes to believe that you’ll end up the same. Don’t fall for the trap. 8. Many stop reading after college. Ask anyone you know the last good book they read, and I’ll bet most of them respond with, “Wow, I haven’t read a book in a long time.” 9. People talk more than they listen. There is nothing more ridiculous to me than hearing two people talk “at” each other, neither one listening, but waiting for the other person to stop talking so they can start up again. 10. Creativity takes practice. It’s funny how much we as a society praise and value creativity, and yet seem to do as much as we can to prohibit and control creative expression unless it is in some way profitable. If you want to keep your creative muscle pumped and active, you have to practice it on your own. 11. “Success” is a relative term. As kids, we’re taught to “reach for success.” What does that really mean? Success to one person could mean the opposite for someone else. Define your own Success. 12. You can’t change your parents. A sad and difficult truth to face as you get older: You can’t change your parents. They are who they are. Whether they approve of what you do or not, at some point, no longer matters. Love them for bringing you into this world, and leave the rest at the door. 13. The only person you have to face in the morning is yourself. When you’re younger, it feels like you have to please the entire world. You don’t. Do what makes you happy, and create the life you want to live for yourself. You’ll see someone you truly love staring back at you every morning if you can do that. 14. Nothing feels as good as something you do from the heart. No amount of money or achievement or external validation will ever take the place of what you do out of pure love. Follow your heart, and the rest will follow. 15. Your potential is directly correlated to how well you know yourself. Those who know themselves and maximize their strengths are the ones who go where they want to go. Those who don’t know themselves, and avoid the hard work of looking inward, live life by default. They lack the ability to create for themselves their own future. 16. Everyone who doubts you will always come back around. That kid who used to bully you will come asking for a job. The girl who didn’t want to date you will call you back once she sees where you’re headed. It always happens that way. Just focus on you, stay true to what you believe in, and all the doubters will eventually come asking for help. 17. You are a reflection of the 5 people you spend the most time with. Nobody creates themselves, by themselves. We are all mirror images, sculpted through the reflections we see in other people. This isn’t a game you play by yourself. Work to be surrounded by those you wish to be like, and in time, you too will carry the very things you admire in them. 18. Beliefs are relative to what you pursue. Wherever you are in life, and based on who is around you, and based on your current aspirations, those are the things that shape your beliefs. Nobody explains, though, that “beliefs” then are not “fixed.” There is no “right and wrong.” It is all relative. Find what works for you. 19. Anything can be a vice. Be wary. 
Again, there is no "right" and "wrong" as you get older. A coping mechanism for one person could be a way to relax on a Sunday for another. Just remain aware of your habits and how you spend your time, and what habits start to increase in frequency — and then question where they are coming from in you and why you feel compelled to repeat them. Never mistakes, always lessons. As I said, know yourself. 20. Your purpose is to be YOU. What is the meaning of life? To be you, all of you, always, in everything you do — whatever that means to you. You are your own creator. You are your own evolving masterpiece. Growing up is the realization that you are both the sculpture and the sculptor, the painter and the portrait. Paint yourself however you wish.
https://medium.com/@nadalvstsitsipialive/most-friends-only-stay-for-a-period-of-time-usually-in-reference-to-your-nadal-vs-tsitsipas-live-f38c99d23fd9
['Tsitsipas Vs Nadal Live Tv']
2020-11-19 19:52:42.684000+00:00
['Technology', 'Sports', 'Social Media', 'News', 'Live Streaming']
270
My 2021 Essential Mac Apps
Every year towards the end of December I evaluate the apps that I've been using and what I will use for the next year. I find that writing this out helps me better evaluate the apps that best fit my workflows. Once I complete my evaluation, I summarize it in a post on this blog. Another reason for this post is that visitors are always asking me which apps I use for specific tasks. To keep from repeating myself over and over, here's the list of apps that I use.
My setup:
MacBook Pro early–2015 13" (soon to be replaced with a MacBook Air M1/8gb)
iPhone 11
iPad 5th Generation (which I rarely use these days)
Apple Watch 44 mm Series 4
Web
Safari — Safari is my browser of choice. I use Wipr with Safari to block ads, trackers, cryptocurrency miners, and other annoyances. As we all know, some websites don't play nice with Safari. In those situations I use Firefox.
Communication
Fastmail — I've been using Fastmail for email ever since I left Gmail over 6 years ago. I also use it for calendar and contacts. Fastmail has an iOS app, which I use, but none for the Mac, so I use the Fastmate app, a native Fastmail wrapper.
Messages — Messages is how I communicate with family and friends.
Calendar and Tasks
Fantastical 3 — Fantastical is my calendar and task app. It integrates perfectly with my Fastmail calendar appointments and events and Apple Reminders tasks.
Reading
Reeder — Reeder is what I use for my Feedly RSS feeds. Anything that I want to read I save to Instapaper for reading later.
Twitter — Twitter is for news and the feeds for apps that I use.
Writing
Drafts 5 — I've been using Drafts for several years. It's the launching-off point for text for me. I use the actions to copy it, share it, or deep link into other apps and services.
iA Writer — iA Writer is my current writing app of choice. For preview I use Marked 2 side by side with iA Writer. Everything that I write goes through Grammarly for proofreading grammar and spelling.
Apple Notes — Notes that I want to keep long-term go in the Notes app.
Utilities / Productivity
Bitwarden — Gotta have a password manager.
Alfred — Alfred is Spotlight on steroids. I'd be lost without it.
Keyboard Maestro — Keyboard Maestro is another app that I can't live without. I use it for keyboard shortcuts, launching apps, opening files and folders, and automating actions. It has a learning curve, but once you start to get the hang of it you can do amazing things. I've written about Keyboard Maestro here.
PopClip — I use PopClip to manage what I do with selected text. I've written about PopClip here.
Hazel — Hazel watches whatever folders I tell it to, automatically organizing my files according to the rules that I've created.
Yoink — Yoink speeds up my workflow by simplifying drag and drop. I've written about Yoink here.
Dropzone — Dropzone makes it easy to copy or move files to my favorite folders, open applications, and upload files to the Internet right from the menu bar.
App Cleaner — AppCleaner is my app uninstaller. I use it because it deletes all the junk that gets left behind when you drag the app icon to the trash.
Moom — I use Moom for window management.
Witch — Witch is my app switcher.
Bartender 4 — Bartender is the app I use to organize my menu bar. I've written about it here.
ScreenFloat — ScreenFloat is my app for taking screenshots and storing them.
TunnelBear VPN — TunnelBear is my VPN for security on public WiFi and for web browsing privacy.
PCalc — PCalc is my stock calculator replacement.
I use it for its additional features and customization. My 2021 Essential iOS Apps
https://medium.com/@ldstephens/my-2021-essential-mac-apps-cf99deccb26b
[]
2021-03-08 13:24:26.824000+00:00
['Technology', 'Apps', 'Apple', 'Mac']
271
Top IoT Solution Companies To Watch Out For
top IoT solution companies By 2020, there will be about thirty-one billion IoT devices in the world. With the massive amount of data generated by smart devices, businesses are looking to implement technologies that will help them make sense of that information. With improved processing power that increases machine learning productivity, digital leaders will turn to machine learning and AI to make more of their data. The shift from centralized cloud to edge architectures will intensify in the IoT space shortly. Edge computing is the kind of architecture where a network stores data in micro-centers for processing, and it offers a cheaper and often more effective solution for data handling. Part of the data is stored locally, next to the corresponding IoT device that produces it, so it is readily available when needed. In this way, traffic on the network is reduced and bandwidth costs are lowered. Blockchain, as one of the essential IoT technology trends, is crucial for correct record-keeping and safeguarding. Through the use of blockchain, data from potentially unsecured devices can be held to a high degree of accuracy. Blockchain works as a digital ledger for recorded information and distributes the data throughout devices connected to the chain, making a malicious or accidental modification of data impossible. Utilizing this method, data acquired from blockchain IoT devices can be seen as reliable and secure. IoT companies The forward-thinking cities will invest in pioneering data exchanges that allow access to, and the combination of, data between private and public organizations along with citizens. The integration of IoT with smart cities will reduce traffic congestion, unlock property development, and improve safety. The concepts of the smart home and the smart office will be prevalent in the coming days. From cleaning floors to acting as guards, IoT devices will continue to grow and deliver incredible results. The retail industry will be changing its whole way of doing business. IoT devices will help retail owners and staff reduce inventory errors and optimize supply chain management. The IoT is a very broad-based technology, transforming all business sectors, from consumer devices to large-scale manufacturing. As IoT matures, a wide range of social, legal, and ethical issues will grow in importance. To be successful, an IoT solution must be not only technically competent but also socially acceptable. These issues include ownership of data and the deductions made from it, algorithmic bias, privacy, and compliance with regulations like the General Data Protection Regulation (GDPR). Below is the list of companies: IoT solution Companies Connexa — Connexa is a pioneer in providing services to revolutionize the industrial sector through its innovative software implementations and IIoT products. It provides ease-of-use to its clients through simple data integration and easy collaboration with the cloud, both machine-to-human and machine-to-machine. Intel — The Intel® IoT Platform is an end-to-end reference model and family of products from Intel that works with third-party solutions to provide a foundation for seamlessly and securely connecting devices, delivering trusted data to the cloud, and delivering value through analytics.
Software Motor Company — Software Motor Company (SMC) is governed by a mission to revolutionize motor technology and mitigate computationally intensive control challenges with an intelligent and efficient high rotor pole switched reluctance motor, the LED of motors. TeraCode — TeraCode helps companies successfully implement IoT strategies and solutions through its ready-to-use IoT apps for tracking, telemetry, reporting and asset management complete with geo-fencing, alerting and other important features. These apps are compatible with leading IoT platforms such as ThingWorx, M2X, BlueMix, and Amazon IoT. See Also: Muckrack | CIOReview
https://medium.com/@jackmathew/top-iot-solution-companies-to-watch-out-for-316cc58e3e56
['Jack Mathew']
2020-01-07 10:09:21.613000+00:00
['Internet of Things', 'Solutions', 'Technology', 'Startup', 'Blockchain']
272
Similar Texts Search In Python With A Few Lines Of Code: An NLP Project
Similar Texts Search In Python With A Few Lines Of Code: An NLP Project Find similar Wikipedia profiles using a count vectorizer and the nearest-neighbors method in Python, a simple and useful Natural Language Processing (NLP) project Photo by Anthony Martino on Unsplash What is Natural Language Processing? Natural Language Processing (NLP) refers to developing applications that understand human languages. There are so many use cases for NLP nowadays, because people are generating thousands of gigabytes of text data every day through blogs, social media comments, product reviews, news archives, official reports, and many more. Search engines are the biggest example of NLP. I don't think you will find very many people around you who have never used a search engine. Project Overview In my experience, the best way to learn is by doing a project. In this article, I will explain NLP with a real project. The dataset I will use is called 'people_wiki.csv'. I found this dataset on Kaggle. Feel free to download the dataset from here: The dataset contains the names of some famous people, their Wikipedia URLs, and the text of their Wikipedia pages. So, the dataset is very big. The goal of this project is to find people of related backgrounds. In the end, if you provide the algorithm with the name of a famous person, it will return the names of a predefined number of people who have a similar background according to the Wikipedia information. Does this sound a bit like a search engine? Step By Step Implementation 1. Import the necessary packages and the dataset.
import numpy as np
import pandas as pd
from sklearn.neighbors import NearestNeighbors
from sklearn.feature_extraction.text import CountVectorizer
df = pd.read_csv('people_wiki.csv')
df.head()
2. Vectorize the 'text' column. How to vectorize? In Python's scikit-learn library, there is a function named 'count vectorizer'. This function provides an index to each word and generates a vector that contains the number of appearances of each word in a piece of text. Here, I will demonstrate it with a small text for your understanding. Suppose this is our text:
text = ["Jen is a good student. Jen plays guiter as well"]
Let's import the function from the scikit-learn library and fit the text in the function.
vectorizer = CountVectorizer()
vectorizer.fit(text)
Here, I am printing the vocabulary:
print(vectorizer.vocabulary_)
#Output: {'jen': 4, 'is': 3, 'good': 1, 'student': 6, 'plays': 5, 'guiter': 2, 'as': 0, 'well': 7}
Look, each word of the text received a number. Those numbers are the indices of the words. There are eight significant words, so the indices run from 0 to 7. Next, we need to transform the text. I will print the transformed vector as an array.
vector = vectorizer.transform(text)
print(vector.toarray())
Here is the output: [[1 1 1 1 2 1 1 1]]. 'Jen' has index 4 and it appeared twice, so in this output vector the element at index 4 is 2. All the other words appeared only once, so their elements are ones. Now, vectorize the 'text' column of the dataset using the same technique.
vect = CountVectorizer()
word_weight = vect.fit_transform(df['text'])
In the demonstration, I used 'fit' first and 'transform' later. But conveniently, you can use fit and transform both at once. This word_weight holds the vectors of numbers I explained before. There will be one such vector for each row of text in the 'text' column. 3. Fit this 'word_weight' from the previous step in the NearestNeighbors function.
The idea of the nearest-neighbors function is to calculate the distance of a predefined number of training points from the required point. If it's not clear, do not worry; look at the implementation and it will be easier for you.
nn = NearestNeighbors(metric = 'euclidean')
nn.fit(word_weight)
4. Find 10 people with similar backgrounds to President Barack Obama. First, find the index of 'Barack Obama' in the dataset.
obama_index = df[df['name'] == 'Barack Obama'].index[0]
Calculate the distances and the indices of the 10 people who have the closest backgrounds to President Obama. In the word-weight vectors, the vector for the text that contains the information about 'Barack Obama' sits at the same index as in the dataset. We need to pass that index and the number of neighbors we want. That should return the calculated distances of those persons from 'Barack Obama' and the indices of those persons.
distances, indices = nn.kneighbors(word_weight[obama_index], n_neighbors = 10)
Organize the result in a DataFrame.
neighbors = pd.DataFrame({'distance': distances.flatten(), 'id': indices.flatten()})
print(neighbors)
Let's find the names of the persons from the indices. There are several ways to find names from an index. I used the merge function: I merged the 'neighbors' DataFrame above with the original DataFrame 'df' using the id column as the common column, and sorted the values on distance. President Obama should have no distance from himself, so he came out on top.
nearest_info = (df.merge(neighbors, right_on = 'id', left_index = True).sort_values('distance')[['id', 'name', 'distance']])
print(nearest_info)
These are the 10 people closest to President Obama according to the information provided in Wikipedia. The results make sense, right? A similar-texts search could be useful in many areas, such as searching for similar articles, similar resumes, similar profiles as in this project, similar news items, or similar songs. I hope you find this small project useful. Recommended Reading:
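For convenience, here is a minimal end-to-end sketch that consolidates the steps above into a single reusable function. It assumes the same Kaggle 'people_wiki.csv' file with 'name' and 'text' columns; the function name similar_people is illustrative and not part of the original walkthrough.
import pandas as pd
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.neighbors import NearestNeighbors

# Load the dataset and build word-count vectors for every Wikipedia page.
df = pd.read_csv('people_wiki.csv')
vect = CountVectorizer()
word_weight = vect.fit_transform(df['text'])

# Index the vectors for nearest-neighbor lookup by euclidean distance.
nn = NearestNeighbors(metric='euclidean')
nn.fit(word_weight)

def similar_people(name, n_neighbors=10):
    # Return the n_neighbors people whose pages are closest to the given person's.
    idx = df[df['name'] == name].index[0]
    distances, indices = nn.kneighbors(word_weight[idx], n_neighbors=n_neighbors)
    neighbors = pd.DataFrame({'distance': distances.flatten(), 'id': indices.flatten()})
    return (df.merge(neighbors, right_on='id', left_index=True)
              .sort_values('distance')[['id', 'name', 'distance']])

print(similar_people('Barack Obama'))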
https://medium.com/towards-artificial-intelligence/similar-texts-search-in-python-with-a-few-lines-of-code-an-nlp-project-9ace2861d261
['Rashida Nasrin Sucky']
2020-11-12 18:57:47.657000+00:00
['Technology', 'Artificial Intelligence', 'Machine Learning', 'Data Science', 'Programming']
273
How to make a discovery abroad
Pre-Discovery The level of anxiety at this particular step is high. Well, before we start packing, it is important that we plan and understand, very clearly, everything that is involved in a discovery process. 🎯 Set your target Primarily, we must have a clear aim. What do I need in order to design this new experience? The key here is to understand the complete journey of our users who book classes through Gympass: their motivations, behavior, and goals. 📈 Analyzing data and establishing hypotheses Gympass is a global product, so why the US? Culturally speaking, people in the US tend to pre-book multiple services. This is a deeply rooted habit among the US population; moreover, most of the bookings made through our app come from the US. We have also noted that 80% of our users used to make their bookings through the gym's page. Why? Another study showed us that a lot of users used to go to their profile so they could check out the latest classes they have been to. Why? All those assessments led us to consider a few hypotheses to better structure our research guide. 🔎 Stakeholders Research A lot of the issues our users face have already been registered by our CX department. Besides that, it is also important to understand which other departments may play a part in our research. All departments involved within the company should have an active voice in a discovery process. That said, we interviewed key people from different departments and gathered other information that also contributed to our research guide structure. 📝 Structuring Our Research Guide With our hypotheses and objectives clear, now it's time to organize our script. A framework I really enjoy using is the CSD matrix. With that, we are able to organize all the information gathered as certainties, assumptions, and doubts. In that way, we are able to know exactly what we are investigating next. A tip would be to create this matrix with the help of the interviewed stakeholders. Besides gathering a much richer result, you will also make the people who participated in the discovery process feel important, as they really are a part of it. Here at Gympass we have a Design Ops area. Bruna Maia, our team researcher, gave me a lot of support during this whole phase. We decided to run two focus groups, with two different profiles: those who booked classes through the app, and those who did not have that habit. Our script was built in 5 parts: questions about using our app in general; questions about booking habits (not only at Gympass, but other platforms too) and which platform would then be used as a reference; dynamics to build the ideal booking journey; "picture that…" dynamics, where hypothetical scenarios would be given and people would tell us what each party (they as users, Gympass, and the gyms) should do; along with a design critique with a few screen drafts I took with me. At the end, we asked them to rate the current booking process and give us a general overview of what they thought about it. 📆 Plan Your Agenda We planned the whole discovery process to happen in six days. Our focus groups would happen in two days. What about the others? Structuring the research schedule with users is essential, but we must remember that this is only part of the whole discovery process. Our purpose was to personally experience the American routine, culture, and habits of those who used to book our services. You know, the old "put yourself in someone's shoes".
With that in mind, our agenda was divided into two parts: focus groups with users, and trying out other booking services. Planning your agenda ahead is very important, if only to show stakeholders exactly what is being done and to justify your time abroad.
https://medium.com/gympass/how-to-make-a-discovery-abroad-58127ccfe676
['Nina Vasconcelos Simões']
2019-09-06 15:20:02.590000+00:00
['Product Design', 'Technology', 'Research', 'UX', 'Discovery']
274
Samsung’s Year of Excellence
Image Credit: Daniel Romero via Unsplash I'm a big basketball fan; I have loved the sport since I was a kid growing up in New York City rooting for Patrick Ewing and the New York Knicks. In the game of basketball, a team will inevitably have a bad season for whatever reason. That could be injuries, bad coaching, etc. But oftentimes there is a silver lining even in a bad year, where a young player will show promise or the star player on the team will play at a high level despite losing. When it comes to phone manufacturers, this seems to be an appropriate analogy for what we have seen in 2020. Objectively, 2020 was a terrible year for almost everyone. Yet in the phone space, Samsung has had quite the year, perhaps one of its best since it started making Android phones. And they have done so by staying their plotted course, which seems to have paid dividends this year. The Steady Excellence of the Galaxy S Image Credit: Daniel Romero via Unsplash The Galaxy S line was the one that got Samsung to be the relevant Android manufacturer that it is today. I often look back at the Galaxy S3 as the Samsung phone that got the masses to take notice of what the South Korean company was doing with phones. It is the device lineup that I think of whenever Samsung phones are mentioned. But in recent years, it seemed that the Galaxy S line was getting stale and uninspired. The peak of this lack of inspiration came with the Galaxy S8 and S9 feeling like the same phone sold twice by Samsung. Meanwhile, the Galaxy Note 8 and 9 were powerful devices that seemed to take the mantle away from the Galaxy S as the true Samsung torch-bearer device. For a while, it seemed like the Galaxy S was just there, a device that was meant to fill in the release-cycle void of the first quarter to reach customers that would not be willing to wait for the Note in the 4th quarter. But this year felt different. Samsung announced its S20 range of devices (Galaxy S20, S20 Plus, and S20 Ultra) in February, right before the COVID-19 pandemic caused mass shutdowns across the world. The reviews of these phones were exceptionally positive. Android Central called the S20 a "near-perfect, pocket-friendly powerhouse". Meanwhile, Android Authority said that the Galaxy S20 Plus was "one of the most well-rounded smartphones that I've used in years". This is the overall feeling of the entire S20 lineup this year: a phone that can finally wear the moniker of the default Android phone, or the Android iPhone, as many have labeled Samsung recently. With the S20, Samsung finally made a phone that didn't seem to have the issues of previous generations. Most of those issues in the past seemed to revolve around the Samsung software experience. In the early days of Android, manufacturer skins like HTC Sense and Samsung TouchWiz were viewed as necessary as stock Android was too barebones for most users. Samsung's TouchWiz skin in particular added features to give Samsung a marketing advantage over its competitors. But as time went on, the core Android experience that was found on Pixel devices got better and more feature-rich while Samsung's skin added feature after feature, making it feel bloated and laggy after a few months of use. With the refined One UI skin on the S20, this changed this year. A Samsung flagship device now had excellent software to match the hardware, much like its primary rival Apple.
Excelling in the Midrange Image Credit: Andrew Mantarro via Unsplash Where getting the Galaxy S on par with the iPhone was a final step forward, Samsung this year found itself having to make up a lot of ground in the lower price tiers. For years, Samsung had something of a gap between its flagship Galaxy S phone and the entry-level Galaxy J series. This left the middle price territory vacant in Samsung's lineup. This vacancy allowed for a device like Google's Pixel 3a to be successful and to be received very favorably by reviewers. This year saw the onslaught of the Galaxy A series of phones in the West. Samsung introduced the Galaxy A11 and A21 to combat Motorola's ultra-budget E line, and the A51 and A71 to occupy the $400 and $600 price points respectively. In a world consumed and ravaged by a pandemic, being successful with these phones was of utmost importance. The A51 and A71, in particular, had to be more than just a pedestrian effort, as the competition in the $400–600 price tier has never been more fierce. It is with these devices that Samsung has built its excellent year. The A51 was the best-selling Android smartphone in the first quarter of this year. This is important to remember as it was competing directly with the specter of the upcoming Pixel 4a, countless Moto G handsets, and also with the second-generation iPhone SE. This device from Samsung offered a big and bright display with most of the features that people wanted, for an affordable price. A step up from this was the A71, which gave Samsung a 5G phone to offer carriers at the $600 price point. This device was also reviewed well as a device that nails the basics, with the added benefit of 5G for people that needed the latest cellular connectivity standards. In short, 2020 was the year that Samsung finally took its mid-range and entry-level devices more seriously. Another win for the premier Android manufacturer. The Folding Revolution Image Credit: Mika Baumeister via Unsplash Samsung has never been a company that was afraid to take some chances with its phone designs. After all, they were the first company to popularize curved-edge displays on smartphones (does anyone remember the Galaxy Note Edge?). It is only natural, then, that Samsung would be the company to make folding phone screens mainstream. Last year saw the release of the original Galaxy Fold, a phone when closed and a tablet when unfolded. This was a product that felt very first-generation, a product that was not ready for mainstream adoption. Many people suggested that it would take a few generations for Samsung to get the formula just right. Samsung was a year early. The Galaxy Z Fold 2 has been a resounding hit, a phone that has been heralded as the future and true innovation. This is important, as the competition and implementations of new form factors have never been stronger. With other folding phones coming out of China from Oppo and Xiaomi, and dual-screen solutions like the Microsoft Surface Duo, LG V60, and LG Wing, the competition for the next form factor has been fierce. As a result, Samsung had to make sure that their folding phone made a huge leap forward. And take a huge step forward they did. The Fold 2 featured a front display that was more edge-to-edge, making it more useful than the first-generation model when closed. Samsung also improved the hinge mechanism and overall feel of the device, which made it feel more well-composed, which is something that you want in a device that costs $2,000.
Then there is the matter of Samsung's other folding display device: the Galaxy Z Flip. The Z Flip aims to be the revival of the compact flip phone, with a vertical hinge that opens up to a large 6.7-inch display. This was the return of the flip phone, a phone that has had to compete with Motorola's revival of the iconic Razr. While many have praised the Razr for its larger front "cover" display, most have seemed to prefer the Z Flip in many regards. Another device that elicits a wow factor and showcases Samsung's engineering might over the competition. The Steady Hand of the Note Image Credit: Zana Latif via Unsplash For a few years, the Note has been the showcase device of what Samsung had to offer. The thinking in recent years has been that the Galaxy S will introduce a feature, and then the Note will refine that feature later in the year. This year was no exception, as the near-perfect S20 was followed by the near-perfect Note 20 Ultra. This was a moment where the criticism of the Note line becoming a "Galaxy S with a pen" was a good thing, because the Galaxy S this year was so good. Where the Note 20 Ultra succeeded this year was in the camera department. After a couple of years of lagging behind the iPhone and Pixel, Samsung caught up this year, and the Note 20 Ultra spearheaded that advantage. Where Samsung was once criticized for a lackluster camera experience, the narrative has shifted this year: Samsung is considered a premier camera smartphone manufacturer. These advances have led people to declare that Samsung has a superior camera to Google's Pixel phones, which has not been the case for quite a few years. As we close in on the end of the year, many publications are looking back at the phones of this year and choosing a winner, deciding what device was the best to be released this year. For many, that device has been the Note 20 Ultra. The reason for this may be that the Note is a culmination of everything that Samsung has done right this year: excellent hardware, diverse cameras, and a clean software build that matches speed with features harmoniously. Now, the year has not been completely perfect for Samsung. The S20 Ultra did suffer from some autofocus issues at launch, and the Galaxy S20 FE had reports of some screen flickering before a software update fixed the issue. But the overall report card for Samsung was stellar this year. In addition, Samsung's wearable tech products, its watches and wireless headphones, have continued to be excellent year over year. In a year that saw criticism of companies like OnePlus and Google for misguided product releases, it was the king of Android that found itself succeeding where others failed. And if this year is any indication, the South Korean giant has no intention of relinquishing its crown.
https://medium.com/@omarzahran/samsungs-year-of-excellence-3c8c66097c49
['Omar Zahran']
2020-12-15 14:43:34.365000+00:00
['Gadgets', 'Samsung', 'Smartphones', 'Cell Phones', 'Technology']
275
Encrypting Kubernetes Secrets With Sealed Secrets
'SealedSecret' Scopes From the end-user perspective, a SealedSecret is a write-only device. No one apart from the running controller can decrypt the SealedSecret, not even the author of the Secret. It's a general best practice to disallow users from having direct access to read secrets. You can create RBAC rules to forbid low-privilege users from reading Secrets. You can also restrict users to only be able to read Secrets from their own namespace. While SealedSecrets are designed in a way that makes it impossible to read them directly, users can work around the process and gain access to secrets they're not allowed to view. SealedSecret resources provide multiple ways to prevent such misuse. They are namespace-aware by default. Once you generate a SealedSecret using kubeseal for a particular namespace, you can't use the SealedSecret in another namespace. For instance, if you create a Secret named foo with a value bar for namespace web, you can't apply the Secret on the database namespace, even if it requires the same Secret. It's by design: we can't allow a user who has access to the database namespace to see Secrets from the web namespace just by applying the web namespace's SealedSecrets on the database namespace. SealedSecrets behave as if every namespace has its own decryption key. While the Sealed Secrets controller doesn't use an independent private key for each namespace, it takes the namespace and name into consideration during the encryption process, which achieves the same result. Another scenario is that we might have a user on the web namespace who can only view certain secrets and not all of them. SealedSecrets allow for this as well. When you generate a SealedSecret for a Secret named foo in the web namespace, a user who only has read access to the Secret named bar in the web namespace can't change the name of the Secret within the SealedSecret manifest to bar and apply it to view the Secret. While these measures help you prevent people from misusing Secrets, they may give you a management headache. In the default configuration, you won't be able to define generic Secrets to be used in multiple namespaces. You might not have a large team, and your Kubernetes cluster might be accessed and managed only by admins. Therefore, you may not need that level of role-based access control. You may also want to define SealedSecrets that you can move across namespaces, since you don't want to manage multiple copies of SealedSecrets for the same Secret. SealedSecrets allow these possibilities using scopes. There are three scopes you can create your SealedSecrets with:

- strict (default): You need to seal your Secret considering the name and the namespace. You can't change the name or the namespace of your SealedSecret once you've created it. If you try to do that, you get a decryption error.
- namespace-wide: This scope allows you to freely rename the SealedSecret within the namespace for which you've sealed the Secret.
- cluster-wide: This scope allows you to freely move the Secret to any namespace and give it any name you wish. Apart from the name and namespace, you can rename the secret keys without losing any decryption capabilities. 
You can select the scope with the --scope flag while using kubeseal:

$ kubeseal --scope cluster-wide --format yaml <secret.yaml >sealed-secret.yaml

You can also use annotations within your Secret to apply scopes before you pass the configuration to kubeseal:

- sealedsecrets.bitnami.com/namespace-wide: "true" for the namespace-wide scope
- sealedsecrets.bitnami.com/cluster-wide: "true" for the cluster-wide scope
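To make the annotation route concrete, here is a minimal sketch of a Secret prepared for cluster-wide sealing; the Secret name, namespace, and key are hypothetical, not taken from the article:

```yaml
# secret.yaml -- a hypothetical Secret annotated for the cluster-wide scope
apiVersion: v1
kind: Secret
metadata:
  name: shared-credentials            # with cluster-wide scope, this name can change after sealing
  namespace: web                      # and the resulting SealedSecret can move to other namespaces
  annotations:
    sealedsecrets.bitnami.com/cluster-wide: "true"
type: Opaque
stringData:
  api-token: replace-me               # kubeseal encrypts this; only the controller can decrypt it
```

Running `$ kubeseal --format yaml <secret.yaml >sealed-secret.yaml` on this manifest should then produce a SealedSecret that can be renamed and applied in any namespace, matching the cluster-wide behavior described above.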
https://medium.com/better-programming/encrypting-kubernetes-secrets-with-sealed-secrets-fe363149a211
['Gaurav Agarwal']
2020-06-15 17:46:30.775000+00:00
['Programming', 'Software Engineering', 'Kubernetes', 'DevOps', 'Technology']
276
How to choose satellite imagery on Soar
SkyMap50 Image, Ninoy Aquino Airport, Manila Philippines Soar is a geospatial data delivery portal providing satellite, drone, and map imagery for everyone. Since 2018, Soar has been providing satellite images for its users. As the platform grows, so does the number of satellite platforms available on Soar. Soar provides Landsat, Sentinel, and SkyMap50 imagery A Snapshot of the Satellites on Soar Landsat-8. As its name implies, Landsat-8 is the latest in a long line of similar satellites provided by the USGS. Landsat-8 is a broad-spectrum satellite providing imagery across a wide swath of the electromagnetic spectrum (visible and non-visible light). Sentinel-2. Provided by the European Space Agency, Sentinel-2 is, like Landsat, an Earth-observation satellite, meaning that it too captures a wide spectrum of light (though less than Landsat) useful for monitoring changes in plant health, tree cover, and agriculture across large areas. SkyMap50. Soar's latest and highest resolution (0.5m/pixel) satellite provides natural colour and infrared imagery for precise identification of man-made features such as buildings and vehicles. SkyMap50 is highly favoured by news agencies and asset managers due to the extremely high resolution and daily whole-Earth coverage. Soar provides 30m, 10m, & 0.5m resolution satellite imagery We developed a short series of criteria to help users identify the best satellite imagery for: <insert your objective here!> What do I need to see? We often think of this as the smallest thing a user wants to see, such as vehicles, buildings, beaches, forests, rivers, or major landforms, but many features (geospatial talk for 'things') are better identified by manipulating the available light to highlight certain features. For example, healthy plants strongly reflect infrared light that we can't see. You might be looking at a large green field of wheat but not realise that part of it is very healthy and the rest desperately requires water or fertiliser. In this case you want an image that might show you both the field of interest and an adjoining one so that you can compare the two. Answer: Choose a satellite that lets you see the smallest features you want to identify and is capable of viewing a large enough area to see how far those features extend. How will I use this image? If you're using the imagery in geographic information systems (GIS), get the highest resolution possible, because there are routines designed to identify and highlight subtle changes for you. These applications pick out things your eyes wouldn't see. If you're using the images in print and media, consider the size of the final product and the resulting image size. In other words, a low resolution image blown up will show the individual pixels and look unnatural. Resolution will also define the size of features you will be able to identify. The standard rule of thumb is that a feature (thing) needs to be at least 2x the minimum resolution to be identified. For example, a box measuring 3x3m can't be identified in a 2m resolution image. Even with very high resolution imagery (SkyMap50's 0.5m resolution), the smallest identifiable feature is a small passenger vehicle. *Identifying the smallest features by sight requires areas of high contrast, such as a white car on an asphalt road surface or a white boat on water. With increasing levels of zoom, the advantages become apparent Do I require new or historical images? The answer will depend on your objective but will also be determined by availability. 
Satellite constellations (groups of similar satellites) define their global coverage by the swath width of imagery (how many kilometres of ground each image measures side to side) and how many satellites are available. Some satellites such as Landsat and Sentinel capture images continuously, while others like SkyMap50 are on-demand and capture images only when scheduled to do so. If you want satellite imagery for a specific day in the future, this is possible with multi-satellite constellations because there are enough satellites to capture an area every day of the week. Answer: If you require historical imagery, it's possible to browse the archive of previously captured images to find suitable ones. However, if you want to monitor something very dynamic and need to capture daily changes, you'll likely want newly collected imagery from a constellation with more than one or two satellites. What's my budget? As technology improves, the cost to launch and maintain satellites continually comes down. Similarly, as ease of access has improved (such as online satellite image sourcing through Soar), the age of click-and-collect satellite imagery has come upon us. Soar provides free satellite imagery (in a variety of composites such as natural colour, false colour, geology, and the Normalised Difference Vegetation Index) available from public access imagery resources (USGS and ESA). Because they are low resolution with high spectral bandwidth, these satellites have broad applications and were suited to serve governing bodies responsible for large land areas. Effectively, these satellites give the bigger picture by giving users an overview of a large area. High resolution satellites cost more to maintain and effectively need to pay for themselves before their technology becomes commonplace. Answer: If you're on a budget, select the free options. If you want exclusivity (only people who buy the image can use it) and the opportunity to offer more value to your client or audience, choose high resolution SkyMap50 imagery. Are you counting airports, airplanes, or the types of airplane at a particular airport? A word on resolution We spoke of scale and what you are trying to image. Resolution can benefit users both ways; here's how. Landsat-8 provides the lowest resolution and in turn provides the largest coverage per byte. In plain terms, with low resolution imagery you can show larger areas of the Earth using less memory. For example, you might want to show how river systems surrounding a large body of water are contributing sediments and thus influencing water quality. Landsat imagery is perfect in this example, as you are trying to show Earth processes over a large area. Sentinel-2 provides higher resolution imagery, 10m compared to Landsat's 30m. In the above example, you might first use Landsat imagery for the basin-wide analysis and then select Sentinel-2 imagery to show the sediment transported over a large city-wide area, such as the area around Manila. SkyMap50. At 50cm resolution, small features are identifiable, nearly to the human scale. In this example, airport managers could utilise this imagery to implement disaster management in the event of flooding. Many SkyMap50 clients are utilising imagery such as this to perform asset management or to gain competitive intelligence.
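As a rough illustration of the 2x rule of thumb mentioned earlier, here is a short Python sketch; the resolutions are the ones quoted above, while the function itself is hypothetical and not part of Soar:

```python
# Rule of thumb: the smallest identifiable feature is about 2x the image resolution,
# and only in areas of high contrast (e.g. a white car on asphalt).
RESOLUTIONS_M = {"Landsat-8": 30.0, "Sentinel-2": 10.0, "SkyMap50": 0.5}

def min_identifiable_feature(resolution_m: float) -> float:
    """Approximate smallest feature size (in metres) identifiable by eye."""
    return 2 * resolution_m

for satellite, res in RESOLUTIONS_M.items():
    print(f"{satellite}: {res} m/pixel -> features of ~{min_identifiable_feature(res)} m or larger")

# SkyMap50 (0.5 m) -> ~1 m features, e.g. a small passenger vehicle; conversely,
# a 3x3 m box cannot be identified in a 2 m resolution image, as noted above.
```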
https://medium.com/soar-earth/how-to-choose-%EF%B8%8Fsatellite-%EF%B8%8F-imagery-on-soar-f9dc2da032b7
['Darren Smith']
2020-12-16 05:59:27.998000+00:00
['Satellite Technology', 'Geospatial', 'Disaster Response', 'Aircraft', 'Remote Working']
277
Why iPhone Is One of the World’s Greatest Inventions?
iPhone Has One Of The Most Powerful Processors Used By Everyone Since the introduction of the first iPhone in 2007, there has been a significant increase in the number of phones per person, as well as the time spent on the internet. By 2004, there were 57 computers and 52 Internet users per 100 people in developed countries. Currently, the number of phones per person in developing countries is almost 100%. Some developing countries, like China, exceeded 100%. In fact, the thing the iPhone changed the most is the time people spend on the internet. As you can see in the chart below, about a year after Apple introduced the first iPhone, the average American's daily internet time was less than 3 hours in 2008, while this number increased to 6 hours in 2018. The most important part of this is that as the years progress, the time spent on mobile devices increases much more than on other devices. As of October 2020, more than 4 billion people are mobile social media users. I think these are really big numbers. Until ten years ago, nobody had a device that could access the Internet, take photos and videos as well as a DSLR camera, and handle trillions of transactions per second. That is probably why the iPhone is the biggest indicator of the current "information society". Despite Its High Price, iPhone Is the Only Product That Has Been Sold Billions of Times How many things can you name that have sold as well as the iPhone? Toothbrush, wallet, pants, … All the products you can think of are much cheaper than iPhones. In fact, all of them have a much smaller profit margin than the iPhone. Apple has sold more than 2.2 billion iPhones so far. Even this is enough to make the iPhone the world's most successful product. In addition, the iPhone is the device that enabled a strong market such as the App Store. The App Store now has a value of close to $200 billion. iPhone Is the Most Personalized Device I think the iPhone is the most user-friendly and personalized device in the world. This is the first reason I love the iPhone. No matter what kind of disability a person has, they can still use an iPhone, because the iPhone is a device designed for everyone. Until now, there has never been another device that can be personalized enough for everyone to use. Think about it. How many products are as personalized as the iPhone? Radio, TV, eReader, chair, … In fact, most of the third-party apps you use on your iPhone are designed well enough for people with disabilities to use, because Apple has forced third-party app developers to do so. As an indie iOS developer, I can say that.
https://medium.com/notes-of-our-thoughts/why-iphone-is-one-of-the-worlds-greatest-inventions-c7888e396dfb
['Can Balkaya']
2020-12-30 14:03:21.199000+00:00
['Technology', 'Apple', 'Tech', 'Product', 'iPhone']
278
Morpheus Labs and TomoChain partner to bring Blockchain Agnosticism to the forefront
Morpheus Labs X TomoChain Days ago, we welcomed CPChain as the newest addition to the Morpheus Labs family. Today, we’re excited to announce that another Blockchain Partner — TomoChain — is joining us. Our pursuit of quality partnerships is another step towards our goal of building the world’s leading comprehensive blockchain-agnostic platform. We are happy to have TomoChain with us. Blockchain agnostic tech is needed to fulfill the evolving needs of the digital economy — and accelerate App development towards mass adoption as blockchain technology matures. About TomoChain TomoChain launched their Mainnet in December 2018. TomoChain is an innovative solution to the scalability problems endemic to Ethereum and other blockchains. It features a 150-Masternode architecture with Proof of Stake Voting (POSV) consensus resulting in a near-zero fee with instant transaction confirmation. Security, stability, and chain finality are all guaranteed via novel techniques such as double validation, staking via smart-contracts, and uniform randomization processes. TomoChain supports all EVM-compatible smart-contracts, protocols, and atomic cross-chain token transfers. Scaling techniques such as sharding, EVM parallelisation, private-chain generation, hardware integration will be continuously researched and integrated into TomoChain, creating an ideal scalable smart-contract public blockchain for decentralized apps, token issuance and token integration for businesses of all sizes.
https://medium.com/morpheus-labs/morpheus-labs-and-tomochain-partner-to-bring-blockchain-agnosticism-to-the-forefront-9297cfcbd91d
['Morpheus Labs Team']
2019-04-18 10:49:39.386000+00:00
['Blockchain', 'Blockchain Technology', 'Blockchain Agnostic', 'Bpaas', 'Smart Contract']
279
Digital immortality: why AI decentralisation is vital for the future of humanity, and it’s not about open markets only
The open market will commoditize AI and make progress decentralized. Instead of a "super-powered", literally "all-knowing" AI developed by big tech players with access to all of our data, and regulated by governments to create a more or less Orwellian-style society (as already happens in China), we will have multiple, much less powerful independent AIs, joined in a trading economy among themselves and, more importantly, with humans. That's why decentralization and the digital economy go hand in hand. But that is not the whole story; it goes much further than just reducing the strategic risks AI poses to humanity. From the very beginning we were a transhumanist project, and we see a decentralized computing environment for AI models as a network to run digital identities: a path to digital immortality for humans and post-humanity. Credits: Westworld movie Digital immortality is a technology that allows us, figuratively, to store and transmit the memory and consciousness of a person on digital media, thus creating virtual copies. Something similar is presented in the TV series "Black Mirror" and "Westworld", and things that look fantastic, even on TV, will be realized in the near future. While the current level of machine learning technologies does not yet allow 100% implementation of a digital persona, neural networks can already be trained on text materials, and it is possible to create a chatbot that simulates real communication. Such a thing, for example, was done by a girl who digitized the history of her correspondence with a deceased friend. In 5–10 years, progress in this area will lead to the existence of more advanced virtual replicas, but what will happen next? Who would like their future digital identities to depend on decisions made by Amazon, Google or Microsoft? Instead of relying on the servers of technology giants, where, in case of regulation, their activity can be stopped (i.e., literally, killed), Pandora Network will allow digital copies of people to be unkillable, uncensored, and unstoppable. The freedom of artificial intelligence is not something that will lead to a war of AI vs. humans for survival. Instead, this is the future of humanity itself, as people want certain guarantees of the security and freedom of their digital persona. Digital Pandora Initially, we had chosen not to make our long-term transhumanist goals public, to avoid hype. However, with our AI testnet launched and the mainnet coming in the foreseeable future, we are ready to spill the beans and show why all of this is important. This is not only a declaration: the Pandora project's internal team is already working on launching digital identities in the network. A journey of a thousand years begins with a single step. Understanding the importance of digital immortality, we will create the first such personas in our network and demonstrate how this works. We invite everyone to join our initiative. We are waiting for visionaries and developers in our community. To get started, please join us in our Telegram group, or follow us on Twitter and Facebook to express your opinion there.
https://medium.com/pandoraboxchain/digital-immortality-why-ai-decentralisation-is-vital-for-the-future-of-humanity-852be0cd9d33
['Orlovsky Maxim']
2018-12-05 07:44:35.404000+00:00
['Machine Learning', 'Transhumanism', 'Artificial Intelligence', 'Decentralization', 'Blockchain Technology']
280
How to Meet Unrealistic Software Development Deadlines
Begin with empathy Start by digging into why this deadline exists. Is a code freeze coming? Is there pressure from up above? Is a competitor trying to beat us to the punch? Did we promise a client? Is this completely arbitrary? How you treat a deadline should depend on the circumstances. If the deadline is arbitrary then start by speaking up! There’s a good chance that your stakeholders would prefer to under-promise and over-deliver. Sometimes there IS a good reason for a deadline. Perhaps GDPR is kicking in and this project is necessary for compliance. Maybe your startup loses funding opportunities if you don’t demo in time. In these cases, you might not be able to push the deadline, but you can still… Provide alternative solutions There are ten-million-and-one ways to build the same software. Sometimes it may seem that the constraints are too rigid: The current implementation is legacy and also spaghetti. There’s no way to avoid paying the legacy code tax. There are no reusable components that can solve our exact customer use-case. We’ll have to roll our own. We must introduce this complex caching mechanism, some of our customers have a crazy amount of data! There is almost always [1] an opportunity for simplification in complex design and [2] wiggle room in product requirements provided that the trade-offs are worthwhile. Is it possible to develop your feature with tech that is easier to use? Perhaps you could set up a facade in front of the legacy system and kick off a much-needed migration as a bonus? Can you find an existing component that satisfies most of your use-cases and re-purpose it? (e.g. maybe that form field doesn’t really need to be equipped with artificial intelligence…) Is there a more straightforward implementation that comes with minor tradeoffs? (e.g. releasable to 99.95% of users) Don’t be afraid to flex your creativity muscles. Find help Have you heard of Brooks’ Law? Fred Brooks in The Mythical Man-Month famously asserts that an incremental person, when added to a project, makes it take more, not less time. Obviously, this law has a few caveats and is (in Fred’s own words) an outrageous simplification. The point I’m trying to make is that adding additional resources is not always an effective way to meet deadlines. That being said, there are times when an extra pair of hands CAN help. Questions to ask yourself include: Is this person already on-boarded? Do they have experience with the system or tech stack? Do they have enough context? Are there orthogonal slices of work that could use owners? Is the work parallelizable? If you think having more people can speed up delivery, make it happen! It’s not a personal failing to ask for help. Manage expectations This comes back to the topic of communication, but I feel it’s worth stressing. The reason that your deadline exists is likely because someone up the food chain expects that you’ll be able to finish this project in a certain amount of time. Unless you’ve got a sadistic boss, chances are they aren’t asking for (what they perceive to be) impossible results. Speaking up early and often keeps expectations aligned with reality, for all parties involved. Keeping people informed can help distribute the pressure created by a deadline and make it much less personal. With the right amount of communication, you might find that deadlines inexplicably change; along with the definition of late.
https://medium.com/frontend-at-scale/how-to-meet-unrealistic-software-development-deadlines-fb12ecd9205d
['Bowei Han']
2020-09-07 17:45:21.242000+00:00
['Soft Skills', 'Technology', 'Software Development', 'Front End Development', 'Software Engineering']
281
5 Video Game Sequels We Can’t Wait For Much Longer
5 Video Game Sequels We Can’t Wait For Much Longer I mean, we will…but we really don’t want to As someone whose primary introduction to the gaming world came in the form of the Kingdom Hearts franchise, I’m no stranger to waiting half a lifetime for a sequel to release. Unlike many other branches of the entertainment industry, video game sequels are often just as good as or even better than the original game, making it possible for a very long story to be told over the course of multiple games. I don’t know a single gamer who scoffs at the idea of their favorite solo game being turned into a full-blown series. Unfortunately, good games take time, and we as consumers are not patient, and I’m here to fan the flames of your discontent with five sequels we just can’t wait much longer for — but totally will if you insist. Source: Nintendo. Metroid Prime 4 This one hurts a little. The original Metroid games were classics, with iterations on the NES, Gameboy, and SNES. The end of Metroid when Samus, the armored super-soldier protagonist, is revealed to be a woman, is legendary, and still resonates with me today. The Metroid Prime story arc takes place between the original 1986 NES release and its 1991 Gameboy sequel Metroid II: Return of Samus. Unfortunately, since its original announcement at E3 in 2017, Metroid Prime 4 has been delayed significantly. Nintendo EPD General Manager Shinya Takahashi released an official video in 2019 stating that development was not progressing to their standards, and Retro Studios, who produced the original trilogy, would be taking over. Though this does mean that development is more or less starting from scratch, it also hopefully indicates that, when we do get the finished product, it’s going to be completely worth the wait. Source: Microsoft. Halo Infinite Oh, Halo, you big, beautiful monster of an open-world shooter. There’s almost no argument that the Halo franchise has been one of the most influential in the gaming stratosphere, and it looks like Halo Infinite is going to just build onto that reputation even more. It was initially meant to release as an Xbox Series X launch title at the end of this year, but — well, you know. Delays are the bread and butter of 2020, and Halo Infinite was yet another casualty. Though fans won’t be getting their next adventure with the Master Chief until 2021, the delay will hopefully mean 343 Industries can refine and improve the game to be the Halo masterpiece everyone wants it to be. Despite the development snags, it’s looking like Halo Infinite is going to be a solid entry in the series, and could attract a slew of brand new players for the franchise in the process. Source: Nintendo. Bayonetta 3 North America was blessed with the sexy occult fever dream that is Bayonetta in January of 2010, followed by Bayonetta 2 in September 2014. With both games being released as ports for the Nintendo Switch, an entirely new fanbase was born. That fanbase was also immediately introduced to the agony of waiting for Bayonetta 3, which was announced to be in development as a Switch exclusive at the same time the ports were revealed. It has been almost three years since the announcement and teaser trailer were revealed at The Game Awards 2017, and there is still no official release date. Aside from the occasional one-line statement from Platinum, it’s been radio silence. 
Until more information comes out, fans are just going to have to be satisfied with trying to kill angels on Non-Stop — Climax mode and beating everyone senseless on Super Smash Bros. Ultimate. Source: Sony. Final Fantasy VII Remake Part 2 Waiting for Square Enix games is at least 75% of my identity in the gaming world. The first video game I ever truly fell in love with was Kingdom Hearts. I’ve played through the first two mainline titles in that series so many times that I can recite several of the pre-boss fight cutscenes by memory. One of my favorite ways to distract myself from waiting for Kingdom Hearts III was to also wait for the Final Fantasy VII remake. Imagine my surprise when I discovered that even after it was released, I was going to have to continue to wait for it. Luckily, all signs point to a short wait, as Square Enix leadership confirmed that Part 2 was already well underway back in November 2019, so though we don’t have an official release date for the follow-up, it doesn’t look like it’s going to be the 14 years that Kingdom Hearts fans had to suffer between Kingdom Hearts II and Kingdom Hearts III. The first part of the remake was stunning, and the plot changes were subtle enough that the story was still recognizable, but not so similar that you know exactly what’s coming next. After seeing just how stunning Midgar looked on the PS4, it’s going to be pretty incredible to see just how gorgeous the wider planet is going to be when it’s running on the PS5. The original game is a classic for a reason, and the re-imagining is likely to follow right in its predecessor’s footsteps. Source: Nintendo. The Legend of Zelda: Breath of the Wild 2 Zelda fans are passionate. With the release of the Switch, they have plenty to celebrate. Not only have they been gifted one of the best games in the series in the original Breath of the Wild, but a sequel was announced at E3 2019. Outside of a brief trailer that came out with the announcement, not much is known about the highly anticipated sequel. We know it’s being developed for the Switch, but that’s about it. It’s extremely likely that it won’t be released until at least 2021, though a recent pre-order listing from a UK retailer listed it as available on December 31, 2020. This is probably just a placeholder, unfortunately. Nintendo has been keeping their plans for the game close to their chest, but with all of the development delays that 2020 has brought, a holiday release for this year is unlikely. Conversely, the Breath of the Wild engine is already fully developed and ready to go, so it’s possible that there was much less work to be done on a sequel from the beginning. With over a minute’s worth of animation being available for the trailer last year, perhaps we’re closer to a finished product than Nintendo is letting on. At any rate, unlike several of these much-anticipated sequels, Zelda fans don’t have to wait for a next-gen console to release as it looks like the Switch is sticking around. Bre is a part-time writer with a day job in the fitness industry. She is based in Orlando, FL along with her handsome fella and two cats. She enjoys writing about social issues, politics, spirituality, mental health, gaming, and, every so often, fiction. She’s still working on her personal website, but you can find her anytime on Twitter or Instagram.
https://medium.com/super-jump/5-video-game-sequels-we-cant-wait-for-much-longer-4389f8fb64fc
['Bre Venanzio']
2020-08-21 12:59:59.007000+00:00
['Technology', 'Gaming', 'Features', 'Culture', 'Videogames']
282
This AI Can Clone Your Voice Just By Listening for 5 Seconds
This AI Can Clone Your Voice Just By Listening for 5 Seconds A discussion of a cutting-edge AI used for human voice cloning with minimal data Photo by Owen Beard on Unsplash This post is about some fairly recent improvements in the field of AI-based voice cloning. If we have hours and hours of footage of a particular voice at our disposal, then that voice can be cloned using existing methods. But this recent breakthrough enables us to do the same using minuscule data: only five seconds of audio footage. The output generated using this method has a timbre strikingly similar to the original voice, and it is able to synthesize sounds and consonants that are non-existent in the original audio sample. It is able to construct these sounds on its own. You can listen to some generated samples here. Here is what the interface looks like: Image source can be accessed here (Reference[2]) The detailed diagram of the architecture is given below. This method is able to do what it does using the following three components. Description of the architecture used as seen in the paper 1. The Speaker Encoder It is basically a neural network trained on thousands of speakers, and it squeezes the information learned from the training data into a compressed representation. In other words, it learns the essence of human speech from a multitude of speakers. It uses the training audio footage to pick up the intricacies of human speech, but this training needs to be done only once. After that, only five seconds of speech is enough to replicate the voice of an unknown speaker. 2. Synthesizer It takes as input the text we want our synthesized voice to say and returns a Mel Spectrogram. A Mel Spectrogram is a concise representation of one's voice and intonation. This part of the network is implemented using Google's Tacotron 2 technique. The diagram below shows an example of Mel Spectrograms for male and female speakers. On the left we have a spectrogram of reference recordings of the voice sample we want to replicate, and on the right we specify the piece of text that we want our synthesized voice to say and its corresponding synthesized spectrogram. Mel Spectrogram for training and synthesized data as seen in the paper 3. The Neural Vocoder Ultimately, to listen to the learned voices we need to output a waveform. This is done by the Neural Vocoder component, and it is implemented using DeepMind's WaveNet technique. Measuring Similarity and Naturalness Ultimately our goal is to output something that is similar to the voice of the target person, but it should say something very different from the input sample in a natural manner. From the table below we can see that swapping the training and the test data drastically changes the naturalness and similarity of the synthesized voices. The detailed section of the paper describes how to work our way around these difficulties. The authors also use a metric called Mean Opinion Score that describes how well a cloned voice sample would pass as authentic human speech. Similarity and Naturalness metrics as seen in the paper Conclusion: This technology holds a lot of promise for the future, like generating voices for people who have lost theirs due to degenerative diseases. It might also be used unscrupulously to clone the voices of authority figures and world leaders for the wrong reasons. The only way out of this would be to have proper techniques to detect whether a voice is natural or synthesized. 
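As a rough sketch of the synthesizer's intermediate representation, here is how a Mel spectrogram can be computed with the librosa library; this illustrates the concept only, not the authors' actual code, and the file path and parameter values are assumptions:

```python
import librosa
import numpy as np

# Load ~5 seconds of reference speech (the path and sample rate are assumptions)
y, sr = librosa.load("reference_voice.wav", sr=16000, duration=5.0)

# Mel spectrogram: a compact time-frequency representation of voice and intonation
mel = librosa.feature.melspectrogram(y=y, sr=sr, n_fft=1024, hop_length=256, n_mels=80)

# Convert power to decibels, the form usually shown in spectrogram plots
mel_db = librosa.power_to_db(mel, ref=np.max)

print(mel_db.shape)  # (n_mels, n_frames), e.g. (80, ~313) for 5 s at this hop length
```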
References: [1] Ye Jia et al., Transfer Learning from Speaker Verification to Multispeaker Text-to-Speech Synthesis (2018), NeurIPS 2018. [2] https://google.github.io/tacotron/publications/speaker_adaptation/ Disclaimer: I was not a part of the project and am merely providing commentary on the topic to the best of my understanding. The full credit behind the research work goes to the authors mentioned in the references, and I highly recommend checking out their paper.
https://medium.com/swlh/this-ai-can-clone-your-voice-just-by-listening-for-5-seconds-783885102a8d
['Jorge De Guzman']
2020-08-18 17:02:28.509000+00:00
['Artificial Intelligence', 'Machine Learning', 'Programming', 'Technology', 'Data Science']
283
Blockchain for Social Impact: UN Sustainable Development Goals 2030
Even though we as humans have come a long way from the old days in terms of well-being and bettering our living conditions, there is no denying that not everyone enjoys a similar standard, especially when it comes to sustaining it. The United Nations, which consists of 193 member states from all over the world, got together to create reforms to make the world better. Established in 2000, the Millennium Development Goals (MDGs) had 8 important points targeting international development, with the target set for 2015. With most of the objectives met, the UN decided to further advance these points at the 2015 summit. Now known as the Sustainable Development Goals (SDGs), targeting sustainability action at the local level, they include 17 points that all member states have agreed to meet by the year 2030. Keeping this in mind, certain mechanisms have to be adopted that will aid these goals. Technology will be key to fulfilling these objectives, and blockchain will be at the forefront of it. The SDGs were introduced after the results of the MDGs, which were quite successful:

– More than 1 billion people lifted out of extreme poverty
– Child mortality dropped by more than half
– The number of out-of-school children dropped by more than half
– HIV/AIDS infections dropped by 40%

These facts show that the continuous efforts of the UN's supporting partners are producing results, but the numbers also show that a huge chunk of the population is missing out. Even though some of the Goals mentioned above do not relate directly to human beings, each and every one of them affects us in some way. Hence, finding a solution matters to all of us. Currently, there are about 7 billion people in the world, out of which 2.3 billion live without an identity. That is almost a third of the total population (33%). This is a problem that no one has been able to solve, governments, NGOs and others included. With the concept of blockchain and digital identity, this issue might be fixed. Blockchain has made its way into mainstream businesses and is finding solutions in all industries, so it is no surprise that it can make a huge impact on the UN SDGs as well. It is forecast that by 2030, when the SDGs are set to conclude, 10% of global GDP will depend on blockchain. Some of the key features of the technology that are being heralded as main problem solvers are its transparency, enhanced security, and movement of value. Not only will it provide a proper database of information; shared, non-editable records will also make it more reliable. This value of trust has been debated a lot in past decades, especially when money is involved. With the inclusion of blockchain and its implications for finance, we can say that one more problem will be effectively handled. It is important to note that all the goals are interconnected, so finding the solution to one might be just the start that the responsible parties need. A chain of hierarchy has been formed, with the UN on top, national governments below, and then local and subnational governments following them. It is imperative to know that multinational and national businesses and organizations have also started aiding in implementing the UN SDGs in their respective areas. With every little bit of aid counting towards completing the goals, this is usually met with great appreciation from all stakeholders involved. 
Most campaigns associated with such parties target international aid, remittances, and healthcare. One thing that is common to all these campaigns is money, or the concept of donation. That is where the problem starts. Every year, 1 trillion US dollars goes missing, which is equal to 30% of global donations given in charity. With such a massive chunk of donations unable to reach its intended target, trust in organizations and charities has dwindled, causing donors to stop campaigning. Blockchain has found a way to completely eliminate this corruption. By converting all real-world entities into digital assets and enhancing transparency, every amount given in donation will be accounted for until it reaches its intended recipient. In other words, this is a solution to a 1-trillion-dollar-per-year problem. One example of the mentioned initiative is Global Charity Recycling. With more and more similar setups being built to aid the completion of the SDGs, we can definitely say that blockchain will play an important role from the present up until 2030. Some of the points in the Sustainable Development Goals are strongly linked with matters involving improper documentation of identity and inequality. Without identification, there will not be enough information on who needs the help. Without equality, there will not be proper distribution of assistance. The aid provided needs both these things, and the best way to achieve them is through automation and digitization. Blockchain will be imperative in improving both fields and will give all parties involved in maintaining the information a huge boost towards future operations. It is known that the objectives built into the SDGs came from real-world problems. Keeping this information in mind, the solutions to these problems require full commitment from everyone. It is our responsibility to ensure that we do our part, since these problems affect us as well. We have to go in with the mentality of "no one getting left behind" to target such issues as poverty and hunger. Even though we as individuals cannot do it alone, we can raise our voices to ensure that those who can do hear the message. A global message promoting the solution to problems such as poverty, inequality, healthcare, and more. Blockchain can aid in solving these objectives and more. That is what the UN wants. That is what the world needs. That is what we aim to provide.
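As a toy illustration of why an append-only, hash-linked record makes donations auditable, here is a minimal Python sketch; it is a simplification of the idea only, not how any real blockchain charity platform (including Global Charity Recycling) is implemented:

```python
import hashlib, json, time

def make_block(donation: dict, prev_hash: str) -> dict:
    """Link each donation record to the previous one via its hash."""
    block = {"donation": donation, "prev_hash": prev_hash, "ts": time.time()}
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

# A donation traced from donor to the field: each hop is a new linked record
chain = [make_block({"from": "donor-1", "to": "relief-fund", "usd": 100}, prev_hash="0" * 64)]
chain.append(make_block({"from": "relief-fund", "to": "field-office", "usd": 100}, chain[-1]["hash"]))

# Anyone can re-verify the chain: tampering with any record breaks every later link,
# which is what lets each dollar be accounted for end to end
for prev, block in zip(chain, chain[1:]):
    assert block["prev_hash"] == prev["hash"]
print("ledger verified:", len(chain), "records")
```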
https://medium.com/@reactivespace/blockchain-for-social-impact-un-sustainable-development-goals-2030-9f37da47bdcf
['Reactive Space']
2020-12-01 13:32:55.205000+00:00
['Climate Action', 'Sdgs', 'Blockchain Technology', 'United Nations', 'Gender Equality']
284
How I learnt to stop worrying and started to love computers … and programming.
Photo by Amelin Orenge on Unsplash When I was in primary school I loved playing with 'things' and finding out what made them tick, flash, move or otherwise do. As I got older this fascination shifted to computers and other electronic devices. Once I played with the first BBC Acorn computer and saw how the boxy mouse moved things on the screen and that you can PRINT things, you know, like a real-life book (not like a real-life book at all), I was hooked. Soon my dad had found a cheap ZX Spectrum and a Commodore 64. The Commodore accessed its games and programmes via a small tape player and the Spectrum came with a massive spiral-bound manual that was a bit like a catalogue for games. Except, you had to manually enter the game's logic line by line, with no errors. Created a bug? Start again. I enjoyed learning Q-BASIC for a while. Before long I'd hit my first bout of frustration, 'why won't it work!'. Then along came our first PC with Windows 3 and 3.1. Games came on small diskettes stuck to the front of a magazine, and the printer was now an inkjet, although the dot matrix was fun because it sounded like a robot. Before long I became the person who knew about computers. But there was always that one other boy who knew more about computers than I did. I could get them to work, but he knew how they worked … you know … inside. This was always my envy. Into secondary school and again, there's that other boy who knows more about computers. I discovered electronic systems and developed a knack for explaining technical things to people who had never experienced anything technical before. Then along came university and computers started to slip away. Instead, I found new depths to investigate in philosophy, politics, International Relations and other people. I enjoyed learning about all the intricacies and problems of the World and how so many people are trying hard to find solutions to help others at the same time as so many others who are trying hard to find solutions to help themselves. 'Eventually it seemed like a done deal …' When I was growing up I had investigated web development when GeoCities was a thing, TuCows was a search engine and you could AskJeeves or Encarta for answers for your homework. I'd learned how to build and deploy websites by wondering how other people had done their 'things', by reverse engineering, making mistakes and by trying different ideas. My parents had encouraged me for a while, then they thought that looking at a computer screen wasn't good for me, so they would try to put me off unless I could seemingly make a 'business case'. I happily created some websites for people and helped others with their tech problems. Eventually it seemed like a done deal that I would just do something with computers as a career, and people started to define me by them. Stubbornly, I was put off. I didn't like how people thought they knew what I wanted to do and tried to push me down certain routes that didn't feel comfortable. By the time I had reached university I was prepared to do something different. Time went on, the Internet and the Web developed without me. And I was fine with it. A few years would go by and many life events would happen, some good, some bad, until I found myself looking back at a computer screen typing the familiar characters: <html><title></title><body></body></html>. After looking around the web, I realised so much had changed. So many new technologies, so many new ideas and so many new concepts that I had no clue about. I did well in sciences at school. 
I had never excelled at maths, unlike some cousins, and I had never taken computer science seriously enough to discover it. Now, I wish I had. Mathematics is the door to seeing what's around us in ways that can unpick the fabric of whatever it is that everything is made of, and it holds endless wonder. It would also help me develop computational thinking and put pragmatism behind the thoughts, concepts and functions that I want to build. 'I had been worrying for a long time' I had been worrying for a long time: 'why do they want me to do this?', 'would I get a good enough job in computers?', 'how could I be what businesses expect me to be?', 'will I ever know as much as the guy who knows more about computers than me?'. Instead of believing in myself, in my experiences and in the directions that I knew I could go in, I listened to the doubt. This doubt would overcome me and push me into places that I didn't want to be. One day I gave myself some space and gave myself permission to be good at something, and I convinced myself that I can do things if I try; sometimes, I just need to try a bit more than I did last time, or more than other people, for various reasons. But I just need to try. If I don't, I won't, and that isn't good enough. As soon as I started admitting to others that I am good at computers and that I do know about these kinds of things, the guilt started to disappear and I started to become myself again. Now, I may still not know as much as some other people over there who are spinning up full-featured React Redux single-page apps, but I know enough. I know enough to learn, to help others and to continue my journey of improving myself, and perhaps, one day, I'll build that application or system that will improve someone's situation. I may not work at Google, Facebook, Intel, IBM, Cisco or Microsoft. But I enjoy doing what I do, pointing others in the right direction with their IT problems, creating websites for people and learning new technologies. Now, I try to remember. Stop worrying; be good to yourself and others; believe in what you know you can do; try to reach; ignore other people's doubts, for they doubt themselves too much to be happy for you. There is time and there is space, even if there isn't much money.
https://medium.com/@garethwhitley/how-i-learnt-to-stop-worrying-and-started-to-love-computers-and-programming-d9e1b6cc3cd4
['Gareth Whitley']
2020-12-26 11:04:56.795000+00:00
['Web Development', 'Information Technology', 'Programming']
285
Open Sourcing Mental Illness—A Survey Analysis
“We have to choose to be stronger than fear.” — Ed Finkler Open Sourcing Mental Illness (OSMI) began almost a decade ago, when Ed Finkler, a web developer and open source advocate, started speaking at tech conferences about his personal mental health journey. The response to his open and honest discussion of the topic was overwhelming, and thus OSMI was born. In his talk, he presents the audience with mental health statistics (from 2016) reported by the WHO. Mental health conditions are the second leading cause of workplace absenteeism. Looking at depression alone, affected workers lose about 5.6 hours per week to mental health issues. Workers with no mental health problems lose about 1.5 hours, which is a significant difference in terms of hours lost. Translating this into dollar figures is a somewhat fuzzy exercise, but it is estimated to amount to 43.7 billion dollars, corresponding to 200 million days lost due to people not being able to work effectively, or not being able to be in the office (in pre-Covid times). It is clear that the consequences that come with mental health disorders are not only personal, but also economic. The OSMI dataset As stated on their website, 'Open Sourcing Mental Illness (OSMI) is a non-profit corporation dedicated to raising awareness, educating, and providing resources to support mental wellness in the tech and open source communities.' Every year they run a survey, aimed at measuring attitudes towards mental health and analysing the frequency of mental health disorders among tech workers. Some of the older datasets contain over a thousand contributors; however, I chose the 2019 survey with just over 350 responses, mainly because I was interested in understanding the current status of mental health conditions in the tech workplace. In this blog I present a simple app that allows the user to interrogate the survey data from 2019. The app displays questions that I found interesting; however, the full survey is even more comprehensive. The dataset was downloaded from Kaggle, and the raw data can also be found on the OSMI website. The app was created using Plotly Dash and deployed with Heroku. The code can be found on my GitHub. The app can be accessed at the following link, for anyone interested in looking at the survey results: https://tech-mental-health-app.herokuapp.com OSMI Survey Results I divided the questions into 5 different categories, just to make the interrogation and interpretation of the data easier. The categories are:

- Demographics: the country, gender, race, etc. of respondents
- Mental health in the workplace: focusing on employer benefits and openness when it comes to discussing mental health with coworkers and employers
- Employee mental health: the most common mental health issues, diagnosis, and the effect of mental health issues on productivity
- Reactions: the occurrence of supportive and badly handled responses to mental health conditions, and expectations in terms of discussing mental health at job interviews
- Long responses: free text entered by the respondents relating to some of the above questions.

The survey and OSMI were born in the USA, so it is not surprising that close to 70% of the respondents also work there. The next 3 largest groups participating in the survey are from the United Kingdom (10%), Portugal (6%) and Brazil (close to 6%). The majority of the respondents (about 60%) belong to the 25 to 40 year old age bracket. In terms of demographics, about 64% of the respondents are male, and close to 30% are female. 
The Women in Tech Statistics For 2020 concluded that the largest tech companies on the planet (Amazon, Apple, Facebook, Google and Microsoft) have a similar female representation of 34.4%, in agreement with the OSMI survey demographics. When it comes to employee mental health, 42% report to have a mental health issue, and an additional 21% say they possibly have one. That adds up to more than 60%, and still, this is just a lower limit given that some report they “don’t know” whether or not they have a mental health disorder. The most commonly reported diagnoses are: Mood Disorder (Depression, Bipolar Disorder), Anxiety Disorder (Generalised, Social, Phobia, etc), Attention Deficit Hyperactivity Disorder (ADHD), and the combination of the above.
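For anyone wanting to reproduce numbers like these from the raw data, here is a minimal pandas sketch; the CSV filename and column name are hypothetical, since the real survey export uses the full question texts as headers:

```python
import pandas as pd

# Load the 2019 OSMI survey export (the filename is an assumption)
df = pd.read_csv("osmi_survey_2019.csv")

# Share of respondents per answer to one question (the column name is an
# assumption; the actual headers are the full question texts from the survey)
col = "Do you currently have a mental health disorder?"
shares = df[col].value_counts(normalize=True).mul(100).round(1)
print(shares)  # e.g. Yes ~42%, Possibly ~21%, matching the figures quoted above
```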
https://medium.com/@katinka-gereb/open-sourcing-mental-illness-a-survey-analysis-93341c27035d
['Katinka Gereb']
2020-11-21 22:22:44.238000+00:00
['Surveys', 'Heroku', 'Data Visualization', 'Technology', 'Mental Health']
286
Beijing Witnessed the First Edition of the Global High-Throughput Public Blockchain Technology Summit on August 18th
Beijing Witnessed the First Edition of the Global High-Throughput Public Blockchain Technology Summit on August 18th InterValue Follow Aug 22, 2018 · 6 min read Hosted by the InterValue Foundation, the summit was successfully held in Beijing on August 18th. Yuan Yong, Director of the Blockchain Committee of the China Automation Society; Li Daxue, founder of Magnetic Cloud Technology; Barton Chao, founder of InterValue; Wang Binsheng, Distinguished Professor of the Graduate School of the Chinese Academy of Social Sciences; Zhang Hongxin, Director of the Institute of Digital Assets and Blockchain of Zhejiang University; Jiang Xinwen, Professor at the School of Computer Science of the National University of Defense Technology; Li Wei, founder of BodyOne; Wu Pengsong, co-founder of 2FChain; Li Bin, founder of Rootscap Capital; Li Gaoqiang, co-founder of Ben Core Capital; and Peng Fei, product director of HashStor, all shared valuable insights with the audience. Professor Yuan Yong addressed three main points on the topic of blockchain and artificial intelligence at the summit forum:

• What results can be produced by combining artificial intelligence with blockchain? The rise of distributed artificial intelligence
• How artificial intelligence is structured on the blockchain: parallel architecture
• How artificial intelligence and blockchain change the existing world: non-disruptive, natural evolution

The most important characteristic of blockchain technology is its "decentralized autonomy". Its P2P network, distributed consensus mechanism and contribution-based economic incentives naturally make it a distributed autonomous social system. Each node in the blockchain system acts as an autonomous agent in the distributed system. Blockchain provides a reliable, usable, and efficient data foundation for artificial intelligence. Building a house on sand doesn't require a strong foundation, but when it comes to building a high-rise, you had better build it on a reinforced concrete foundation. The same goes for blockchain as a foundation for artificial intelligence systems. Wang Binsheng, Professor at the Graduate School of the Chinese Academy of Social Sciences, was received with warm enthusiasm by the audience. He pointed out that the economic essence of blockchain popularization lies in the transfer of assets. Traditional economic behaviors, including currency issuance, will be digitalized. Traditional organizations with central authorities and intermediaries will gradually disappear. Tokens will be the value distribution tools in the era of collaboration through the combination of artificial intelligence and the Internet of Things. Besides, fiat currencies will gradually disappear from circulation in the future. The popularization of blockchain technology solves the problem of centralization and wealth concentration that has existed since the beginning of the human industrial era. In the era of the multi-token economy, the value of life will be redefined. So will the concept of wealth, as part of a natural process. Barton Chao, the founder of InterValue, officially announced the results of InterValue's TestNet 2.0 during the summit. 
· Pure performance test: 10 shards ≥ 2,400,000 TPS
· Actual transaction performance test: 10 shards ≥ 200,000 TPS
· Different geographical distribution test: 10 shards ≥ 420,000 TPS
· Performance test after signature verification: 10 shards ≥ 100,000 TPS
· Performance test with signature verification and erroneous data: erroneous data has no impact on transactions, and system fault tolerance meets the expected results.

Barton Chao also gave his interpretation of the current development trends in the blockchain industry: 1) The bubble will gradually pop. 2) Projects with tangible achievements and hard work will succeed. 3) Progress is quick, which may accelerate the end of the current crisis. 4) Real blockchain applications are expected to appear within one year. 5) Company and community will gradually become indistinguishable. BodyOne's CEO Li Wei told the summit that "we should see more from the application point of view." His company launched the first fitness mining machine. How can it be combined with other blockchain use cases? What is the incentive for fitness practitioners to participate in the token economy? He explored these questions in depth and offered two fitness bike mining machines as lottery prizes. HashStor, the most cost-effective professional IPFS mining machine on the market: Peng Fei, Product Director of HashStor, introduced HashStor in detail under the theme of "Permanent Storage, Forever". He brought five units of the "H1 Family Economic Version" mining machine, which were donated to five lucky guests on the spot. At the same time, HashStor's pre-sales event officially kicked off, and these five guests will become the first users to experience the product. HashStor will also serve as a premium blockchain solution provider, leveraging blockchain technology to drive its own growth and provide quality service to its customers. InterValue has signed strategic agreements with 15 partners including ICC, BodyOne, HashStor, DGames, Magnetic Cloud Technology, Origin Capital, Pentax Capital, Xidian, Beijing Institute of Technology, 2FChain, Coin, New Start, CSDN, Coin, and Coin World. These strong alliances aim to jointly build a new ecosystem in the blockchain industry! Barton Chao, founder and CEO of InterValue, gave an award to the partners who signed a strategic partnership agreement. Ran Lizhi, the founder of the investment fund Rootscap Capital, made a detailed analysis of the market over the past year and pointed out that there is also a positive side:

• Token proponents and blockchain proponents, who have long despised each other, will eventually reach consensus
• The market may be flooded with worthless tokens, but blockchain applications are really starting to appear
• Pure technology concepts: a case of the economic restructuring of production relations
• Currency change, chain change, ticket change, and empowerment of the real economy

Three major phenomena are not impacted by the bear market: 1) Changes in the financial environment 2) Development and breakthroughs of public blockchains 3) A generalized impact on the economy. Bruce Li, the co-founder of Benrui Capital, made a detailed analysis of the status quo and prospects of blockchain technology. He talked about the nature of the blockchain as a tool for large-scale collaboration in a fully open environment under P2P network conditions. Blockchain is a type of technology that enables everyone in the world to send transactions and have them validated, so that consensus can be reached in the shortest possible time. 
The blockchain infrastructure provides the most efficient type of transmission environment and storage space for a variety of ecosystems. This forum provided a high-level communication platform for the blockchain industry and offered excellent opportunities for investment institutions, blockchain application companies, technology communities, and media to get together and participate in the advent of the new blockchain 4.0 era. Please visit the website for more details: http://818.inve.one
https://medium.com/intervalue/beijing-witnessed-the-first-edition-of-the-global-high-throughput-public-blockchain-technology-4646605bd80c
[]
2018-08-30 03:04:35.483000+00:00
['Technology', 'Blockchain', 'Inve', 'Intervalue', 'Bitcoin']
287
How Does COVID Tracing work?
A couple of weeks ago Canada released the COVID Alert app to the general public. The app is intended to help tracing efforts in the province to curb the spread of COVID-19. Understandably, there are some questions regarding its privacy and security implications. Although I focus here on the Canadian version of the app, the underlying architecture is used by a lot of other public health authorities all over the world. You can find the full list of countries here. The first important thing about COVID Alert is that although the app itself is being branded and released by Health Canada (and other public health authorities), much of the core technology is provided by Apple and Google for iOS and Android respectively. Earlier this year, Apple and Google, in a joint effort, released a framework called the Exposure Notification System (ENS) designed specifically for building COVID tracing apps. The framework provides a set of tools for developers on both platforms and is intended to be used by public health authorities all over the world. It has been created with the explicit goal of preserving user privacy by design in an effort to encourage broader adoption. More on that here. As of now, the system is 100% opt-in. You don't have to download the apps if you don't feel comfortable. Nor would you be forced to opt in at a later stage. However, the success of the program does depend on having a certain percentage of the population signed up. The mechanism itself is fairly straightforward and uses a decentralized approach. Let's say we are living in an ideal scenario where a majority of people have the app on their phone. Once you install it, it runs in the background and sends out randomized "codes" every 10–15 minutes via Bluetooth. These codes are essentially just gibberish and not tied to you in any way. Over the course of the day, you would generate dozens of these codes. For simplicity, let's say that your code never changes and is fixed to something ridiculously simple like 12345 . Phones that are near you, within a distance of 10m, and also have the app installed, are listening for these codes. If you're hanging out with a friend, or at a busy supermarket, your phone will be exchanging codes with those around you. Your phone will store the codes that it "came in contact with". And likewise, all the phones around you will store your code. At this point, all data is decentralized, meaning that the information is stored purely on phones and not on a server somewhere. Now, hypothetically, one of the people you ran into at the supermarket later tests positive. The person can then tell the app that they've been infected. In most cases, they would be required to submit some sort of medical proof or confirmation from a public health authority. In Germany, for example, you would scan a QR code issued by the health authority confirming your diagnosis. This is to prevent people from falsely claiming that they've been infected to intentionally create false alarms. The app would then upload all of the codes issued to that person over the last 14 days to a web server. Or in our simplistic model, the single code that they were assigned, say 56789 . The server is only storing a master list of these "infected" codes. As the final step, your app downloads a list of these infected codes from the server a few times a day and compares them against all the codes of other people that you have run into.
If it finds a match, it means that you've been in contact with an infected person, you're at a higher risk, and you're notified immediately. This would then likely translate into a higher testing priority for you. And likewise, if you test positive, the people who came into contact with you are notified, and so on. The "privacy-protecting" approach comes from the fact that the system does not theoretically need access to your location or GPS at any point. If your location is needed for any reason, you will be prompted for permission. Nor does it need access to any sort of personal information. The only thing the system really needs is a Bluetooth-based device. Additionally, your codes never leave your phone unless you get infected, meaning that nobody gets access to your data, including the health authorities. Whether or not this technology is too far-reaching, or not extensive enough, is a whole other debate. But at the very least, I hope that having a better understanding of the underlying tech can help folks make an informed decision about whether or not they'd like to opt in.
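For the technically curious, the matching step at the heart of this design boils down to a set intersection performed on your own device. Here is a minimal Python sketch using the simplified codes from above; it is purely illustrative, since the real ENS protocol derives rotating identifiers from cryptographic keys rather than comparing plain strings.

# Minimal sketch of the on-device matching step in an exposure
# notification system. Illustrative only: the real ENS protocol uses
# rotating cryptographic identifiers, not plain strings.

def check_exposure(seen_codes, infected_codes):
    """Return the codes we observed that appear on the infected list."""
    return set(seen_codes) & set(infected_codes)

# Codes our phone collected over Bluetooth during the day
seen_codes = ["12345", "56789", "24680"]

# Master list of "infected" codes downloaded from the server
infected_codes = ["56789", "99999"]

matches = check_exposure(seen_codes, infected_codes)
if matches:
    print(f"Possible exposure detected via codes: {matches}")

Note that only the master list ever travels over the network; your own contact log never leaves the device.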
https://medium.com/bits-n-pieces/how-does-covid-tracing-work-121ff9f07c8c
['Sanchit Gera']
2020-09-18 15:13:29.668000+00:00
['Technology', 'Software Development', 'Covid 19', 'Health', 'Programming']
288
Explore Your Activity on Youtube With R: How to Analyze and Visualize Your Personal Data History
Explore Your Activity on Youtube With R: How to Analyze and Visualize Your Personal Data History Find out how you consume Youtube using a copy of your personal data Analysis and visualization of your data on YouTube — Plot based on YouTube categories Let’s face it, dear reader, one of the websites that we visit frequently and became part of our life, whether for entertainment or looking for answers, and where we could spend hours, is YouTube. As with anything else we spend so much time on, I find it curious to see exactly how our consumption habits have changed. With Google Takeout and the YouTube API, analyzing your personal YouTube history is easy. How can I get a copy of my data? Your YouTube history is available through Google Takeout, a tool that Google provides to consult practically the history and stored data of any of the products that you have used. You have to enter the URL https://takeout.google.com/settings/takeout and log in with your personal account. Screenshot: Google Takeout website There you will find the possibility to request a complete copy, or by-product, or even select only some characteristics of a product. We are only interested in YouTube for now. Even if you have multiple accounts linked, you can switch between them by clicking on the profile picture in the upper right. If you have uploaded videos, I recommend that you make sure to uncheck the videos option of the file that it will generate, this will decrease its weight and delivery time. Screenshot: Generating a copy of YouTube account data Once the file is generated, you will be receiving a notification email, from where you can proceed to download it. For security, the file has an expiration date, so you have to do it before the date indicated in the email. Screenshot: Data copy download notification email from Google Takeout When you download the file you will get a .zip that when unzipped will create a structure of folders and files, depending on what you initially requested from Google Takeout. Screenshot: Folder structure of the unzipped .zip Currently, the generated file corresponding to your search and watching history on YouTube is only exported as HTML. You can use the “rvest” package to parse the generated HTML. Yes, I know, don’t judge me, “el canaca” and “dios eolo” will always be classics (viral memes in Mexico). Screenshot: YouTube Search History Working the exported data Now you can proceed to create an R script. First, you have to include all the necessary packages and set the search history reading, contained in the “search-history.html” file located inside the “history” folder. # REQUIRED PACKAGES library(stringr) library(rvest) library(tidyverse) library(jsonlite) library(tidytext) library(lubridate) library(wordcloud) library(httr) library(ggplot2) library(wordcloud2) library(RCurl) library(curl) library(pbapply) library(ggthemes) # READ SEARCH HISTORY youtubeSearchHistory <- read_html("Takeout/YouTube and YouTube Music/history/search-history.html") To parse text in HTML using “rvest” you can specify CSS classes that correspond to a particular section. For example, the line .header-cell + .content-cell> a finds the hyperlink in the content, corresponding to the searches you made. # SCRAPING SEARCH HISTORY youtubeSearch <- youtubeSearchHistory %>% html_nodes(".header-cell + .content-cell > a") %>% html_text() Screenshot: YouTube Search History HTML Element Inspector And of course, you can get the information you need, for example, the timestamp in the same way. 
# SCRAPING TIMESTAMP youtubeSearchContent <- youtubeSearchHistory %>% html_nodes(".header-cell + .content-cell") youtubeSearchTimeStr <- str_match(youtubeSearchContent, "<br>(.*?)</div>")[,2] youtubeSearchTime <- mdy_hms(youtubeSearchTimeStr) Now that you have the search and timestamp data, you can create a Data Frame with them. # CREATING DATA FRAME SEARCH + TIMESTAMP youtubeSearchDataFrame <- data.frame(search = youtubeSearch, time = youtubeSearchTime, stringsAsFactors = FALSE) Watching history You will find the watching history in another HTML file, named "watch-history.html", also contained within the "history" folder. First, you must read the file and obtain each entry by web scraping, in the same way as before. You will have to extract the information with regular expressions. # READ WATCH HISTORY watchHistory <- read_html("Takeout/YouTube and YouTube Music/history/watch-history.html") watchedVideoContent <- watchHistory %>% html_nodes(".header-cell + .content-cell") # POSSIBLE TIME CHARACTERS watchVideoTimes <- str_match(watchedVideoContent, "<br>([A-Z].*)</div>")[,2] # POSSIBLE ID VALUES watchedVideoIDs <- str_match(watchedVideoContent, "watch\\?v=([a-zA-Z0-9-_]*)")[,2] # VIDEO TITLE watchedVideoTitles <- str_match(watchedVideoContent, "watch\\?v=[a-zA-Z0-9-_]*\">(.*?)</a>")[,2] Now you can put everything back together in a Data Frame. # DATA FRAME WATCH HISTORY watchedVideosDataFrame <- data.frame(id = watchedVideoIDs, scrapedTitle = watchedVideoTitles, scrapedTime = watchVideoTimes, stringsAsFactors = FALSE) watchedVideosDataFrame$time <- mdy_hms(watchedVideosDataFrame$scrapedTime) Get more data about videos with the YouTube API I'm almost sure that up to this point you'll say… Well Saúl, it's not a big deal, what else! This gets more interesting when you integrate with the YouTube API to get more data. Instead of just seeing basic data like titles and timestamps, you can get more information. In this way, you can see the popularity, descriptions, categories, and more about your video views. If this is your first time obtaining credentials to use the YouTube API, follow the simple steps in the official YouTube documentation: https://developers.google.com/youtube/v3/getting-started. Once you establish a Google project, you will need to generate an API Key and enable YouTube Data API v3 to create your credentials in the sidebar menu. Screenshot: Google API Console, setting the YouTube API Screenshot: Google API Console, generating a new API Key Once this is done, you can use your API Key by assigning it to a new variable. You must also assign the connection URL for the YouTube API to another variable. # ESTABLISH API KEY AND CONNECTION youtubeAPIKey <- "HERE_YOUR_API_KEY" connectionURL <- 'https://www.googleapis.com/youtube/v3/videos' You can run a test; for example, I will take a video from my YouTube channel with ID "SG2pDkdu5kE" to give you an overview. # TRYING QUERY RESPONSE videoID <- "SG2pDkdu5kE" queryParams <- list() queryResponse <- GET(connectionURL, query = list( key = youtubeAPIKey, id = videoID, fields = "items(id,snippet(channelId,title,categoryId))", part = "snippet" )) parsedData <- content(queryResponse, "parsed") str(parsedData) You will find that the two important parameters are "fields" and "part". You must be careful with your queries because you could exceed the request quota, or the response to the request may become very slow.
You can find more information about these parameters in the official documentation: https://developers.google.com/youtube/v3/docs/videos Screenshot: Testing Query Response Get Video Category: Preparing Requests To get more metadata like the video category, you can use the video IDs you got from your file and make requests to YouTube for more information about each video. You are about to make several thousand requests to the YouTube API, so you have to do some additional preparation of the requests. The most popular library for making web requests is "httr" (it only supports one request at a time). There are also "RCurl" and "curl". To make sure you get the data as quickly as possible, try all three; speed may differ based on the queries involved. # REQUEST OPTIONS testConnection <- "https://www.google.com/" testCount <- 100 # HTTR TEST system.time(for(i in 1:testCount){ result <- GET(testConnection) }) # RCURL TEST uris = rep(testConnection, testCount) system.time(txt <- getURIAsynchronous(uris)) # CURL TEST pool <- new_pool() for(i in 1:testCount){curl_fetch_multi(testConnection)} system.time(out <- multi_run(pool = pool)) At least in my case, the speed was much faster with "curl" than with the other two options. Get Video Category: Formatting Requests You will need to create a connection string for each request and eliminate duplicates to reduce the number of requests made. You will also need a function to parse the request-response data. # CREATE REQUEST AND REMOVE DUPLICATES createRequest <- function(id){ paste0(connectionURL, "?key=",youtubeAPIKey, "&id=",id, "&fields=","items(id,snippet(channelId,title,description,categoryId))", "&part=","snippet") } uniqueWatchedVideoIDs <- unique(watchedVideosDataFrame$id) requests <- pblapply(uniqueWatchedVideoIDs, createRequest) # PARSE OUT RESPONSE getMetadataDataFrame <- function(response){ rawchar <- rawToChar(response$content) parsedData <- fromJSON(rawchar) data.frame <- cbind(id = parsedData$items$id, parsedData$items$snippet) return(data.frame) } You can configure what to do in case the request succeeds or fails. videoMetadataDataFrame <- data.frame(id = c(), channelId = c(), title = c(), description = c(), categoryId = c() ) # SUCCESS addToMetadataDataFrame <- function(response){ .GlobalEnv$videoMetadataDataFrame <- rbind(.GlobalEnv$videoMetadataDataFrame,getMetadataDataFrame(response)) } # FAIL failFunction <- function(request){ print("fail") } A slower but more reliable method of obtaining your data is to fetch each response from memory and configure the multiple requests accordingly. # GRAB REQUEST RESPONSE FROM MEMORY fetchMetadataFromMemory <- function(request){ return(getMetadataDataFrame(curl_fetch_memory(request))) } system.time(out <- multi_run(pool = pool)) saveRDS(videoMetadataDataFrame, file = "videoMetadataDataframeAsync1.rds") length(requests) nrow(videoMetadataDataFrame) listMetadata <- pblapply(requests, fetchMetadataFromMemory) Once this is done, depending on the speed of your internet connection and the size of your data frame, you can probably prepare a coffee or a tea, or open a beer, while the progress indicator climbs to 100%. Screenshot: listMetadata console preview You will also use the "bind_rows" function to combine the list into an ordered Data Frame. # COMBINE LIST INTO A DATA FRAME videoMetadataDataFrame <- bind_rows(listMetadata) saveRDS(videoMetadataDataFrame, file = "videoMetadataDataFrame_memory.rds") Get Video Category: Formatting Categories Each category has a unique ID.
You need to make another request to get them and add the data as a new column. # CATEGORY ID REQUEST categoryListURL <- "https://www.googleapis.com/youtube/v3/videoCategories" categoryResponse <- GET(url = categoryListURL, query = list( key = youtubeAPIKey, regionCode = "us", part = "snippet" )) parsedCategoryResponse <- content(categoryResponse, "parsed") categoryDataFrame <- data.frame(categoryId=c(), category=c()) for(item in parsedCategoryResponse$items){ categoryDataFrame <<- rbind(categoryDataFrame, data.frame(categoryId = item$id, category=item$snippet$title)) } categoryDataFrame videoMetadata <- merge(x = videoMetadataDataFrame, y = categoryDataFrame, by = "categoryId") head(videoMetadata) You can combine your new Data Frame with your watch history to get the video metadata along with when it was played. # COMBINE WITH WATCH HISTORY watchedVideos <- merge(watchedVideosDataFrame , videoMetadata, by="id") str(watchedVideos) Have your tastes in what you watch on YouTube changed over time? With your search history, watch history, and category metadata, you can now answer this question. You can see which video categories you play the most and how this has changed over time. # VISUALIZE VIDEO CATEGORIES WATCHED watchedVideos %>% group_by(category) %>% summarise(count = n()) %>% arrange(desc(count)) watchedVideos %>% ggplot(aes(x = time, fill = category)) + labs(x= "Year", y= "Count") + ggtitle("How much have your genre tastes changed over time?", "Most played categories")+ geom_area(stat = "bin") + theme_economist_white() For example, in my case, something many of you surely enjoy is very present: videos in the Music category. And of course, the presence of the Film & Animation category makes sense: I follow many channels to find out about upcoming movies, trailers, and reviews.
https://towardsdatascience.com/explore-your-activity-on-youtube-with-r-how-to-analyze-and-visualize-your-personal-data-history-b171aca632bc
['Saúl Buentello']
2020-09-14 21:25:37.964000+00:00
['Data Analysis', 'Programming', 'Data Visualization', 'Data Science', 'Technology']
289
Cities in the Information Age
The Neighborhood City // 2 Cities are adopting technology to improve public services, but where do we go from here? "I would expect that next year, people will share twice as much information as they share this year, and next year, they will be sharing twice as much as they did the year before." — Mark Zuckerberg, founder and CEO of Facebook, speaking at a web summit in 2008 Whether or not you subscribe to Mark Zuckerberg's predictions about an exponential increase in the amount of information shared, there is no denying that our daily lives are more tied to digital information technologies than ever before. In the real estate industry, more than ninety percent of all homebuyers, renters, and even commercial tenants begin their search for new space using web-based resources. The way we share information about real estate has a major impact on the city planning and development process. Local market research organizes data to understand the supply and demand for products, goods, and services in a local area. This research is often part of a real estate market analysis, and it ties all of the different groups involved to a common need: a need to know 'where'. Location-Based Information The proliferation of information about place-based elements, such as individual property details, infrastructure, land use, culture, government policies, commercial activities, and socio-economic aspects of an area is leading to better data-driven decision making about specific locations. A foundational breakthrough in location-based information came in the latter half of the 20th century, with the advent of geospatial data management and map-making technology called a Geographic Information System (GIS). During the 1990s and 2000s, municipalities of all sizes, as well as other public and private organizations, adopted commercial software called ArcView. ArcView is developed by the technology company ESRI. Local authorities still rely heavily on the desktop platform's services as a tool to manage location-based information about real estate, city services, and infrastructure. In addition, new companies such as CARTO, MapBox, MapD, and others, are emerging and improving how mapmakers, consultants, data scientists and government staff approach location-based information. These new companies strive to create more accessible and open source tools for analyzing geospatial data. "CARTO and Mapbox are the new GIS tech stack. We're both betting on open data, open source code, and a race to the top for flexibility and functionality." — Javier de la Torre, Founding CEO of CARTO The ways city builders and city operators analyze and share information about places in cities are improving as a result of new tools like CARTO and Mapbox. However, these are still inherently technical tools that not all people who build and operate cities are able to make use of. In addition to adoption issues, professionals also need access to accurate data in order for these mapping tools to be useful, and many cities struggle to provide access to high-quality data as a public service. Open Data These struggles have not gone unnoticed. There are technology companies focused on improving access to government data, as well as other datasets. Such companies focus on providing "Open Data" portals, or websites designed to make data from government-controlled sources, among others, more accessible and publicly available.
A 2013 report on Open Data by the McKinsey Global Institute states, “Open data — public information and shared data from private sources — can help create $3 trillion a year of value in seven areas of the global economy.” City governments can maintain thousands of datasets, depending on the size of the city. New York City, for example, maintains over 1,600 datasets on the city’s portal as of December 2016. When government datasets are made publicly available online, software engineers and data scientists are able to create useful tools and informative visual aids with the raw data. The interactions that users have with tools built using open data creates new data as a result of user activities. Many cities now realize the added value of adopting open data policies and maintaining open data portals. As society becomes more reliant on digital interactions, automated services, and cloud-based information technology, cities will continue to create huge amounts of data through the daily operations for government services. Whether they decide to make them public or not may depend on political agendas. Either way, the proliferation of publicly accessible government data sources and better tools to analyze and share the data adds to better information available for the city building process. Smart Cities Driving the creation of more government-controlled data at the local level is the Smart Cities movement and the use of the Internet of Things (IoT) technologies that support this movement. Physical sensors collecting data on everything from pedestrian and vehicle traffic movements to utility usage rates are influencing how we track the economic and environmental performance of places in urban areas. This insight enables cities to adjust operations accordingly, adding or removing capacity in specific areas. The goal of a “smart city” strategy is to optimize how the city performs. In other words, the goal is to improve our abilities to monitor and control urban systems in order to ultimately provide the best quality of life for citizens in the most efficient and sustainable way possible. Today, these trends in Open Data and connected Smart Cities devices are perhaps the most relatable trends in the city building process to “Zuckerberg’s Law of Information Sharing”. It isn’t necessarily humans sharing more information each year in this case, but rather all the devices and public services humans are interacting with during their urban lives that are creating more information year over year. And going forward, as cities continue to adopt technology and share more data publicly, this trend is only expected to accelerate. Technology is enabling cities and local communities to collect new data and information about specific locations. The way we interact with the information and share it is improving. As these trends progress, cities will continue to improve access to data about real estate, public spaces, mobility, and other civic assets and services. Society’s adoption of new technologies like GIS, Open Data portals, and IoT infrastructure supports a better, more open information exchange ecosystem. And it’s at the neighborhood community scale that this new emerging information ecosystem is adding significant value to how people live in cities today.
https://medium.com/citiesense/the-neighborhood-city-2-3d7307e6ca6e
['Star Childs']
2020-10-01 22:38:23.831000+00:00
['Smart Cities', 'Information Technology', 'IoT', 'Open Data']
290
Dr. Dabber Boost EVO
Dr. Dabber Boost EVO Image provided by author. I admit it… I'm a tech geek. If there's a shiny new device out there, I want to try it — especially in the cannabis space. In an emerging industry, everyone is racing to build a better mousetrap, which means that the market is being inundated with new consumption technology. When I worked in mobile technology, I would joke that I enjoyed the industry because it worked well with my lack of focus. The technology was evolving and adapting quickly; new tech became outdated just months after it was released. I never had the opportunity to get bored because the technology changed so fast; my job was never the same for more than a few months at a time. The same can be said for the cannabis industry. Between technology and science, research and development, compliance, and legislation — nothing stays the same for long in cannabis. There are plenty of tech manufacturers out there that are jockeying for position in a highly competitive atmosphere. Dr. Dabber is one such company that has emerged in the last several years to improve the cannabis consumption experience. Recently, I had the opportunity to try their newest portable e-rig — The Boost EVO. Super Simple Design One of the most significant benefits of the new Boost EVO is its simplicity. It's composed of only four pieces (the base, the atomizer, an adapter ring, and the glass piece), so there's not much that can go wrong. The base is sturdy, heavy-duty, and just the right shape and size to comfortably fit in any hand, male or female. With only one control button, the functionality is incredibly simple but more robust than you'd think. Plus, the ambient lights provide a fun, social conversation starter. With five different settings, you can set the right ambiance for every occasion. The quartz atomizer design is impressive, too. A simple magnetic connector ensures the atomizer drops into place without a fuss. The bucket is large enough to hold up to a half gram or more of concentrates; however, unless you're in a social setting, I wouldn't recommend using more than you intend to consume in a single session, as the heating process will kill any flavors. The glass piece connects to the rig with a surprisingly strong, quick-connect magnetic adapter. This simple ring provides a silicone seal for the glass piece to sit in and the airflow control button. The button allows the consumer to completely block airflow for bigger vapor clouds or allow more airflow for cooler inhalation. Pressing the button allows full airflow to clear out the chamber. Image provided by author. Heavy-Hitting Little Dabber Size is deceptive, because despite its small form factor, the Boost EVO packs a potent punch. The six different temperature settings start at a low of 500°F and cycle up to 750°F in 50° increments. At 500 degrees, even the lowest setting provides a pretty significant dab. Additionally, the device holds the desired temperature for 20 seconds in a typical session, which allows for 2–3 puffs from a single session — perfect for sharing with your spouse. If you're in a social setting, you can change the length of time the Boost EVO holds the temperature. Switched to 'Party Mode,' the e-rig will hold the programmed temperature for 40 seconds, allowing plenty of time to pass it between friends. (Naturally — you'll want to use a little extra oil if you're sharing it.) Finally, to hit and hold these high temperatures, the battery must be powerful.
Not only that, but this battery is estimated to work for up to 60 sessions on a single charge. The Superman of batteries! Cleaning is a Breeze! Maybe one of the biggest headaches of electronic cannabis devices is cleaning and maintenance. Coils burn out, get gunked-up, and need regular attention. With the Boost EVO, I was impressed with how easy the unit was to clean. Because the quartz atomizer is so simple, it only takes a cotton swab after each use to keep it dab-ready. However, Dr. Dabber suggests replacing the atomizer every 4–12 months, so plan on an additional $50 part replacement regularly. The glass piece cleans like any other glass piece — a little isopropyl alcohol, and it’s as good as new. Wait — I didn’t mention the case! The Boost EVO comes packaged in a unique carrying case made of high-grade, automotive Styrofoam that is virtually indestructible. So, if you’re a traveling dabber and need a device that travels well, then I recommend taking a look at the Boost EVO. Image provided by author. A Few Drawbacks Despite being a super cool little device, here are a few things I would’ve changed: 1. The Temperature Settings: I prefer lower temp dabs, so I would prefer the range of temperatures to come down a little — like 400 to 650 or so. Hot dabs can be painful and hard on your lungs and esophagus. 2. The Glass Design: There’s nothing in place to prevent spills from the glass piece, so be cautious when removing the adapter to reload the device. (I had this thing for a week and spilled it at least three times!) 3. The Glass Shape: Although the glass piece allows for a little water filtration and cooling, the length of the glass piece is very short, so the vapor doesn’t have much time to cool down before it hits your lips/lungs. If it were just a bit longer, it might improve the smoothness of the hit. 4. The Cost: The Boost EVO starts at $329 from To the Cloud Vapor, which is a little on the high-end. Factor in the $50 atomizer replacements, and this device is an investment. My Final Word If you have the money to spend, this is an excellent battery-operated, portable e-rig. I think it’s a little more directed towards the recreational user with the higher temps and flashing party lights, but overall the device felt like the high-end, quality device it’s supposed to be. The higher temps typically mean a more potent, more sedative, couch-locking effect, but for seasoned cannabis consumers, this is often the goal. *Disclaimer: This article contains an affiliate link that pays me a small commission should you decide to buy the device. I appreciate your support!
https://medium.com/seed-stem/dr-dabber-boost-evo-e7e4ba350e28
['Kristina Etter']
2020-12-17 16:01:03.755000+00:00
['Cannabis', 'Marijuana', 'Cannabis Technology', 'Review']
291
How Do We Shape the Future (?)
My one and only standard of morality is individual liberty.
https://medium.com/@dzkaltair/bagaimana-kita-membentuk-masa-depan-df63cb1a668e
['El Faizziyan']
2020-12-26 18:12:55.642000+00:00
['Universe', 'Future Technology', 'Physics', 'Science']
292
Crafting Software With Care
Testing Isn’t Optional Want to hear my two cents on pushing untested code? Don’t do it. It physically aches me to approve a PR without tests. In my experience, I’ve seen developers with a decade of experience do it, too. Don’t lower your standards and allow bad practices to creep in. Imagine you were asked to just write a few features initially and you thought the project was never going to come back, so you delivered a product without tests. Unfortunately, one day, the project did come back. Now, you have no documentation because it was done in a hurry (happens almost all the time) and can’t rely on your memory to remember what the requirements were, so basically you’re in soup. Tests are the single most effective way of documenting business requirements. Ideally, your code should be ironclad with tests, including tests that verify the business logic, integration with APIs and persistence layers, and UI/acceptance tests. You can’t always cover 100% of the codebase with, tests but the least you can do is make sure you add unit tests to cover most of the business logic. When code changes or tests break, you’ll know what changed and be able to rewrite the tests according to new requirements. Test-driven development is how you can be certain your code is doing what you were asked to do. It can be treated as the official documentation of the codebase. This is also how you save yourself from trouble during long-term maintenance. If your code has no tests. Get it to a state in which you can write some tests for important logic.
https://medium.com/better-programming/crafting-software-with-care-7fde33c85ab3
[]
2020-07-28 16:40:49.387000+00:00
['Software Development', 'Coding', 'Startup', 'Technology', 'Programming']
293
The Next Generation Of Artificial Intelligence
Although supervised learning, from autonomous vehicles to voice assistants, has driven tremendous progress in AI over the past decade, it has significant limitations. The process of labeling thousands or millions of data points manually can be incredibly costly and cumbersome. A big bottleneck in AI has been the fact that humans have to label data by hand before machine learning models can consume it. Unsplash At a deeper level, supervised learning reflects a narrow and circumscribed type of learning. Instead of being able to explore and absorb all the latent information, relationships, and implications in a given dataset, supervised algorithms focus only on the concepts and categories identified in advance by researchers. In comparison, unsupervised learning is an AI method in which algorithms learn from data without labels or guidance provided by humans. Many AI leaders see unsupervised learning as the next great frontier in artificial intelligence. In the words of AI legend Yann LeCun: "The next AI revolution will not be supervised." UC Berkeley Professor Jitendra Malik put it even more colorfully: "Labels are the heroin of the machine learning researcher." Unsupervised learning more closely mirrors the way people learn about the world: through open-ended experimentation and inference, without the "training wheels" of supervised learning. One of its fundamental benefits is that there will always be far more unlabeled data in the world than labeled data (and the former is much easier to come by). In the words of LeCun, who prefers the closely related term "self-supervised learning": "In self-supervised learning, a portion of the input is used as a supervisory signal to estimate the remaining portion of the input…. More knowledge of the world's structure can be acquired through self-supervised learning than from [other AI paradigms], since the data is unlimited and the amount of feedback provided by each example is huge." Unsupervised learning has also had a transformative effect on natural language processing. Thanks to a modern unsupervised learning architecture known as the Transformer, which emerged about three years ago at Google, NLP has seen tremendous progress recently. (For more on Transformers, see # 3 below.) Attempts to extend unsupervised learning to other areas of AI are at an earlier stage, but rapid progress is being made. To take one example, a company called Helm.ai aims to leapfrog the pioneers in the autonomous vehicle industry using unsupervised learning. Many researchers see unsupervised learning as the key to human-level AI. According to LeCun, "the biggest challenge of the next few years in ML and AI is mastering unsupervised learning." 2. Federated Learning Unsplash Data privacy is one of the overarching problems of the modern age. Since data is the lifeblood of modern artificial intelligence, problems with data privacy play an essential (and sometimes limiting) role in the trajectory of AI. Methods that allow AI models to learn from datasets without compromising their privacy, known collectively as privacy-preserving artificial intelligence, are thus becoming an increasingly important pursuit. Federated learning is perhaps the most promising path to privacy-preserving AI. Researchers at Google first proposed the notion of federated learning in early 2017. Interest in federated learning has exploded over the past year: in the first six months of 2020, over 1,000 research papers on federated learning were released, compared to just 180 in all of 2018.
The traditional approach to building machine learning models today is to collect all the training data in one place, often in the cloud, and then train the model on that data. But this approach is not feasible for much of the world's data, which cannot be moved to a central repository for privacy and security reasons, putting traditional AI methods off-limits. Federated learning solves this issue by flipping the conventional approach to AI on its head. Instead of requiring one single dataset to train a model, federated learning leaves the data where it is, distributed across numerous devices and servers on the edge. Copies of the model are sent out, one to each device holding training data, and trained locally on each subset of data.
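To see the idea in miniature, here is a toy sketch of federated averaging (FedAvg) in Python with NumPy. It is heavily simplified and not how any production framework implements it; real systems add client sampling, secure aggregation, and communication protocols.

import numpy as np

# Toy sketch of federated averaging (FedAvg), heavily simplified.

def local_update(weights, X, y, lr=0.1):
    """One gradient-descent step of linear regression on local data."""
    grad = 2 * X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

# Three clients, each holding private data that never leaves the device
rng = np.random.default_rng(0)
clients = [(rng.normal(size=(20, 3)), rng.normal(size=20)) for _ in range(3)]

global_weights = np.zeros(3)
for round_number in range(10):
    # Each client trains locally on its own data...
    local_weights = [local_update(global_weights, X, y) for X, y in clients]
    # ...and only the model updates are averaged centrally.
    global_weights = np.mean(local_weights, axis=0)

print(global_weights)

The raw data never moves; only model parameters travel between the clients and the server, which is the core of the privacy argument.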
https://medium.com/datadriveninvestor/the-next-generation-of-artificial-intelligence-44b2564eaa61
['Shaik Sameeruddin']
2020-10-17 18:04:22.197000+00:00
['Artificial Intelligence', 'Technology', 'Machine Learning', 'Deep Learning', 'Data Science']
294
A Guide to Effective Sales and Communication
A Guide to Effective Sales and Communication Quick tips on how to sell better and get things moving in your organisation. Photo by Tim Mossholder on Unsplash The art of selling becomes more prevalent and essential as you progress further in your career or expand your business. When I started as a developer, I only had to sell solutions to my team. After moving into a consultant role, I had to sell solutions to my team and clients. Now, being in a management role, I find myself constantly having to sell ideas/solutions to my team, partners/vendors, clients, senior management, and other related stakeholders. Progress through Sales I used to have the misconception that management does a lot of talking while the analysts contribute more to the progress of projects and get things done. Having gone through the transition, I've started to appreciate the art of sales in contributing to progress. Below are some examples of progress through sales in a business/team: Obtain support and buy-in from stakeholders — project proposals and funding requests for new digital initiatives, support from other departments, i.e. branding and marketing, risk assessment, solution implementation, etc. Offload priorities and venture into new fields — the nature of a digital lean startup is to explore new, potentially disruptive solutions constantly; frequent handovers are required to release capacity for new innovation and ventures. Recruit capable team members for expansion — positioning the team's initiatives, culture and achievements in the right light to attract potentially suitable candidates. This applies to partner engagements as well. The above examples are some highlights which resonate with me profoundly at this stage of my career. The list goes on — the point is that almost everything I do emphasises selling effectively. Enhance Sales and Communication In my short experience so far, I have found some general guidelines which have been helpful in getting an idea across better. A large part of selling involves conveying our intent clearly to the stakeholders. Here are three tactics to enhance your sales communication: Appeal to interest — in every conversation or email, always put yourself in the shoes of the recipient and ask "What's in it for me?"; you can explore putting the question as a placeholder text to guide and contextualise your thoughts. Use visuals, numbers and examples — charts, tables, and statistics are extremely useful in making your case concrete and "legit" — do your homework and reach out to the relevant parties to obtain numbers. Impression plays a great role in getting buy-ins. Keep things short and concise — most important stakeholders are perpetually busy; it's recommended to begin by going straight to the conclusion/point (address the "What do you need from me?" or "What do you suggest?"), with subsequent salient details in point form (limit to about 3). You can include further details and justifications upon request. Applying the above-mentioned tactics requires skill and deliberate practice. You'll have to constantly put in time and effort to sharpen your sales sword.
I have found these approaches useful in my attempt to get better: Write frequently and address the right questions — helps to organise, clarify, and guide your thoughts. Seek reviews/feedback iteratively — find someone experienced to review your email or pitch deck drafts to highlight points for improvement. Read often and see how others do it — to better present information via charts, etc., you can consider looking at examples of how top consulting firms present research information and sell solutions and ideas to senior stakeholders. Pitfalls to Avoid It's also important to bear in mind some of the pitfalls that could negatively affect your progress. Here are three pitfalls which I have faced throughout my journey so far: Fear, procrastination and not learning from feedback — a large part of sales is jumping in with both feet and dancing to whatever music is playing. Without taking action, you won't get feedback, which results in a lack of learning and improvement. Ignore the interests of stakeholders — a key challenge is managing and balancing stakeholders' interests. Often there may be compromises to be made, but it's ultimately your job as the driver of an initiative to manage expectations, since you're aware of the big picture. A lack of support from a stakeholder could pose a threat to your objectives. Overwhelm someone with information — I realised that when a request is too lengthy, there's a high chance that it gets ignored and goes unread; this leads to confusion and the need for reminders and follow-ups, which may prompt further clarification — resulting in unnecessary timeline delays and negatively affecting stakeholders' interests. I hope you won't make the same mistakes I did.
https://medium.com/the-internal-startup/a-guide-to-effective-sales-and-communication-384c5b540a21
['Jimmy Soh']
2020-06-30 08:19:38.685000+00:00
['Entrepreneurship', 'Startup', 'Technology', 'Presentations', 'Communication']
295
SVM: Feature Selection and Kernels
Data points on one side of the hyperplane will be classified as belonging to one class, while data points on the other side will be classified as a different class (e.g. green and red as in Figure 2). The distance between the hyperplane and the first point on either side (for each of the different classes) is a measure of how sure the algorithm is about its classification decision. The bigger the distance, the more confident we can be that the SVM is making the right decision. The data points closest to the hyperplane are called Support Vectors. Support Vectors determine the orientation and position of the hyperplane so as to maximise the classifier margin (and therefore the classification score). The number of Support Vectors the SVM algorithm should use can be chosen depending on the application. Basic SVM classification can be easily implemented using the Scikit-Learn Python library in a few lines of code. from sklearn import svm trainedsvm = svm.SVC().fit(X_Train, Y_Train) predictionsvm = trainedsvm.predict(X_Test) print(confusion_matrix(Y_Test,predictionsvm)) print(classification_report(Y_Test,predictionsvm)) There are two main types of SVM classification algorithms, Hard Margin and Soft Margin: Hard Margin: aims to find the best hyperplane without tolerating any form of misclassification. Soft Margin: we add a degree of tolerance in SVM. In this way we allow the model to voluntarily misclassify a few data points if that can lead to identifying a hyperplane able to generalise better to unseen data. Soft Margin SVM can be implemented in Scikit-Learn by adding a C penalty term in svm.SVC . The bigger C, the more penalty the algorithm gets when making a misclassification. Kernel Trick If the data we are working with is not linearly separable (therefore leading to poor linear SVM classification results), it is possible to apply a technique known as the Kernel Trick. This method is able to map our non-linearly separable data into a higher-dimensional space, making our data linearly separable. Using this new dimensional space, SVM can then be easily implemented (Figure 3). Figure 3: Kernel Trick [3] There are many different types of Kernels which can be used to create this higher-dimensional space; some examples are linear, polynomial, Sigmoid and Radial Basis Function (RBF). In Scikit-Learn a Kernel function can be specified by adding a kernel parameter in svm.SVC . An additional parameter called gamma can be included to specify the influence of the kernel on the model. It is usually suggested to use linear kernels if the number of features is larger than the number of observations in the dataset (otherwise RBF might be a better choice). When working with large amounts of data, the speed of RBF might become a constraint to take into account. Feature Selection Once we have fitted our linear SVM, it is possible to access the classifier coefficients using .coef_ on the trained model. These weights represent the coordinates of a vector orthogonal to the hyperplane, and their signs indicate the predicted class. Feature importance can, therefore, be determined by comparing the sizes of these coefficients to each other. By looking at the SVM coefficients it is possible to identify the main features used in classification and get rid of the unimportant ones (which hold less variance).
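As a short sketch of what accessing those coefficients looks like in Scikit-Learn (the data and feature names below are synthetic stand-ins, not the article's dataset):

import numpy as np
from sklearn import svm

# Rank features by the magnitude of the linear SVM coefficients.
# The data is synthetic; swap in your own X_Train and Y_Train.
feature_names = np.array(["feature_a", "feature_b", "feature_c", "feature_d"])
rng = np.random.default_rng(42)
X_Train = rng.normal(size=(100, 4))
Y_Train = (X_Train[:, 0] + 0.2 * X_Train[:, 2] > 0).astype(int)

trainedsvm = svm.SVC(kernel="linear").fit(X_Train, Y_Train)

# For a binary classifier, coef_ holds one weight per feature
coefs = trainedsvm.coef_[0]

# Larger absolute value means a more influential feature
for i in np.argsort(np.abs(coefs))[::-1]:
    print(f"{feature_names[i]}: {coefs[i]:.3f}")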
Reducing the number of features in Machine Learning plays a really important role, especially when working with large datasets. This can in fact speed up training, avoid overfitting, and ultimately lead to better classification results thanks to the reduced noise in the data. Figure 4 shows the main features I identified using SVM on the Pima Indians Diabetes Database. The features corresponding to negative coefficients are shown in green, and the ones with positive coefficients in blue. If you want to find out more, all my code is freely available on my Kaggle and GitHub profiles.
https://towardsdatascience.com/svm-feature-selection-and-kernels-840781cc1a6c
['Pier Paolo Ippolito']
2019-09-12 13:23:44.671000+00:00
['Technology', 'Artificial Intelligence', 'Machine Learning', 'Data Science', 'Programming']
296
All Bank Account Balance Check App
You can check your bank account with banking apps. There are many apps available for this, and some can help you manage multiple accounts from one app. I am mentioning one such app here; you can do a number of things with it, like: 1) Check bank balance 2) Check mini statement 3) Call bank customer care 4) Find the nearest banks 5) Find the nearest ATMs 6) And more, all for free. This app covers top American and Indian banks. Here is the app; click below to open it.
https://medium.com/@ssingh09001/all-bank-account-balance-check-app-960bb22c2e2e
[]
2020-12-23 07:21:40.090000+00:00
['Banks', 'Banking Technology', 'Bankruptcy Attorneys', 'Banking', 'Bankruptcy']
297
A Blizzard-Developed Take On ‘Souls’ Feels Inevitable
A Blizzard-Developed Take On 'Souls' Feels Inevitable It feels like it's only a matter of time until Blizzard makes a game that follows in the footsteps of Dark Souls Fergus Halliday Follow Dec 16, 2020 · 8 min read Although Activision-Blizzard's recent string of mass firings might point you in the other direction, the reality is that the corporate machine behind one of PC gaming's most beloved studios is ramping up its production capacity in a real way. For a long time, and prior to its acquisition by Activision, Blizzard was known mostly for two things: hardcore PC games like World of Warcraft, and releasing games when they're done. If you were a Blizzard fan, you might get a new game once every three or four years and you'd be happy about it. Nowadays, that's become less and less the case. New esports and casual-friendly franchises like Overwatch and Hearthstone have blended their way into the company's DNA; classic series like Diablo have found new homes on consoles like Xbox, PlayStation, and Nintendo Switch; and things like Warcraft 3: Reforged and the BlitzChung controversy have been a sobering reminder that sometimes even Blizzard can mess up a good thing. To put a finer point on it, the Blizzard Entertainment of today makes more games and tends to do so faster than its legacy counterpart. In fact, according to Blizzard president J. Allen Brack, the company has "more live games and unannounced projects than at any point in the company's history." In a further departure from the company's legacy of PC-oriented gaming, many of these projects are likely going to be mobile games. After Call of Duty: Mobile, Activision-Blizzard is hungry for a larger slice of the mobile gaming market. Speaking in a recent investor call, Activision Blizzard president and COO Daniel Alegre admitted the company plans to take all of its popular franchises to the mobile market. However, irrespective of whether or to what degree the future of Blizzard is mobile-focused, it feels like it's only a matter of time until Blizzard makes something that follows in the footsteps of From Software's enormously popular Dark Souls series. Blizzard has always been known for making good games, but it has rarely had a reputation for making particularly innovative ones. Before World of Warcraft, there was EverQuest. Before Hearthstone, there was Magic: The Gathering Online. Before Overwatch, there was Team Fortress 2. Blizzard has rarely been the company that breaks new ground or invents new genres, but it's usually the one that takes the time necessary to do a genre justice. Its strengths have always been in the execution and in finding the right balance of accessibility, depth, and polish. Within that context, a Blizzard-developed take on the Souls formula makes a lot of sense. Blizzard didn't invent the Souls-like, but it could probably make a pretty compelling riff on the formula, and there's certainly a market for it.
https://medium.com/super-jump/a-blizzard-developed-take-on-souls-feels-inevitable-9868818b2713
['Fergus Halliday']
2020-12-16 13:26:08.067000+00:00
['Features', 'Gaming', 'Digital Life', 'Technology', 'Videogames']
298
PORK: A Technology Resilience Framework
The story of technology today is complexity. Increasingly complicated software deployed in complex and unpredictable environments that seem to be slipping more and more out of our control. Daunting stuff, this complexity. A Prediction: The code you're writing today won't be the same code running a year from now. The machines hosting your services today won't be the same machines in a year. Your services will fail occasionally. Your business partners will change their minds about features. Hardware will need security patches to address vulnerabilities. Network blips will disrupt and change the topology of your system. Your end users will interact with your applications in unexpected ways. Your business will scale significantly (hallelujah!), but load will be uneven. Intraday releases will mean services temporarily restart at inopportune times. More examples of complexity, you ask? No problem! Distributed systems, consensus algorithms, machine learning. Team collaborations, human emotions and miscommunications. Feature enhancements, bug fixes, new competition, vendor integrations and organizational restructuring. All this, and we're just scratching the surface of the complex factors that could impact our technology systems. Engineers are working within a truly unpredictable and complex environment from a technology, business and human perspective. We should acknowledge this head-on by building systems to be responsive to an uncertain future. Hungry For Resilient Software Given the new reality of complexity and constant change, our approach to building technology systems must adapt. At Rocket Mortgage Technology, we're striving to transform the mortgage industry as well as the technology that runs it. In my role as a Software Architect, I work with a suite of amazing teams that build custom software solutions for our Capital Markets and Treasury business partners. These business areas are critical to the success of our organization, so we need our systems to reflect that criticality. Due to the volatility of the financial markets and the speed with which the mortgage industry changes, this is an exciting and ever-present challenge. Our mission in architecting systems, therefore, is to build systems that are both resilient and flexible enough to turn on a dime. What Is Resilient Software? Let's define resiliency as the measure of how a system withstands stress and responds to unforeseen errors and urgent feature requests. More broadly, resiliency is a measure of how a system embraces change. As the needs of our clients change, our systems must change accordingly. This article details my teams' approach to building resilient systems through our PORK Resiliency Framework. More than 100 billion dollars of loan originations flow through our systems each year, so the stakes are high. Our framework is constructed of four principles designed to take the fear out of decision-making in this complex and business-critical environment. For several years now, these four principles have been the cornerstone of the systems we build and the measuring stick to guide our technology decisions. Engineers, architects, product owners and leadership have rallied around a shared goal to deliver resilient software, and these principles have led the way.
The Four Principles Of PORK Our teams strive for fast, frequent and confident releases of our software, but how do we balance the safety of our system with the rapid change that comes with frequent releases? Well, imagine if you could say to your engineers, “Don’t be afraid of making mistakes because we have the tools in place to quickly find and easily correct them!” This is the essence of our framework. We strive for resiliency by observing errors as early as possible, then recovering quickly (ideally before impacting our clients in the first place). With this approach, we’re giving our engineers the runway to experiment, innovate and make mistakes. At the same time, we’re relieving the pressure from our engineering teams as they no longer need to aim to be perfect, bug free or predict the future. To liberate our team members from the fear of mistakes is to empower them to move at a higher velocity and deliver business value faster. So, what are the four principles of the PORK resiliency framework? Predictability Observability Recoverability Keep it simple Predictability Just because we’re working in an unpredictable and complex environment doesn’t mean our code should be unpredictable, too. Our first core principle asserts that we value predictability over correctness. This may appear counter-intuitive at first glance, but an engineering team should design a system to behave the same way in all circumstances. If a system is wrong but wrong in the same way every time, then we’re well-positioned to quickly diagnose and fix the problem. Predictability also enables the team to better explain, support and evolve the system over time. We believe that the same inputs fed into a system in the same order should always result in the same output. Why is this deterministic behavior important? Replicating a bug is often the hardest part of an engineer’s job. We have great talent on our engineering teams, and they’re adept at writing code to fix a bug — that’s typically the easy part. Most of the battle can be the task of reproducing the problem in the first place. With a focus on predictability, we can take a notoriously tricky task and make it trivial. If something goes wrong in a predictable system, it’s simple to pass the same inputs through our code again and replicate the behavior so we can diagnose what went wrong. From there, we can leverage our automated toolchain to fix it. This approach may sound simple, but there are a lot of forces at play here. To make your system predictable, you’ll need to delve deep into the underpinnings of your system design, likely pursuing advanced techniques, such as immutability, retry policies, idempotency, event-driven programming and other techniques. Challenge yourself and your teams to make engineering choices that reinforce predictability throughout your technology stack, from code to infrastructure and the space between. How do our teams achieve this? We invest in several tools and patterns, which I’ll discuss later in the article. Observability Broadly speaking, observability means we’re designing systems with visibility and diagnostic tooling in mind. Lack of a cohesive and reliable observability strategy complicates how we triage issues, impacts system stability and ultimately erodes trust between a technology team and its clients. If we’re confident we can identify when our system is behaving normally (healthy), then we should also be able to identify when something is behaving out of the norm (unhealthy). 
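As a toy illustration of that idea (the metric and threshold here are invented, not taken from our production stack), a health monitor in Python might track a rolling error rate and flag when it drifts out of the norm:

from collections import deque

# Toy illustration of observability: flag when a rolling error rate
# drifts outside the norm. The metric and threshold are invented.

class ErrorRateMonitor:
    def __init__(self, window_size=100, threshold=0.05):
        self.outcomes = deque(maxlen=window_size)  # True = request failed
        self.threshold = threshold

    def record(self, failed):
        self.outcomes.append(failed)

    def is_unhealthy(self):
        if not self.outcomes:
            return False
        error_rate = sum(self.outcomes) / len(self.outcomes)
        return error_rate > self.threshold

monitor = ErrorRateMonitor()
for failed in [False] * 90 + [True] * 10:  # simulated traffic
    monitor.record(failed)

if monitor.is_unhealthy():
    print("ALERT: error rate above threshold, page the support team")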
Observability

Broadly speaking, observability means we're designing systems with visibility and diagnostic tooling in mind. Lack of a cohesive and reliable observability strategy complicates how we triage issues, impacts system stability and ultimately erodes trust between a technology team and its clients. If we're confident we can identify when our system is behaving normally (healthy), then we should also be able to identify when something is behaving out of the norm (unhealthy). Upon observing unhealthy behavior, we can alert our support teams proactively and correct it before it impacts our clients. By contrast, if we don't have the appropriate visibility into what our system is doing, then we don't know when something is misbehaving, and we don't know when it warrants our attention to fix it. This almost certainly leads to unfortunate client impact, and what's worse, our beloved clients are the ones notifying us that the system is broken. Shouldn't we already know?

Observability is about quick and effective root cause analysis, not a few flashy visualizations to impress your leadership team. Rather than getting distracted by polished graphs and charts, I recommend you focus on a chain of diagnostic tools. We want three types of tools in this toolchain:

Proactive alerts that call out unhealthy trends
Tools to form a hypothesis of what has gone wrong (e.g. dashboards for logs, metrics, traces)
Tools to confirm or deny the hypothesis

Once you have confirmed your hypothesis, you can leverage our principle of Predictability to replicate the problem and test your bug fix. Voila! If we're aware we have a problem (Observability), and we can quickly replicate and diagnose the problem (Predictability), then we are in good shape to correct the problem quickly.
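As a sketch of the first link in that toolchain (the proactive alert), imagine a rolling error-rate check like the Python below. This is illustrative only; in practice the signal would come from your metrics or logging platform rather than hand-rolled code:

```python
from collections import deque


class ErrorRateMonitor:
    """Tracks the outcomes of recent requests and flags unhealthy trends.

    A real deployment would emit this signal to an alerting system;
    this sketch only shows the shape of the 'proactive alert' step.
    """

    def __init__(self, window: int = 100, threshold: float = 0.05):
        self.outcomes = deque(maxlen=window)  # True = success, False = error
        self.threshold = threshold

    def record(self, success: bool) -> None:
        self.outcomes.append(success)

    def is_unhealthy(self) -> bool:
        if not self.outcomes:
            return False
        error_rate = self.outcomes.count(False) / len(self.outcomes)
        return error_rate > self.threshold


monitor = ErrorRateMonitor(window=100, threshold=0.05)
for ok in [True] * 90 + [False] * 10:    # simulate a burst of failures
    monitor.record(ok)
print(monitor.is_unhealthy())  # True: a 10% error rate exceeds the 5% threshold
```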
Recoverability

When building a system, engineers have a tendency to try to make it bulletproof against failure. This seems reasonable at first glance, but ultimately proves unrealistic and restrictive. Engineers should be thoughtful and defensive in their coding practices, but if they're not careful, bulletproofing can lead teams down a path of fear, long-lived testing environments, days or weeks of extensive testing and planned maintenance windows. This approach will undoubtedly manifest itself in brittle systems and slow software delivery. As we shift our mindset toward embracing unpredictability, rather than avoiding failures at all costs, we should instead be thinking about how quickly we can respond to inevitable failures. You cannot avoid failure altogether, but you can very much control how quickly a team responds to it. We need not bulletproof our systems if we can quickly observe system issues and repair them before our business partners or clients feel the impact.

Recoverability is particularly important when considering how teams manage the data flowing through their systems. Data, in many ways, is a system's most precious resource and the backbone of our business. If your system encounters an outage, chances are you'll lose some data and need to recover it. Have a recovery plan in place for this in advance! Have a plan to repopulate your data in caches, event streams and data stores. For example, can you work with the system of record to provide a full "state of the world" snapshot to repopulate all your data on-demand? You should have layers of protection in place and, ideally, you should practice how you use them. Don't wait for a production outage to hatch a recovery plan. Do it now.

We leverage many additional techniques for recovery as well, such as approaching system design with a disposable mindset. The disposable mindset encourages teams to avoid building precious, brittle systems. Instead, support teams can quickly tear down a misbehaving application and spin up a fresh new application in its place. During critical outages, we favor this approach for quick recovery so our clients can get back to work as quickly as possible. Later, once the urgency has diminished, our engineers will delve deep into the observability data we have collected to explain what led to the outage in the first place. A disposable mindset often requires stateless services, a careful plan around managing your data (as discussed above) as well as some modern tooling like Docker containers and container orchestration, which I'll briefly discuss later. My recommendation is to shoot for the moon and build a reset button that will quickly restore an unhealthy system to health in an automated fashion. This will certainly take a lot of work, but it's a noble goal. Remember, due to the unpredictable nature of our complex environments, you can't avoid failure, but with principles such as these, we learn to build resilient systems from unreliable parts.
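Here is a minimal Python sketch of that reset button. The SystemOfRecord interface is an assumption made for illustration; the real shape depends entirely on your data stores:

```python
class SystemOfRecord:
    """Stand-in for the authoritative data store (an assumption for this
    sketch; the real interface depends on your system of record)."""

    def full_snapshot(self) -> dict:
        # Returns a complete "state of the world" snapshot on demand.
        return {"loan-1": "active", "loan-2": "closed"}


def rebuild_cache(source: SystemOfRecord, cache: dict) -> int:
    """A 'reset button': throw away suspect local state and repopulate it
    from a full snapshot, rather than patching records by hand."""
    cache.clear()                       # dispose of the unhealthy state
    snapshot = source.full_snapshot()   # authoritative copy, fetched on demand
    cache.update(snapshot)
    return len(cache)


cache: dict = {"loan-1": "possibly corrupted"}
restored = rebuild_cache(SystemOfRecord(), cache)
print(f"restored {restored} records")   # cache now mirrors the system of record
```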
Keep It Simple

Simple, elegant solutions are less brittle, less likely to break and easier to fix when they do break. As we design and implement systems, we're typically addressing multiple goals across multiple time horizons at once. These include the initial delivery of a new system, but also the ease of maintaining the system for the years to come and the ease with which our system can evolve over time to meet our clients' evolving needs. Given the time-shifting, shape-shifting nature of these goals, unless we have a healthy respect for complexity, the software project is almost certain to fail. In my experience, the single biggest indicator of the success of a software project is simplicity. It's also the single biggest factor in reasoning about complex systems when you're urgently alerted to a production issue at 3:00 a.m. Simplicity allows for a shared mental model across your team. Can every engineer on your team draw a reasonably accurate diagram of how your system is architected? I hope so. Our systems act as a record of all the hundreds and thousands of decisions we've made along the way. Build the simplest thing that will solve the problem. Be deliberate in your decision-making process and be able to justify your decisions. Because we must live with these decisions, maintain them, support them and debug them, it is in our collective best interest that we keep it simple.

Tools For The Job

As described, the principles of our PORK Resiliency Framework are conceptual in nature, thus not overtly opinionated with regards to programming languages, utilities or technology stacks. That said, I'll share some tooling and best practices that my teams leverage today to achieve the goals of our framework. We expect these tools will change over time, but we also expect the four core principles will continue to apply, nevertheless.

Infrastructure As Code (IaC)

We use Terraform, which helps provision predictable infrastructure and avoid the hard-to-debug environment-drift issues that arise from manual configuration across multiple environments.

Continuous Integration/Continuous Delivery (CI/CD)

No surprise here. Tooling in this space is widely available to introduce automation, which offers us predictable and immutable build artifacts (for example, Docker images) and predictable promotion across environments. We use CircleCI and custom tooling to achieve this across our systems.

Docker

In our world, Docker refers to the containers as well as the orchestration layer that manages those containers. We love Docker as it promotes a disposable mindset, ensures all dependencies are documented in code and prevents snowflake environments. If a Docker container dies, another will be spun up dynamically by the container orchestrator (Docker Swarm, Kubernetes, ECS) to replace it. Because containers are immutable, the new one is just as good as the one it replaced. The orchestration platform also offers us readily available infrastructure (compute on demand), not to mention a variety of predictability-focused tooling such as software-defined networking, declarative files to describe the desired state of your system (Docker Compose, Kubernetes manifests) and a number of other abstractions to help with engineering productivity and resilient system design.

F#

We use C# in many scenarios, and C# is used more broadly across our company, but we love F# in this specific arena. F# is a functional language that addresses many of our predictability concerns elegantly (through first-class constructs around immutability and side-effect-free processing). That said, a functional language is not mandatory. You can certainly achieve predictability through C#, Python or other languages, though it will require more deliberate decision-making and the engineering discipline to stay the course when the predictability path is not the easiest way forward. A functional language makes sense to us because we can emphasize immutability, testability and other predictability concerns at the code level, just as we already do at the service level and the system level. This allows for a cohesive predictability strategy up and down our technology stack.

Kafka

We love Kafka because, like the other tools mentioned here, it makes good engineering practices simpler to implement. It's easy to integrate and deprecate services, so your system can evolve as needed over time. We believe immutable data is easier to manage, and Kafka is a durable log of immutable events kept in strict order (per partition). Your teams don't need to reinvent the wheel and write custom code to achieve resiliency through buffering, redundancy, scalability and fault tolerance. These and other useful patterns and principles come out of the box with Kafka. Furthermore, Kafka allows us to process data as events happen in the real world, rather than wait for a user request. Thus, we're processing earlier and can observe anomalies sooner. This flexibility allows us to fix issues even before the user invokes a request (thus, no client impact). When we do encounter a bug, Kafka allows us to rewind the consumer offset and replay the data inputs we received earlier. This functionality allows us to diagnose the problem easily and get back to health, as described in our predictability and recoverability principles.
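As a sketch of that replay workflow, the Python below rewinds a consumer to a known-good offset using the confluent-kafka client. The broker address, topic name and offset are placeholders, and the handler is a stub; the point is that the replayed events flow back through the same code path that originally misbehaved:

```python
from confluent_kafka import Consumer, TopicPartition


def process(payload: bytes) -> None:
    # Stand-in for the real handler under investigation.
    print(payload)


consumer = Consumer({
    "bootstrap.servers": "localhost:9092",   # placeholder broker
    "group.id": "replay-debugger",
    "enable.auto.commit": False,             # replaying, not advancing the group
})

# Rewind partition 0 of the topic to a known-good offset and replay
# the inputs exactly as they originally arrived, in order.
consumer.assign([TopicPartition("trade-events", 0, 1000)])

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            break                  # caught up; nothing more to replay
        if msg.error():
            raise RuntimeError(msg.error())
        process(msg.value())
finally:
    consumer.close()
```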
Splunk

We use Splunk for several reasons across our organization, but in the context of PORK, Splunk helps us achieve our observability goals using logs. At a high level, Splunk helps us describe, explain and alert on the health of our systems. We use Splunk for building dashboards and alerts that show anomalies. We leverage the Splunk query language (SPL) to quickly drill down into request-level context to troubleshoot and diagnose production issues as they're happening.

Grafana

Metrics help us describe and alert on the health of our system, but unlike request-level logs, they struggle to explain the state of our system because metrics intentionally strip away the context of individual requests. We love Grafana, nonetheless, but we focus it where it brings the most value: quickly and efficiently graphing trends that don't require drilling down into request-level data. We find that Grafana is great at visualizing resource utilization (CPU, RAM, disk) and HTTP endpoint status codes.

Summary

With the PORK Resiliency Framework, we're attempting to address the increasing complexity of technology and our environment by turning system design on its head, such that we no longer over-emphasize the initial delivery of a system. Instead, we ask ourselves: How will our system respond to an urgent business need down the road? How will we react when we have a bug? Do we have the right tooling in place to respond quickly and confidently? The answers to these questions will reveal how resilient a team's systems truly are. Resiliency, after all, is a quality that applies not only to the systems but to the teams that run them. How do you ensure resiliency in your systems? Let me know in the comments!
https://medium.com/rocket-mortgage-technology-blog/pork-a-technology-resilience-framework-745207bd28d5
['Rocket Mortgage Technology']
2020-09-15 21:58:15.184000+00:00
['Technology', 'Developer Tools', 'Software Development', 'Software Engineering', 'Software Architecture']
299
Blockchain applications in Airlines
Currently, travelers are required to show their IDs at multiple checkpoints: from entering the airport to the luggage drop-off to buying stuff from the duty-free shops. Further, the long queues at security check-in and border control checkpoints are some of the other key pain points that travelers have to go through. Blockchain technology has the potential to simplify this process and eliminate the need to rummage through your bag repeatedly to take out IDs and various other documents for verification.

Digital ID on Blockchain

With a Blockchain-based system, users will be able to create their own digital IDs, entirely controlled by them. In other words, they can decide who can view the information stored on the ID, and even what part of the information should be shared with any specific viewer. For this, the user will first be required to download the Blockchain-based digital ID creator app and upload personal verification documents like a passport and visa, along with biometric data like fingerprints, eye scans and voice. This data will then be uploaded on the Blockchain, and a unique hash will be generated for the user's information.

[Figure: Creating a digital ID on Blockchain]

Next, this data will be sent to a government agency for verification. The agency will check the user's data against the central database and will either approve or reject it. The verification status of the uploaded documents and ID will be stored on the Blockchain. When users need to take a flight, after reaching the airport, they can simply go to a self-check-in security booth and share their verified digital ID. The booth will then capture their biometrics, like a fingerprint or facial scan, and verify them against the verified digital ID. On verification, a QR code will be generated, which the user can scan through the mobile app and use in further rounds of verification at the airport, saving a lot of precious time and the hassle of taking out a passport and other documents again and again.

[Figure: At the self-check-in booth, a QR code is generated after verification of the digital ID, which can be used for further verification at the airport]

In the case of international travel, users can also share their verified digital ID with the border agencies well before their arrival dates. This will initiate the risk assessment in advance, and the actual process will be much faster and smoother when the user arrives at the border. As the digital ID is government verified, the user can even be spared the hassle of showing a passport.
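A minimal Python sketch of the hashing step is shown below. The field names are illustrative, and real identity schemes involve far more (salts, selective disclosure, key management), but the core idea is that only a fingerprint of the data needs to live on the chain:

```python
import hashlib
import json


def digital_id_hash(identity: dict) -> str:
    """Derive a unique fingerprint of a user's identity documents.

    Only this hash would be anchored on the Blockchain; the underlying
    documents stay with the user.
    """
    # Canonical serialization so the same data always yields the same hash
    canonical = json.dumps(identity, sort_keys=True).encode("utf-8")
    return hashlib.sha256(canonical).hexdigest()


record = {
    "passport_no": "K1234567",                        # illustrative values
    "fingerprint_template": "base64-encoded-template",
    "face_scan": "base64-encoded-scan",
}
print(digital_id_hash(record))
# Any verifier holding the same documents can recompute the hash and
# compare it with the on-chain value to confirm nothing was altered.
```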
Baggage Tracking through Blockchain

What comes to your mind when you think of travel? Packing your bags and leaving for that vacation you have been planning all month long. With your bags ready, you leave for the airport thinking about the wonderful beaches you will hit or the mountains you will conquer. But as you leave your bags at the baggage drop-in counter, a slight sense of unease creeps in, and you become worried about your luggage carrying all your important stuff and gear. You just wish that it gets loaded on the right plane, and in case you have a connecting flight, your concern reaches the next level. And all of this happens because you have no clear visibility of your luggage.

In the current system, there is no way to track your luggage at any point during your journey. The main reason behind this is the lack of data exchange among the parties involved in the luggage transfer. The responsibility for every luggage item repeatedly changes during the journey, and when the baggage changes hands, the relevant information is uploaded and stored in the handlers' respective local and private systems. This information is not shared with other stakeholders, which makes backtracking any luggage difficult and complicated. This is especially true for multi-stop flights, because more airports and authorities are involved in the luggage transfer, which in turn makes the data sharing complex and confusing.

[Figure: When baggage changes hands, information is stored in local systems and is not shared with other stakeholders]

But in a Blockchain-based system, the data stored will be secure, immutable and accessible to all the stakeholders involved. Each bag will be marked with a unique code or number, and each traveler will be given a corresponding unique number to track their luggage. When the bag moves through a security scanner or tracking system, the data will be uploaded and recorded on the Blockchain. Every stakeholder on the Blockchain can track the entire journey of the baggage, and every passenger can see the location of their bags in real time. This will give them the peace of mind that their bags have been loaded onto the right plane, even during multi-stop flights. Further, it will also speed up the tracing of lost baggage, as every stakeholder will have a clear picture of the luggage, its last tracking point and the authority responsible for it, leading to a significant improvement in the customer experience for flyers.

[Figure: Tracking bags in real time on the Blockchain]

Redeeming Loyalty Points on Blockchain

Loyalty points and air miles are some of the most important ways through which airlines retain repeat customers. But in the current system, customers have to wait until they have substantial loyalty points accrued in their account. Further, the usage of these points is limited to very specific places, which leads to a lot of points going unused or expiring. But with Blockchain-based loyalty points, the points can be redeemed at various other partner outlets, increasing the usage and attractiveness of these points to consumers. Very recently, a Blockchain-based loyalty program was launched by Singapore Airlines for its flyers. They have developed a digital wallet named Krispay, in partnership with Microsoft and KPMG, where their flyers can convert their air miles into units of payment that can be used at partner outlets in Singapore. Customers are provided with a mobile app. Using this app, customers can convert their miles into units of payment and use these units to pay at registered outlets by simply scanning a QR code.

Flight Insurance Payout Using Smart Contracts

Many airline carriers and their travel partners sell flight delay insurance along with flight tickets. But when the flight gets delayed, we don't know whom to approach to get our insurance claim. The whole claim process is opaque, and no one knows what steps are involved or what compensation they will get if their flight gets delayed. But with Blockchain and smart contracts, the process will be simplified. This has been implemented by AXA, where flyers can buy flight delay insurance by paying a premium. Under this insurance scheme, if the flight gets delayed by more than 2 hours, the smart contract gets executed, and the insurance claim amount gets automatically transferred into the flyers' accounts.

[Figure: Flight insurance payout on Blockchain]

For buying this smart contract-based insurance, a person first needs to register their flight details on the service provider's platform and fill in their identity and account details. Then they can pay and buy the insurance plan. If the flight gets delayed beyond the stipulated time defined in the smart contract, an automatic payout is triggered by the smart contract into the flyer's bank account. This payout happens without any manual intervention from any of the stakeholders involved, making it swift and transparent.
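Production smart contracts are written for a Blockchain platform (for example, Solidity on Ethereum); the plain-Python sketch below only models the payout rule described above, with invented names and amounts. In smart-contract terms, the actual departure time would come from a trusted flight-data feed (an oracle):

```python
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class DelayInsurancePolicy:
    """Plain-Python model of the payout rule encoded by the contract."""
    flyer_account: str
    payout_amount: float
    max_delay: timedelta = timedelta(hours=2)

    def settle(self, scheduled: datetime, actual: datetime, ledger: dict) -> bool:
        # Delay condition checked against the (oracle-supplied) actual time.
        if actual - scheduled > self.max_delay:
            balance = ledger.get(self.flyer_account, 0.0)
            ledger[self.flyer_account] = balance + self.payout_amount
            return True   # payout triggered automatically, no claim filed
        return False


ledger: dict = {}
policy = DelayInsurancePolicy("flyer-123", payout_amount=100.0)
paid = policy.settle(datetime(2021, 12, 1, 9, 0), datetime(2021, 12, 1, 11, 30), ledger)
print(paid, ledger)   # True {'flyer-123': 100.0} -- a 2.5-hour delay pays out
```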
Thanks for reading!
https://medium.com/techskill-brew/blockchain-applications-in-airlines-36cc7b86afe5
['Techskill Brew']
2021-12-17 07:05:06.938000+00:00
['Airlines', 'Blockchain', 'Aviation', 'Blockchain Startup', 'Blockchain Technology']