Dataset schema (field and string-length range): title (1-200), text (10-100k), url (32-885), authors (2-392), timestamp (19-32), tags (6-263)
Me After Twelve Hours With My Son or Hannibal Lecter: Who Said It?
Me After Twelve Hours With My Son or Hannibal Lecter: Who Said It? “I imagine your little brother must smell almost as bad as you do by now.”
https://medium.com/frazzled/me-after-twelve-hours-with-my-son-or-hannibal-lecter-who-said-it-9483a5c710ee
['Elle Rogers']
2020-07-29 12:01:01.734000+00:00
['Humor', 'Lists', 'Satire', 'Miscellany', 'Parenting']
Mental Health Workers: The Other First Responders in this Pandemic
Mental Health Workers: The Other First Responders in this Pandemic Why mental health workers need safeguarding and protection more than ever Photo by Ani Kolleshi on Unsplash In response to the COVID-19 outbreak, there has been a lot of talk in the media about the importance of preventing overload to our health care system. What we are only starting to hear about is its (anticipated) overwhelming impact on mental health services… and its providers. Mental health workers are just as prone to burnout as any other care providers. Given the predicted impact of this pandemic on the well-being of virtually everyone, we need these service providers to be stronger than ever. Mental health workers need protection now, to ensure that they are available and prepared for the coming mental health crisis. Nobody is immune to the pandemic’s negative mental health effects. Social distancing measures keep loved ones apart, businesses and schools closed, people without steady work or income, and other services overloaded. The stress to essential workers and their families is obvious. We are living in a time of great uncertainty, with no end in sight. Psychiatrists Dr. Andrew L. Smith of The Ottawa Hospital and Dr. Neil de Laplante of the University of Ottawa support that while social distancing measures are necessary, they could “lead many of us to feelings of isolation and powerlessness”, which are both linked to anxiety, depression, and possibly suicide. The pandemic could be detrimental for those already facing complex mental health challenges: chronic anxiety, trauma, delusional paranoia, and addictions to name a few. There are people without the cognitive ability to understand the situation. Some are without a working memory. And others could be further subjected to daily abuse. Health care workers and first responders are also high-risk groups for mental health challenges during this pandemic. They face prolonged exposure to high-stress work environments, in addition to the personal struggles we all face waiting out COVID-19. Post-trauma effects are a real threat to this group. Positively, there are mental health resources specific to first responders, but I wonder how accessible or effective these services will be long-term. With the anticipation of a mental-health crisis secondary to this pandemic, mental health services could be more important than ever. Undoubtedly, we will need a strong system to help those most at-risk to the consequences of this pandemic, and to continue caring for our most vulnerable persons. To maintain a strong mental health system, we must protect and safeguard its most important resource: its workers. Without its workers, the mental health system will breakdown, and this is already starting to happen. As a mental health worker, I can tell you that many of us will be or are on the verge of burnout, the longer this situation continues as is or worsens. The state of the world is pushing people into survival mode; those who feel powerless and unprotected will resort to fighting for themselves. In survival mode, one can hardly help his or herself let alone others. In my field, workers are already walking off the job for self-preservation. I am a front-line mental health worker supporting persons who require 24-hour care. I work in a home in close proximity to about 8–11 people (workers and residents) during my shift, where social distancing is near impossible, and PPE availability is extremely limited. 
Last week, I was informed for a second time that someone in the home would be tested for COVID-19. I was left to wonder what will happen if the results are positive. How will we manage a potential outbreak at work? Will I be put off work to isolate? Have I exposed the elderly person I live with to the virus? What will I tell my family if the result comes back positive? For confidentiality reasons, I cannot give specific details. I can tell you that at this time, there is no confirmed case of COVID-19 at my place of work. But this is a possibility that I need to be prepared for, and yet, there is very little certainty on what will happen if there is an outbreak. Not knowing what to expect leads me to feel powerless and anxious. Still, I try to remain strong and composed for the people I support, who depend on my care. I support people who have a very limited understanding of what this pandemic means. For them, it means a lot of changes to regular routine, a lack of contact with the outside world, and a lot of waiting and boredom. Without structure and purpose, their risk of self-destructive behaviour increases. It cannot simply be explained to them what might happen if there is an outbreak in the home. This could create further anxiety, paranoia, and undue stress. To the best of our ability, my front-line team tries to keep routine in the home, offer alternative activities, and practice a lot of supportive counselling and patience. Most importantly, we try to keep individuals safe and healthy, not only from the virus, but from mental health crisis. This means that we as essential workers need to keep ourselves safe and healthy. We must be so careful not to come in contact with the virus and bring it to work. Conversely, we are trying not to bring the virus home to our loved ones should there be an outbreak at work. Just as important, we must look after ourselves mentally, so that we can continue to work effectively for those who need us. This is easier said than done, given the circumstances. We have limits, and a lot of us will be prone to burn out the longer this situation continues to negatively impact us both personally and professionally. We are human, not super heroes. I think I speak for most mental health professionals during this time. I have helpers and healers in my own life who I know are struggling personally in this pandemic. One even said she is cutting back on work to look after herself and her family. I know front-line workers who are thinking about a leave of absence or extended holiday at some point, knowing that they cannot sustain working under stressful conditions long-term without hitting “reset”. Some have even opted for early retirement to protect their health and their loves ones. Meanwhile, many of us are currently working overtime, and this field was facing a staffing crisis even prior to the pandemic. We are only in the beginning stages of this pandemic, and essential care workers are already walking away. Front-line staff walked out of a group home in Markham, Ontario last week, leaving residents unattended. While I cannot imagine doing this, there could be a number of reasons why this happened, and I would guess that these people did not feel safe or protected. If mental health workers all burnout, what replaces us? When mental health crisis reaches its peak in response to the pandemic and overloads mental health services, what happens then? As Dr. Andrew L. Smith and Dr. 
Neil de Laplante note, the mental health system is already “chronically under-resourced”, and vulnerable persons face the greatest consequences if the system breaks-down in response the pandemic crisis. Funding is necessary to keep mental health resources in place or develop new services as the effects of the pandemic evolve. Additional funding is required to implement health and safety measures to protect people living and working in group-homes, institutions, and long-term care facilities. But money is not the only important resource. In addition to funding, mental health workers need safeguarding and protection. The most important resource in the mental health system is its workers. Without anyone available to provide the service, funding loses its purpose. Eventually, COVID-19 will no longer be a significant health threat. However, we can expect a long-term impact on society. Loved ones will be lost. People will struggle to recover financially. Some might be traumatized. Vulnerable persons could face significant abuse in isolated situations. Addictions could surge. Not everyone will come out of this pandemic being able to continue on their previous life. Mental health workers are not miracle workers, but they are necessary when people are in crisis, which will be an inevitable result of the current situation. So, how can we protect and safeguard the most important resource in the mental health system? Really, how can we better protect all essential workers right now? Protecting Rights In emergency situations, employers are allowed (under government Acts) to override collective agreements to “protect” the safety of its workers or clients. For example, employers might be allowed to temporarily change an employee’s job classification or location to provide support where needed. Unfortunately, this gives employers a lot of free reign, rendering unions who protect employees almost powerless. Employers should still be held responsible for their decisions. Yes, temporary changes may need to be implemented under extraordinary circumstances. But as much as possible, employers need to consider the mental well-being of their employees. This means listening to employee concerns and respecting their right to refuse unsafe work. If we are facing a pandemic that will impact us for months to a year, employers need to ensure that they do not make decisions that cause undue long-term stress to their employees. Otherwise, they risk being left short of those essential persons. Compensating According to Risk Appropriate compensation to me means that employers consider unforeseen risks associated with jobs and what that means for their employees. While I do agree that wage increases should be implemented in some cases (kudos to these companies), there are other ways in which essential workers can be compensated or supported during this pandemic. Increased paid “personal days” for full and part-time employees will allow essential workers to take time off to care for themselves and their families. Extra sick days should also be provided if a worker comes into contact with COVID-19 on the job. Expectations that employees use their vacation time or go without pay are unreasonable when workers are coming into contact with an unforeseen risk associated with the job. Protecting Health Government and employers need to take more responsibility in ensuring that essential workers have what they need to stay healthy right now. 
Sourcing products like disinfectants, gloves, and hand sanitizer, and ensuring that essential workers have these items will reduce at least one stress associated with this pandemic. I support that everyone needs access to these items, but front-line workers especially need them to keep themselves and the people they live and work with safe. Additionally, initiatives could be set forth to ensure that essential workers have the means of obtaining nutritious foods for themselves and their families. Employers can only benefit from keeping their workers strong during these times. Providing Solutions We are in a time of great uncertainty, and I can respect that experts in health and government are dealing with a situation that they have never handled on this scale before. I do believe that there should be honesty and transparency regarding the pandemic and its predicted course. What is concerning is the presentation of scary problems without action or solution. Advising everyone to “get used to this way of life” (i.e., social distancing) for months to a year or longer is not practical or helpful. This is not sustainable. We all know this, and it will become increasingly difficult to support people through this pandemic without long-term solutions. Mental health workers can help people manage and cope with the situation they face, but that only goes so far. Until people start to feel hope and control, until they can re-establish some normalcy in their lives, the looming mental health crisis will grow. Photo: PublicDomainPictures from Pixabay The only way to prevent or minimize this crisis is to come up with alternatives to just “staying home”. — — — Nothing about this pandemic is simple. There are no easy answers. I respect that most advisors and decision-makers, at all levels, are doing their best. We need to work together to see the situation from all angles, to prevent further crisis. The people who make up health care services are the most essential resource to safeguard. In the coming weeks, it is imperative to everyone’s well-being that action plans are implemented to keep workers safe and healthy. Now is the time to help and protect these workers, so that they are available and strong when they are needed.
https://medium.com/the-front-lines/mental-health-workers-the-invisible-first-responders-in-this-pandemic-8a4e92b53370
['Courtney Lenora']
2020-05-10 15:09:05.612000+00:00
['Mental Health', 'Pandemic', 'Coronavirus Blog', 'Human Parts', 'Psychology']
I had a Baby and I guess I’m Ok
I had a Baby and I guess I’m Ok Yesterday I cried. It was one of those nights that you experience as a new mom when your newborn is crying all night and you have tried the feeding, the diaper change, the burping, the rocking of your child while walking in circles around the living room for what seems to be hours, and nothing is working. It’s one of many nights that may be like this as both my child and I settle into this new life together. It would be one of those anecdotes I share with friends that are expecting in the future, just like my family and friends shared with me. That night when all you can do is cry because you are overwhelmed and there seems to be no solution. I’m a shower crier. I don’t like to cry in front of others. I don’t like burdening others with my emotional baggage so I work through things on my own. I cry in the shower and I write and I keep both to myself. Thanks to my husband who took over the “find out what’s wrong with our newborn” duties, I was able to briefly escape and head into the shower where I would have a good cry and move on with my night. As the tears started streaming down my face and mixing with the water from the shower, I started to feel all of these different emotions. It had occurred to me that since I had the baby, I had been operating on pure adrenaline. I was in this constant “fight or flight” mode for seven days, since my water broke and I had walked into the hospital, and I had not come down until this very moment. I started to process everything that I had experienced at the hospital a few days ago. I replayed the events in my head from beginning to end like I was processing some sort of trauma. And it was traumatic. So why do I feel so guilty saying that? For most of my life, women would speak about pregnancy and childbirth like it was the most beautiful experience one can go through. And I agree that it is absolutely the best feeling in the world to feel a life growing inside of you. I built the most incredible bond with my daughter as she was in my belly, way before she was born. It was absolutely magical. But to speak about this experience and say that it was “beautiful” and just leave it at that would be doing a disservice to those women who have not gone through this experience. In my case, as a mother delivering a child during the current pandemic, I had an extra layer of stress to add. When I pictured my delivery experience, I had always pictured my mom there, as well as my partner of course. I pictured family waiting in the waiting room with excitement for news that a beautiful baby had finally been born! Well…due to the coronavirus there was none of that. I had been informed a few weeks ago that the hospital would not be allowing visitors during my stay. Not having my mom there was a tough pill to swallow, but I knew that these safety precautions were in place for our protection, so I accepted it. For a brief moment I was informed that I would have to give birth completely alone, as the hospital had decided to ban partners in the delivery room as well. This was THANKFULLY reversed by the Governor before I gave birth, but during my labor I couldn’t help but think about it every time my husband held my hand when I was getting a contraction, or every time he made me drink water or forced me to eat Jello. What would I have done if he hadn’t been there? After 20 hours of labor and a few minutes of pushing, my little girl was finally placed on my chest.
I can’t explain the emotions I felt to finally meet the little human that had been moving inside my body for months. After what felt like a few minutes (although I’m not really sure how long it really was) of skin to skin, my baby was grabbed by the nurses, put under lights, inspected, poked, flipped. Needless to say, after leaving the quiet safety of my womb and being thrust into the world in this manner, she was absolutely terrified. By this point, my epidural was slowly fading. I started to shiver, and it became progressively worse as time went by. I started to feel all of the pain that the epidural had numbed me from before. The catheter that had been placed in my body hours earlier was removed and I was asked to see if I could use the restroom on my own. I couldn’t. After my failed attempt in the restroom, I walked out and caught a glimpse of my husband holding our baby, who had fallen asleep in his arms. She finally felt safe again. Once again, the thought came into my mind… What would this scene have looked like if my husband had not been here? The nurse turned to my husband and said, “Not to kick you out, but you’re aware that you can’t stay at the hospital after the labor, correct?” He knew. We both knew, and I thought I had emotionally prepared for this, but I guess I hadn’t prepared enough because in this moment, I felt absolutely distraught. I sat in a wheelchair ready to be rolled away to my room for the next 2 days. My baby was rolled in a bassinet next to me. I said my goodbyes to my husband and I thought to myself, “How am I going to do this?” At this point, my pain was at a 10. I could barely move and the idea of taking care of a newborn baby, something I had never done before, seemed absolutely impossible. I knew I had no choice and all that was left for me to do was pull through. I asked for pain medication and put my “big girl panties” on for the next 2 days. I put on a brand new pair of “big girl panties” when I arrived home and I have had them on ever since.
https://medium.com/@kemelyc/i-had-a-baby-and-i-guess-im-ok-daf14c34ae43
[]
2020-05-12 18:09:25.298000+00:00
['Quarantine', 'Motherhood', 'Pandemic', 'Birth', 'New Mom']
Socket to me
Socket to me Illustration and design by Casey Labatt-Simon. Socket Mode enables you to build powerful Slack integrations safely behind your company’s firewall and unlocks access to the platform’s most interactive features — from the Events API to Block Kit to Workflow Builder. Today, millions of daily active users get exponential value from bringing their work into Slack. For developers, this means as Slack grows, you grow with us — sharing in our goal to improve the way teams collaborate through business-critical integrations. What Socket Mode solves Socket Mode enables a quick and secure way to start building apps without additional infrastructure management. Previously, anyone could use the Slack platform to build apps based on the standards and protocols of the open web, using HTTP to send and receive messages and other data. This didn’t require any special tooling or software — fundamentally, a Slack app is a web app. However, for anyone hosting their apps on-premise or behind a firewall, this presented a challenge for integrating with Slack — specifically when routing Slack API data to apps in a restricted environment. Some customers built proxies to safely transfer packets from the internet to their network. Others created WebSocket connections using the RTM API, but the RTM stream primarily contains message data and doesn’t support many of the richer interactive features of the platform like events, Block Kit, or shortcuts. For Bob Bell, a software engineer at Dell Technologies, using Socket Mode means not having to manage additional layers of software. “This let us replace a layer seven proxy, which required coordination to set up,” Bell said. “With Socket Mode, we can just immediately start writing a Slack app.” Develop locally, deploy anywhere Any app, new or existing, can connect over Socket Mode. The logic stays the same, and the payloads remain identical — regardless of delivery method. We often recommend tooling like ngrok to manage local development. Now you can enable Socket Mode, connect from your local machine, build out the logic of your app, then deploy to a web server and switch back to HTTP. Enable Socket Mode through the settings section of your app’s configuration page. Apps can choose the delivery protocol, WebSocket or HTTP. It’s a binary switch that gets flipped for your app — which can happen at any point and flip back if necessary. Your app simply needs to be able to handle either HTTP or WebSockets. With Socket Mode enabled, an app connects to Slack by establishing a secure WebSocket connection. From here, all dispatches from Slack are sent to your app over this WebSocket and nothing is sent via HTTP. Build better with Bolt After enabling Socket Mode, all that’s required for your app is a WebSocket connection. There are no new proprietary protocols or custom SDKs required — industry standards all the way. Socket Mode support is available in JavaScript, Java, and Python through our SDKs and Bolt, a Slack-first framework for building apps. Using Bolt, new and existing apps can enable Socket Mode by simply adding a few new lines of code. If you’re building an app without an SDK, fear not! Here’s a quick implementation guide to help you along the way. Getting started To build with Socket Mode, visit our API documentation for an introduction to some core concepts, the new app token, and related sample code. Need more hands-on support? Join our webinar later this month. Sign up to save your spot today. We look forward to a whole new class of Slack apps hosted everywhere.
Happy building!
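To make the Bolt path above concrete, here is a minimal sketch of a Socket Mode app using the Python Bolt SDK (slack_bolt). It assumes Socket Mode is enabled for the app and that an app-level token with the connections:write scope has been created; the environment variable names and the example "hello" listener are illustrative, not taken from the post.

```python
# Minimal Socket Mode sketch with Bolt for Python (slack_bolt).
# Assumes SLACK_BOT_TOKEN (xoxb-...) and SLACK_APP_TOKEN (xapp-..., created with
# the connections:write scope) are exported; the variable names are illustrative.
import os

from slack_bolt import App
from slack_bolt.adapter.socket_mode import SocketModeHandler

app = App(token=os.environ["SLACK_BOT_TOKEN"])


# Example listener: the app logic stays the same whether payloads arrive
# over HTTP or over the Socket Mode WebSocket.
@app.message("hello")
def say_hello(message, say):
    say(f"Hey there <@{message['user']}>!")


if __name__ == "__main__":
    # Opens the WebSocket connection to Slack instead of listening for HTTP requests.
    SocketModeHandler(app, os.environ["SLACK_APP_TOKEN"]).start()
```

Flipping the same app back to HTTP delivery would mean swapping the SocketModeHandler start for Bolt's built-in HTTP listener (for example, app.start(port=3000)) and re-enabling a request URL, which matches the "binary switch" described above.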
https://medium.com/slack-developer-blog/socket-to-me-3d122f96d955
['Jim Ray']
2021-01-12 19:47:15.069000+00:00
['Apps', 'Slack', 'Security', 'Development', 'Announcements']
65 Websites to Boost Your Productivity
It can be said that time management is one of the keys to success. In this sense, it is important to work efficiently in addition to working hard. That’s why I have listed websites that can increase your productivity in many areas. I use most of the websites on this list regularly and I hope you will find some websites to benefit from now on. Here is the list: 1. A great, easy-to-use screenshot tool for Windows, Mac and Linux. Plugins for browsers are also available. 2. Slidesgo I discovered this recently. I have been using it regularly since then for my slides. Their templates are awesome, easy and FREE! You can find templates for Google Slides and Powerpoint for almost every subject. If you are someone who uses slides often, then look no more! 3. PDFescape This one is really useful. I use it all the time. If you find Acrobat Reader too complicated for editing your PDFs, you can check out this easy-to-use and free tool. Works on Windows and in browsers. 4. Ifixit A car, laptop, tablet or camera… This site shows you how you can repair your stuff. They also offer an illustrated guide. And no, they are not owned by Apple. Not yet. 5. Grammarly Perhaps there is no need to mention this one, but I still wanna add it to this list. It is perfect for writers to check grammar mistakes, spelling errors etc. I highly recommend it if you are not using Grammarly already. 6. Marker Do you enjoy using the highlight feature on Medium? Then this one is for you. With this Chrome extension, you can highlight words and sentences anywhere on the web. Great for long, complicated articles! 7. Picmonkey A simple Photoshop alternative for editing your photos. 8. WolframAlpha Sometimes the regular search engines won’t be enough. If that is the case, you can check out this search engine instead. They use a unique algorithm and artificial intelligence to give the best search results. There is also a mobile app. 9. Letsenhance Thanks to this free website, you can make your low-resolution photos clearer. The results are really impressive! 10. Which Date Works Group work can be annoying for obvious reasons. This website can help you with that. You can create group plans so everyone can figure out what to do and when to do it. 11. Similarsites You found a website that works for you. Good. Now you can find similar websites with this website. For apps, you can check out this one. For movies, TV shows and music, look here. 12. Hundred Zeros With this website, you can read free Kindle books. 13. Ge.tt One of the easiest ways to share files on the web. 14. Free Images This is really useful for finding copyright-free, high-quality stock photos for Medium. 15. Teuxdeux Fascinating to-do list tool! I find popular to-do list tools are more complicated than they should be. A simple, minimalistic interface makes a difference. 16. Ifttt This is really useful if you use social media frequently. Ifttt helps you manage all of your social media from one point for free. It works especially well for business. 17. Lastpass Lastpass has millions of users already, but I thought there must be quite a few people who have never heard of it. With this password manager, you can say goodbye to creating a solid password and trying to remember it every time you sign up for a website. 18. Cloud Convert A great tool for converting any kind of file. Audio, video, document, ebook, archive, image… Name it. Can be a lifesaver. 19. Bonanza A background remover. No, unfortunately not for your past. You cannot get away from your past! You can remove the background from your image though. 20. 
Open Culture This is one of my favorites. You can find free movies, audiobooks, podcasts, language lessons and courses. They describe their website as “the best free cultural and educational media on the web”. I say, well put! Photo by Marvin Meyer on Unsplash 21. Wallhaven A magnificent wallpaper source for your articles, desktop and your phone. I use this regularly. Highly recommended. 22. Pocket You can save articles, videos or any other content from a web page or app. If you are constantly consuming a lot of content online, this is a must-have. 23. Ninite Switching to a new computer can be a hard process. You always have to install a lot of programs. Ninite is here to save you the trouble. From this website, you can install the most popular programs at once. Works only on Windows. 24. Mailinator Opening an email account is an easy process these days. How about getting rid of one? Not so much. This website helps you with that. You can destroy your email address with Mailinator. 25. Easybib This one is perfect for academics. With this tool, you can easily create citations and bibliographies. There is also a plagiarism and grammar check feature for your paper. Cool, huh? 26. File Pursuit This is a search engine for documents, videos, audio files, eBooks, mobile apps and archives. I use it regularly. Can be very beneficial. 27. Eggtimer Do you forget things regularly? You can use this simple timer then. 28. Audiotrimmer An excellent tool for cutting and editing your mp3 files. Great for making ringtones! 29. Howstuffworks If you are like me, always curious about how things work, then this website is definitely for you. You can learn a lot of things just by surfing this website. Can be useful for research also. 30. Visitacity This is for travelers. When you travel somewhere, you often want to use your time and money wisely, right? When you specify the city you will go to and the number of days you will stay, this site provides an itinerary for you. Definitely worth checking out. 31. Myfonts This website lets you find out what the font is in the image you upload. 32. Calligraphr Do you have fancy handwriting? You can create fonts from your own handwriting through this site. 33. Quora This is one of my favorite social networks. You can ask questions or answer questions about pretty much everything. 34. Mathway Math problems can be very tricky. This site helps you solve math equations in seconds. 35. Futureme You can write a letter to your future self. Almost like time travel. Almost. 36. Cleverbot Bored talking to Siri? You can talk with this AI-oriented bot instead to practice English. 37. Yummly With this website, you can access millions of recipes from all over the world in one place. Also, when you type in whatever ingredients you have, you can see which dishes you can make. And that user-friendly interface is awesome. 38. Coursera Online education is getting more important, especially with the current Covid-19 epidemic. The website offers you free courses from respected universities around the world. If you are willing to pay a certain fee, it also gives you certificates. 39. Codecademy You can learn coding for free. 40. 10minutemail You don’t wanna give out your personal email every time you have to sign up for something? Use this one instead. Photo by Nick Morrison on Unsplash 41. Tasteofcinema Having trouble finding a good movie to watch tonight? From this website, you can find numerous film lists in different genres, eras and categories. 42. Worldometers Real-time world statistics. 
Coronavirus updates, world population, health, social media are some of the categories. 43. PhET A website where you can learn complex scientific topics with short and simple simulations. 44. Ctrlq A search engine to find RSS feeds. 45. QR Code Generator A QR code generator. You can use a URL, phone number or business card. 46. Keepmeout Are you addicted to a website and trying to reduce your visiting time? Check this one out then. 47. Songsterr A great tool for guitar players. This website shows you step by step where to put your fingers on the guitar for the music you want to play. 48. Deletionpedia From this website, you can access deleted Wikipedia articles. 49. Nap The National Academies Press offers you their huge academic database. You can search for academic books, articles or journals across various academic disciplines. 50. Foxyutils Foxyutils is not for everyone. However, if you are dealing with PDFs every day, then you should be familiar with protected PDF files. With Foxyutils, you can open and edit protected PDF files. 51. Crunchbase It is a platform where you can find the detailed development histories, initiatives and future plans of almost all enterprises. 52. Musclewiki Exercise is great, especially these days when we spend a lot of time at home. So how about working out efficiently? Through this website, you can choose the muscle you want to work out in your body and learn what you should do for that specific muscle. Easy, practical and efficient. 53. Xe An easy-to-use currency converter. 54. Bigjpg You can increase the size and resolution of your pictures for free. 55. Inhersight Inhersight is a platform for women to learn about the working conditions at pretty much any company in the world. 56. Writewords This website allows you to quickly group the words in any text by number of occurrences. 57. Bulkurlshortener A free and effective URL shortener. Great for Twitter. 58. Chronas This is really cool. By choosing any region on this world map, you can find out which wars took place in that region and in which year. You can also access Wikipedia information about the matter. 59. Numbeo Do you wanna move to another city or country? You should definitely check this out then. Here, you can find the monthly living cost of the city or country you want to move to. And yes, the website has a comparison feature. 60. Howlongtoread This website shows the average time it takes to read your chosen book. Photo by krisna iv on Unsplash 61. Manualslib You can quickly access the user guides for your products. 62. Typingstudy Need to type faster on the keyboard? Through the lessons on this site, you can learn how to type faster and more efficiently. 63. Voicedocs You can easily convert audio recordings into text. 64. Justgetflux If you are a person who spends a lot of time in front of the computer during the day, you should definitely check this out. This tool protects your eyes by adjusting the backlight for different time periods of the day. Works only on Windows. 65. Archive As they wrote in the description of the website, “Internet Archive is a non-profit library of millions of free books, movies, software, music, websites, and more.” Definitely one of my favorite corners on the internet.
https://medium.com/illumination/65-websites-to-boost-your-productivity-196b854f3922
['Mustafa Yarımbaş']
2020-07-23 22:27:38.584000+00:00
['Technology', 'Personal Development', 'Writing', 'Self', 'Productivity']
Corona Hair, Courage, and Meditation
Photo by Cottonbro from Pexels What do Corona hair, courage and meditation have in common? Whelp, like most people across the globe, I’m sheltering in place here where I live. This means forgoing luxuries like hair salon visits. And while I realize this is a first-world problem (of course) it feels less so when you’re venturing into an entirely new arena, which for me is getting on camera and talking about things like my past, life survival tips, and ways to build emotional and mental resilience. For me, it takes tremendous courage to step into the limelight. This woman who spent her childhood singing and dancing on stage has become accustomed to blending into the background. And add Corona hair to the mix, and well, you can see how much I’d rather blend with the wallpaper or the carpet (whichever I can disguise myself against more easily) than get on camera. Why isn’t shapeshifting a thing? I’ve spent almost 40 years practicing the art of resilience. It started with the Hermann Hesse book “Siddhartha”. My mother handed it to me one day while I was crying about taking a test. I think. The details are hazy. But I remember she asked me to read it. I was 7. You see, in second grade I was already at a 6th grade reading level, so reading more advanced books wasn’t completely out of the realm of possibility. But, I know you’re probably thinking this one was a little out there. And you’d be right. It was. It still is. What can I say? My mother was a hippie and well, you fill in the blanks. :) Now I realize that she was trying to give me something she herself couldn’t quite verbalize. Maybe her own intuition told her I was ready. So she handed me books the way a doctor prescribes medicine. And in my eagerness to please her, I read them. After reading Siddhartha, I became fascinated with meditation. My mother showed me the basic lotus position and from there I started my practice. Around this time I also started journaling. I think I’ve shared before that I got my first journal when I was 7 at a store in a mall called Fluff n’ Stuff. I bought it with my allowance and it was one of the proudest moments of my entire young life. It was a journal and it was adorable. From that point on, writing and meditation became everyday things for me. When I got older and moved to Boulder, CO to attend college, I immersed myself in studying Buddhism and started training in the Vajrayana tradition at the Shambhala Mountain Center. Later, I trained with Ed Brown at Green Gulch Farms in the Zen tradition and also studied under Reb Anderson. This is when I took up a mindfulness practice studying Jon Kabat-Zinn’s work at Kaiser. Since then, I’ve taught meditation and creative journaling to humans of all sizes. With so much happening now, I feel compelled to teach again and share some of the wisdom that helped me create the Milo series and also what is helping me navigate these overwhelming and frightening times. OK, so remember when I told you guys that no one really knows what they’re doing? Well, I’m happy to say that’s true here. I have complete beginner’s mind. I’m living moment to moment, embracing the inner critic, laughing at myself and enjoying my pink hat. :) Sending so much love to you all. https://www.youtube.com/watch?v=4gLN7zFWlFo&feature=emb_title
https://medium.com/@jmartinbloom/corona-hair-courage-and-meditation-99f6bac0d33b
['Jenna Martin']
2021-01-05 12:58:11.924000+00:00
['Life', 'Covid 19', 'Resilience', 'Shelter In Place', 'Meditation']
A Brief History of Crypto Gaming
If you drew a Venn Diagram showing the crossover between gaming enthusiasts and cryptocurrency investors, there would be a significant mutual middle section. When cryptocurrencies first began to enter the public consciousness, the gaming community were some of the earliest adopters. Cryptocurrency democratises investing and trading; tearing down the barriers to entry that you would normally run into if you wanted to trade on the stock market. As such, it created the environment for tech-savvy investors to cut out their own space in the investment world. The Crypto Prophecies is one of the latest, and most highly anticipated blockchain games on the market. But how and when did crypto gaming first come about? Here, we take a look at what blockchain gaming is and take a brief look at its history. What is Blockchain Gaming? As the name suggests, blockchain gaming is a term to describe games that are hosted and operated on blockchains such as Ethereum. Blockchain games are distributed via open-source programs and provide gamers with the opportunity to own their in-game assets in the form of NFTs which are bought and traded using the user’s cryptocurrency wallet. The introduction of NFTs as an asset differs from the ‘soft gold’ that you may have encountered on most other mobile, PC or console games. As blockchain games are decentralised, they are also democratised. These games belong to the people who are trading assets within them. Blockchain gamers form a co-operative of sorts; they own the game through their involvement in it rather than being ‘owned’ solely by the developers. Early Blockchain Games CryptoKitties CryptoKitties is one of the earliest incarnations of what is considered a blockchain or crypto game. In 2017, CryptoKitties introduced the concept of unhackable, unique assets that are bought, sold, and traded in-game. The premise of the game was to breed and sell virtual cat NFTs for cryptocurrency. One cat reportedly sold for over $100,000! HashCraft HashCraft, created by gaming institution Ubisoft, is a creative, sandbox game in which players parachute into a randomly generated island and begin crafting their own world. An island then goes public and is shared across the blockchain so that everyone can explore them. You can set challenges for explorers of your island to complete, which can see either you or them win cryptocurrency. Botwars Ultimate Trading Created by the team soon to bring you The Crypto Prophecies, Botwars sees you leading an army of robots to conquer the in-game trading markets. You can win real cash and crypto prizes and equip your robots with a vast range of weapons, shields and ammo. Botwars is currently being extended to offer ingame cryptocurrency for wagering in pvp and the ability to buy and own robots and weapons as NFTs. Axie Infinity Released in 2018, Axie Infinity is one of the most popular blockchain games on the market. The game seems to be peaking in popularity and is currently raking in huge sums of money. In Axie Infinity, players collect, breed, trade and battle creatures called Axies. These creatures are digitalised as collectable NFTs. Start Your Crypto Prophecy With The Best New Crypto Price Prediction Game The Crypto Prophecies is one of the newest and most highly anticipated blockchain crypto price prediction games. Join our community and prepare for battle in the legendary Battle Arena, where you can grow your wealth, win wagers and hone the skills of your Crypto Prophets. 
Sign up today and be one of the first players to start your crypto prophecy.
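To illustrate the ownership model described above, where an in-game asset lives as an NFT in the player's own wallet rather than on a game company's servers, here is a minimal read-only sketch using the web3.py library against a generic ERC-721 contract. The RPC endpoint, contract address, and token ID are placeholders, and no specific game's contract (including The Crypto Prophecies) is implied.

```python
# Read-only sketch: checking which wallet owns an in-game NFT on Ethereum.
# The RPC URL, contract address, and token ID below are placeholders.
from web3 import Web3

# Minimal ERC-721 ABI containing only the two view functions this sketch calls.
ERC721_ABI = [
    {
        "name": "ownerOf",
        "type": "function",
        "stateMutability": "view",
        "inputs": [{"name": "tokenId", "type": "uint256"}],
        "outputs": [{"name": "", "type": "address"}],
    },
    {
        "name": "balanceOf",
        "type": "function",
        "stateMutability": "view",
        "inputs": [{"name": "owner", "type": "address"}],
        "outputs": [{"name": "", "type": "uint256"}],
    },
]

# Placeholder JSON-RPC endpoint and contract address (not a real game contract).
w3 = Web3(Web3.HTTPProvider("https://example-ethereum-rpc.invalid"))
game_assets = w3.eth.contract(
    address="0x0000000000000000000000000000000000000000",
    abi=ERC721_ABI,
)

token_id = 1  # placeholder asset ID
owner = game_assets.functions.ownerOf(token_id).call()
print(f"Token {token_id} is owned by wallet {owner}")
print("That wallet holds",
      game_assets.functions.balanceOf(owner).call(),
      "assets from this collection")
```

Because ownership is answered by the chain itself, any wallet or marketplace can verify who holds an asset without asking the game's developers, which is the decentralised, player-owned property the post emphasises.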
https://blog.thecryptoprophecies.com/a-brief-history-of-crypto-gaming-3144f780de9c
['The Crypto Prophecies']
2021-08-10 18:04:55.621000+00:00
['Gaming', 'Blockchain Game', 'Nft', 'Nft Collectibles', 'Cryptocurrency']
The Future of Customer Service
Customer service standards have shifted dramatically over the past decade, thanks to a combination of more competition in the market and an abundance of increasingly-accessible technology. All businesses understand that the customer experience is a powerful differentiator, and it absolutely will impact who customers buy from, how long they stay loyal, and even what they’re willing to pay. We know that customer service is essential to a business’s long-term success, but what exactly does the future of customer service hold? In this post, we’re going to take a look at how the pandemic has impacted customer expectations and what we expect for the future of customer service. How the Pandemic Has Impacted Customer Service Expectations were already high of customer service before the pandemic struck, with 54% of consumers from all over the world saying that their customer service expectations had risen year-over-year in 2017. The pandemic, however, had a significant impact on customer service. With lockdowns and social distancing requiring many businesses to operate online or remotely, customer service representatives via phone, email, and live chat became the face of brands. There weren’t in-person salespeople to build relationships or solve issues. Some businesses struggled with this, as call volume went up significantly, especially when supply chains and standard operating procedures were disrupted and impacted customers. One study found that “difficult” customer calls increased from 10% to 20% once the pandemic ramped up. This was partially due to unsolvable problems caused by the pandemic, though it didn’t help that customer anxiety and frustration as a whole had increased dramatically, too. Perhaps the most significant change, of course, was that many customer service agents were now working from home for the first time. As of right now, many still do. What We Think the Future of Customer Service Looks Like COVID-19 has clearly presented some challenges to the customer service industry that will likely shape it moving forward. We’ve also had an emergence of new technology, which in turn can alter consumer behavior. All of these factors are going to change the face of customer service in 2021 and beyond. Let’s take a look at what we believe the future of customer service will hold. Increased Accessibility Across Multiple Communication Channels If this year has taught us anything it’s that accessibility through multiple communication channels is a must. Have a Facebook Page for your business? Customers will message you there, and expect a response within 24 hours. They’ll also call you, email you, send messages through your site, and use just about any communication platform available to them. You need to have customer service agents manning all of them, but remember that even as more communication channels branch out, phone will still be central. Most customers prefer to talk to someone on the phone for urgent, important, or secure purposes. You don’t want to risk missing these calls or keeping people waiting for 45+ minutes on hold while their anxiety levels go up, so having strong support for your phone lines will be crucial. Businesses Will Prioritize Meaningful Connections with Customers Customers want to do business with brands that care about them. And while customers can be fickle, easily tempted to try the competition for a slightly lower price or a single less-than-stellar experience, one of the few things that can establish true loyalty is legitimate relationships. 
Small, human-to-human moments between a customer and a customer service agent can be a gamechanger, even if they’ll likely never talk to that agent again. Think about the last time you had a major customer service disaster. Maybe, for example, you had ordered an expensive $200 weighted blanket for your daughter’s birthday, only it arrived damaged two days before the big day. You don’t want an apathetic customer service agent to say “well we’ll refund you, sorry for the inconvenience.” Hearing someone on the other end of the line who genuinely seems to care, and who works to find a solution is what you want– and it’s what will keep you around as a customer. Personalization Will Allow Higher-Quality Service While we train our customer service teams to handle every foreseeable scenario, the reality is that each individual customer is unique, and every concern is unique. Someone who receives that weighted blanket in tatters will only want their money back, while others wouldn’t settle for anything less than a new blanket overnighted to them the next day so it can arrive on-time as a gift. Some may want quick action taken, while others may have questions about the quality of the product overall. Personalization in customer service means that your customers are being treated as individuals. And when your team has the ability to act as if each customer is an individual and not just adhering to strict scripts, everyone benefits. Personalization will continue to be a powerful force in the future of customer service, aided by stronger CRM tools to track customer histories to better find ideal solutions faster. Reliance On Third-Party Tools & Services to Offer Faster, Better Service A few weeks ago, I had to make a call to follow up with Discover about an issue with fraud on one of my cards. I’d already called before and was dreading needing to rehash the entire scenario over again. I didn’t have to. Even though I spoke to a new customer service agent, they pulled up my file and the case, and we were able to jump right back in. They were able to offer a solution that worked specifically for me, based on the situation. As a customer, I felt cared about, and a stressful situation was resolved quickly and easily. And they were able to do this because they had a strong CRM, which allowed them to take adequate notes during the first call that they could source during the second. And even better, the team members were clearly trained to use them. Third-party services like high-quality answering services are also becoming more popular, especially as customer service demands continue to increase. Customers are tired of waiting on hold when they’re used to fast responses through live chat, so having an answering service that can pick up every call in three rings or less can be a game-changer for your business.
https://medium.com/@patlive/the-future-of-customer-service-d71f6ce62afe
[]
2020-12-23 14:49:38.463000+00:00
['2021 Trends', 'Customer Service', 'Future Of Work', 'Customer Experience']
The (re)birth of American multilateralism
You’d be forgiven for missing it, but on Sunday a group of Asian nations representing the region’s leading economies signed the world’s largest trade agreement. The Regional Comprehensive Economic Partnership (RCEP), signed on November 15, includes fifteen nations — China, Japan, South Korea, and Australia, among others — and deepens regional economic integration, which will accrue financial benefits to the member states and to the possible detriment of the United States. Equally, if not more significant, the pact also places these nations more firmly within China’s orbit. Americans, by contrast, don’t like multilateralism — “it has too many syllables and ends in -ism,” as former Secretary of State Madeleine Albright quipped at a Georgetown School of Foreign Service event this week. Taking this to an extreme, the Trump administration, in accordance with its “America First” worldview on trade and foreign policy, withdrew from the Trans-Pacific Partnership trade agreement in 2017. The remaining nations then signed the Comprehensive and Progressive Trans-Pacific Partnership without the United States. Now, the RCEP further pushes the United States out of the picture on major global trade issues. These aren’t the only ways in which the Trump administration’s trade and foreign policies over the past four years have made the United States weaker, and exacerbated a number of broader global challenges. Drastic American turns away from multilateralism include: withdrawals from the Paris Climate Accord, the World Health Organization during the height of a global pandemic, and the Joint Comprehensive Plan of Action on Iran’s nuclear program even though Iran was adhering to the deal. The future depends on multilateralism There are numerous challenges — big and small, regional and global — that only multilateral efforts can solve or manage. For instance, as ISD has explored in reports over the last four years, issues of environmental security are increasingly a driver of migration around the planet; and climate change is heating (and thus opening up) the Arctic twice as fast as the rest of the globe, creating a new region of geopolitical stress. Moreover, our increasingly interconnected world is likely to see more pandemics like Covid-19 in the future. These problems are all global in nature, and cannot be solved unilaterally or even bilaterally. Read ISD’s working group report, The New Arctic (Image: Institute for the Study of Diplomacy) It is imperative that the Biden administration moves quickly to reverse the Trump administration’s go-it-alone, unilateral, and transactional foreign policies. The most significant and all-consuming initial work for the administration will revolve around slowing the spread of Covid-19, and collaborating with other countries on vaccine development and distribution. While this will entail a heavy domestic focus, there is also room for immediate moves by the new administration to reengage multilaterally by rejoining the WHO and participating in the COVAX mechanism. President-elect Biden has said that he will rejoin the WHO immediately upon taking office. Complaints regarding the organization and its occasional kowtowing to China are legitimate, but leaving the organization, as the Trump administration did, was not the correct answer, especially in the middle of a pandemic. If America wants to have influence in the organization and ensure its effectiveness and efficiency, it needs to be on the inside. 
On climate change, Biden has specifically noted that he will bring the United States back in line with the Paris Climate Accords on day one, which is the correct first step toward successful climate action. The rest of the world hasn’t stopped moving forward to combat climate change, but Biden’s plans to make it a first-order issue of his entire administration will provide a major boost to global carbon reduction efforts, and plan to go further than any previous administration. In addition, the Biden administration will likely move quickly on new negotiations for the United States to rejoin the Joint Comprehensive Plan of Action. Any durable nuclear agreement with Iran must be multilateral, as we saw in 2014 and 2015. The Biden team will almost certainly look to work with its partners to rejoin and then expand upon the original agreement. Any new efforts at nuclear summitry with North Korea should also take a multilateral approach. These are just a few examples, but they are by no means the only ones. The administration should also open up regular and serious discussion on Arctic security. As the High North becomes more navigable and exploitable, and as outside players such as China try to gain a larger foothold in the region, more systematic multilateral talks are needed to ensure a peaceful and sustainable future for the Arctic. Likewise, after four years of the Trump administration’s disparagement and ill-treatment of Latin American migrants, along with its lack of sustained diplomacy in the region, the Biden administration should pursue regional reengagement with near neighbors. This policy would seek to remove drivers of migration through more sustained efforts to bolster democracy in the Western Hemisphere, which include increased anti-corruption efforts, strengthening the rule-of-law, and counter-narcotics efforts, as a start. After four years of America First, the field is wide open for U.S. multilateral reengagement on numerous fronts. Thankfully, we will certainly see an early flurry of activity in this regard from the Biden administration.
https://medium.com/the-diplomatic-pouch/transition-note-the-re-birth-of-american-multilateralism-1a8ed2c2c527
['Kelly Mcfarland']
2021-01-14 21:42:58.018000+00:00
['Diplomacy', 'Joe Biden', 'United Nations', 'Multilateralism', 'Foreign Policy']
American Private Militia: They Want to Do More than Kidnapping Michigan’s Governor
On 8 October 2020, the United States Federal Bureau of Investigation dropped a bombshell when it announced that it had detained 13 homegrown terrorists belonging to the Wolverine Watchmen, an extreme right-wing private militia. These men planned to forcibly kidnap Michigan’s Democratic Governor Gretchen Whitmer through an armed seizure of power, and at the same time violently overthrow the state government of Michigan, which is now controlled by the Democratic Party. In addition to Whitmer, Virginia’s Democratic governor, Ralph Northam, who was once notorious due to a “blackface” photo in his yearbook, was on the group’s kidnapping list as well. Six of the men have been indicted on federal terrorism charges; the other seven have been indicted under state law in Michigan. According to the FBI’s affidavit, the plot was on their radar back in August, so they sent undercover agents to monitor every move of the extremists. In addition, according to information provided by the FBI and Andrew Birge, the federal prosecutor in western Michigan, confidential informants within the organization who worked with the police also reported the plot. Michigan Office of the Governor In addition to meeting in Ohio, the group had conducted military training in Michigan to “take over” the state, including how to take tactical cover and dismount and shoot. Brandon Caserta, one of the six federally indicted suspects, posted a video statement explaining the reasoning behind Wolverine’s intent to kidnap, according to the video released in federal court on March 16. In the video, Caserta, 32, calls the Michigan government’s series of restrictions imposed in response to the COVID-19 epidemic in the U.S. “tyranny” and says he’s tired of being robbed and enslaved by the state, issuing tougher threats: “If this whole thing starts happening, I’m telling you man, I’m going to take out as many of those motherfuckers as I can, every single one of them.” As he was led away, Caserta shrugged his shoulders and looked grimly at his aunt and stepbrother, who sat in the courtroom. Assistant U.S. Attorney Nils Kessler said the two videos show the accused militiamen preparing for the violent kidnapping of the governor. Kent County Jail Different politicians have made different statements about the incident. Detroit U.S. Attorney Matthew Schneider argued that Michiganders who differ with each other politically should not resort to violence. But Whitmer, the potential victim of this conspiracy and a mother of two, has not been so careful to maintain political balance. Facing reporters and cameras, Whitmer, 49, wore unconcealed weariness and anger on her face, and pointed the finger at President Trump, who has repeatedly abused and attacked her at briefings and on social media, as having sown the seeds of this conspiracy. This horrific conspiracy caused her to abandon any belief that pleasantries were required and cut to the chase, saying, “I know this job is hard, but honestly, I never imagined anything like this would happen.” Whitmer argued that Trump, who publicly refused to condemn white supremacy and hate groups at the presidential debates, has wildly emboldened these extremists. Such groups, she noted, hear the president’s “rallying cry” and then plot kidnappings themselves. 
Not only that, but after the press conference, Whitmer penned an op-ed in the Washington Post that further pointed out Trump’s and his campaign’s tolerance and embrace of hate: shortly after Whitmer’s speech, President Trump’s campaign advisor Jason Miller appeared on Fox News, claiming she was using it as an opportunity to promote hatred of her Republican political opponents. In this article, Whitmer more bluntly denounced Trump and the Republican political ecology he has built, saying that when leaders encourage domestic terrorists, they legitimize their actions. On top of that, Whitmer stated that Trump, despite knowing the magnitude of the COVID-19 pandemic for a long time, contributed to the deaths of more than 212,000 people in the U.S. from the coronavirus through a series of failed responses and partisan political accusations; in addition, his all-caps Twitter post in April of “LIBERATE MICHIGAN” sparked violent protests in Lansing, Michigan’s capital city, by many white armed militias, including members of Wolverine Watchmen. Shealah Craighead / The White House The concept of the modern American militia movement originated before the founding of the United States in 1776, and militias served to maintain community order and ward off outside attack from the time colonies such as Jamestown were first established. When the guns were fired in Lexington and the War of Independence broke out, it was the signal for the American militia to go on the offensive. Today, the concept of the citizen militia is embodied in the National Guard in every state. Today’s private militia groups are a far cry from the founding era of this concept in a practical sense. They are not controlled or led by federal or local governments, but are simply private armed groups of civilians acting together on their own authority. In addition to defending their right to bear arms, which they believe is granted to them by the Second Amendment to the Constitution, many militia groups believe that federal and/or state governments have usurped powers granted by the Constitution that rightfully belong to the people, and have used this as a justification for their intent to overthrow existing governments (especially those controlled by the Democratic Party). In the 1992 confrontation at Ruby Ridge, Idaho, Randy Weaver, a former U.S. Army Special Forces soldier and member of the white supremacist group Aryan Nations, was surrounded by law enforcement for a weapons violation. In the ensuing gun battle, Weaver’s son and wife were killed by federal agents, and Weaver eventually surrendered. In 1993 in Waco, Texas, a similar tragedy occurred when law enforcement attempted to execute a search warrant on a Branch Davidian camp and its leader, David Koresh, for a weapons violation. It eventually culminated in a gun battle and a fire in which Koresh, knowing he was about to lose, set himself on fire and burned 73 members of the Branch Davidians to death with him, including dozens of children. Although law enforcement is said to have committed nothing criminal in either case, many militia groups believe these events were a harbinger that a civil war would break out and that law enforcement would trample the constitutional rights of American citizens. Seth Herald / Reuters To understand the problem of domestic terrorism in the United States better, typified by Wolverine Watchmen’s kidnapping conspiracy, it is necessary to begin with April 19, 1995. It was a day that weighed heavily on many Americans: that morning, the Alfred P. 
Murrah Federal Building in downtown Oklahoma City was ripped apart by a massive explosion, and a fireball rose from the parking area next to the building. A truck packed with 2,177 kilograms of ammonium nitrate explosive had been detonated by Timothy McVeigh, a U.S. Army veteran who later described himself as a believer in white supremacy. More than one-third of the Murrah Building was destroyed outright; 168 people were killed and hundreds more injured. In a subsequent confession, McVeigh showed no remorse and stated that he had acted in revenge for those who died at Ruby Ridge and Waco. In June 2001, McVeigh was executed by lethal injection in a federal prison in Indiana. This senseless slaughter was the first time that many Americans, especially white Americans and members of law enforcement, realized that radical and violent people at home were practicing a terrorist creed not dissimilar to that of militant Islamic fundamentalist groups. Mark Potok, a writer for the anti-hate Southern Poverty Law Center, recalls that homegrown terrorism did not stop after Oklahoma City at all; extremists plotted soon afterward to blow up an oil refinery in Texas and came close to succeeding. But 9/11 caused nearly all law enforcement and governmental attention to preventing terrorism to shift toward foreign threats. University of California, Berkeley law professor Leti Volpp examines, in her 2003 commentary The Citizen and the Terrorist, how the response to 9/11 reshaped who counts as a terrorist, with consequences for preventing homegrown attacks by radicalized militias and white supremacists. The stereotype of the "Arab-Muslim terrorist" created a social divide that targeted Muslims, Arabs, and anyone who resembled them in the eyes of the country's predominant groups. The institutionalization of a system of discrimination based on racial profiling, designed to intimidate precisely this category of people, heightened xenophobia and allowed white domestic extremists, whose cause had been partially discredited by the McVeigh bombing, to quietly regather their forces. According to the SPLC, these anti-government "patriot" militias have proliferated since late 2008, when Barack Obama was first elected president, and since the rise of Trump their behavior and agendas have moved closer to the tenets of white supremacy. At the 2017 rally in Charlottesville, several prominent militias served as security forces, keeping order for Ku Klux Klan members, neo-Nazis, and "alt-right" supporters, which for many journalists and researchers signaled a clear convergence between the movements (although many militiamen who participated in the march firmly rejected the white supremacist label). Twitter @JimKilbane Hatred of other groups based on race and other factors is a common organizing element of these militia movements. In addition to the Proud Boys, who have recently gained attention for their campaign of street violence, two other powerful militia groups, the Oath Keepers and the Three Percenters, have publicly stated that they are the last line of defense against a New World Order that seeks to enslave ordinary Americans. According to their "New World Order" theories, the federal government is controlled by a secretive, globalist elite cabal that seeks to confiscate guns, impose martial law, and set up concentration camps to kill dissidents. The theory has absorbed the unfounded antisemitic conspiracy libels leveled against Jews in Europe over the centuries and serves as the tenet of their apocalyptic fables. 
In recent years, it has been further exploited by QAnon, the fervently pro-Trump extremist conspiracy theory, and has become a politically influential rumor. Trump becoming head of the federal government was a double-edged sword for these militias. On the one hand, they suddenly lacked the premise that sustained the "New World Order" doctrine; on the other, Trump's attacks on the "deep state" and on minorities soon provided them with new targets. Some began to forge ties with the nation's most prominent anti-Muslim organizations, to express increasing hostility toward loosely organized movements like Black Lives Matter and antifa, and to fixate on the threat they believe non-white immigrants pose to their ideal order, even scouring the U.S.-Mexico border for undocumented migrants. Disturbingly, these militias include a large number of former soldiers and law enforcement personnel. Two of the 13 people accused of plotting to kidnap the Michigan governor served in the Marine Corps, including Joseph Morrison, a leader of the group. An investigative media outlet found that nearly 150 current and retired police officers were members of militia Facebook groups; militias actively recruit police officers because they bring guns, experience, and training. Running through the worldview of such militia groups is the idea that military personnel and law enforcement officers represent the final word on the Constitution. Two other men charged in the case, Michael and William Null, were among the armed attendees at a May anti-lockdown rally in Grand Rapids, Michigan, organized by the Barry County Sheriff's Department. An NBC News investigation of the suspects' social media profiles also revealed that the Wolverine Watchmen are among the groups that believe in and promote the "Boogaloo" conspiracy theory of arming for a coming civil war. In his anti-government videos, Brandon Caserta, who calls the government a "tyrant," often wears a Hawaiian shirt, one of the main symbols of the Boogaloo identity. Shutterstock These militias seem to cling to an originalist understanding of the U.S. Constitution and the Bill of Rights as they took shape 229 years ago. After the COVID-19 pandemic struck the United States and state governments imposed lockdowns, many of them interpreted the restrictions as government overreach beyond the boundaries of the Constitution. Some militia groups chose to refuse masks and protest with arms; others, like the Wolverine Watchmen, chose to try to overthrow the government by force and ignite an early civil war. Beyond these men, Kyle Rittenhouse, who shot and killed two protesters in Kenosha, Wisconsin, under the guise of "protecting local businesses," fits a similar profile. Seth Herald / Reuters The tolerance of law enforcement, the agitation of gun rights groups, the shadow of racism, and conservative readings of the Constitution have all been integral to the militia movement's growth to its current extent. According to political scientist Jack Rakove's understanding of the Constitution, the Founding Fathers of the United States, in the Federalist Papers and elsewhere, were constantly engaged in constitutional debate, and their break with Britain itself grew out of differing understandings of how civil rights should be enjoyed. In other words, they could hardly have expected their words to be read in a single, frozen, originalist way. 
Yet this philosophy has been redefined since the rise of Reaganite right-wing thought in the 1980s by figures such as Supreme Court Justice Antonin Scalia. A staunch conservative, Scalia held that the U.S. Constitution and the Bill of Rights must be read and enforced the way the Founding Fathers understood them when they were first written down. Following this line of reasoning, Scalia issued a ruling in 2008 that proved crucial to the militia movement and to the gun rights debate as a whole. Writing for the majority in the landmark District of Columbia v. Heller case, he held that "The Second Amendment protects an individual right to possess a firearm unconnected with service in a militia, and to use that arm for traditionally lawful purposes, such as self-defense within the home." However, Scalia's lengthy opinion left lower courts no instruction manual for how to protect, or how to limit, this newly recognized right, an ambiguity that militia groups read as permission to arm and train legally even when they comply with no government regulation. Like Scalia, Amy Coney Barrett, the religious conservative who may soon be confirmed to the Supreme Court seat vacated by Ruth Bader Ginsburg but who is, in jurisprudential terms, Scalia's successor, follows the "text, history, tradition" philosophy. She would very likely argue that judges must ground gun regulation in the constitutional text and in historical precedent, and that any restriction lacking such support is unconstitutional. Shutterstock But how can guns, or forms of government, be the same today as they were in 1791? John E. Finn, retired professor of political science at Wesleyan University, argues in The Conversation that these militia groups, and the conservative jurists who lend them legitimacy, are deliberately misinterpreting what the Constitution says about militias. Article I of the U.S. Constitution, which deals with militias, provides that the federal government "may call out the militia in case of civil war; its authority to suppress rebellion is found in the power to suppress insurrection and to carry on war." But it makes no provision for private militias, and no federal or state law has ever authorized a private armed group to call itself a militia or to perform law enforcement or military functions. Indeed, under their own state laws, private militias of this kind are explicitly illegal in all 50 states. With President Trump and his right-wing media supporters openly refusing to condemn or even acknowledge white supremacy and gun violence, incidents like the near-kidnapping of Michigan's governor are likely to keep increasing. And as long as law enforcement lacks the zeal to enforce the state statutes that explicitly outlaw private militia groups, that willful negligence will further amplify the misinterpretation and abuse of the U.S. Constitution by these private militias.
https://medium.com/discourse/american-private-militia-they-want-to-do-more-than-kidnapping-michigans-governor-e632c6b24c0b
['Allen Huang']
2020-10-20 13:50:25.776000+00:00
['Politics', 'Military', 'America', 'Guns', 'Covid 19']
Under-the-hood of GraphQL
1: Overview First we need to ask: what is GraphQL? There are a couple of answers. It is: A type system — type definitions define how data should look. They are written to show what is included and to highlight the relationships (how things relate). This schema definition language can be transformed into an AST. A formal language for querying data — what to fetch from where. Rules for validating or executing a query against the Schema. Point (3) above references the official Specification, which defines the rules for types, validation and executing the schema. It can be found on the official GraphQL specification website. There is also a documentation-friendly website, graphql.org/learn All languages follow the spec, and the JS graphql library frequently references parts of it. We will be looking at that library as part of this section. Building the schema The schema is an important part of a GraphQL application; as mentioned above, it defines all types and their relationships. There are 2 steps to this: 1. Parse the "schema notation" (usually found in a schema.graphql file) into an AST. The parser will throw errors if the notation is not valid GraphQL. The snippet below, from types/schema.js, shows how the library checks that an object really is a schema instance: export function isSchema(schema) { return instanceOf(schema, GraphQLSchema) } 2. Transform the AST into objects and instances. We need a schema which is an instance of GraphQLSchema (see the snippet above). We then need objects inside the schema which match types, for example a scalar or an object. Example of a scalar (below snippet): const OddType = new GraphQLScalarType({ name: "Odd", serialize(value) { if (value % 2 === 1) { return value } }, }) The native scalar types defined for GraphQL are ID, Int, Float, String and Boolean. You can define your own inside your type system. Summary Essentially, for building the schema we turn this GraphQL schema notation: type Book { id: ID! title: String authors: [Author] } Into this JavaScript: const Book = new GraphQLObjectType({ name: 'Book', fields: () => ({ id: { type: new GraphQLNonNull(GraphQLID) }, title: { type: GraphQLString }, authors: { type: new GraphQLList(Author) }, }) }) Almost all of the GraphQL types that you define will be object types. Object types have a name, but most importantly describe their fields. A GraphQLSchema looks like the below in raw POJO form: GraphQLSchema { astNode: { kind: 'SchemaDefinition', ... }, extensionASTNodes: [], _queryType: Query, ... _typeMap: { Query: Query, ID: ID, User: User, String: String, ... }, ... } It holds AST information too, but the root Query and types are found under the _typeMap property. Adding resolvers Resolvers cannot be included in the GraphQL schema language, so they must be added separately. They are added to the _typeMap property, under _typeMap.<Type>._fields.<field> So after adding the resolvers, a schema object might look like below: e.g. _typeMap: { Query: { _fields: { users: { resolve: [function] } } }, User: { _fields: { address: { resolve: [function] } } } } From the HTTP request side, the community has largely standardized on the HTTP POST method. But what happens when the server receives a query? Query lifecycle The GraphQL spec outlines what is known as the "request lifecycle". This details what happens when a request reaches the server to produce the result. There are 3 steps that occur once the lifecycle is triggered. 1. Parse Query Here the server turns the query into AST. 
This includes: Lexical Analysis -> GraphQL's lexer identifies the pieces (words/tokens) of the GraphQL query and assigns meaning to each. Syntactic Analysis -> GraphQL's parser then checks whether the pieces conform to the language syntax (grammar rules). If both of these pass, the server can move on. In graphql-js this is all found under the parser.js file and function. This query: query homepage { posts { title author } } Becomes the AST { "kind": "Document", "definitions": [ { "kind": "OperationDefinition", "operation": "query", "name": { "kind": "Name", "value": "homepage" }, "selectionSet": { "kind": "SelectionSet", "selections": [ { "kind": "Field", "name": { "kind": "Name", "value": "posts" }, "selectionSet": { "kind": "SelectionSet", "selections": [ { "kind": "Field", "name": { "kind": "Name", "value": "title" } }, { "kind": "Field", "name": { "kind": "Name", "value": "author" } } ] } } ] } } ] } You can see this for yourself on astexplorer under GraphQL. 2. Validate Query This step ensures the request is executable against the provided Schema. It is found under the validate.js file in the graphql-js library. While it is usually run just before execute, it can be useful to run in isolation, for example by a client before sending the query to the server. The benefit is that the validator could flag an invalid query before it is sent to the server, saving an HTTP request. It works by checking each field in the query AST document against its corresponding type definition in the schema object. It checks argument type compatibility and performs coercion checks. 3. Execute Query This step is by far the most intensive, and the step I often found the most confusing in its mechanism. We will be digging deeper into this step in part 2, so let's look at the process involved from a high level: Identify the operations, i.e. is it a query or a mutation? (several operations can be sent at once). Then resolve each operation. For step (2), each query/mutation is run in isolation. Resolving each operation For step 2, GraphQL iterates over each field in the selection-set: if it is a scalar type, it resolves the field ( executeField ); otherwise it recurses into the selection-sets until everything resolves to a scalar. The way this works is that the engine calls all fields on the root level at once and waits for all of them to return (this includes any promises). Then, after reading the return type, it cascades down the tree, calling all sub-field resolvers with data from the parent resolver. It then repeats this cascade on those fields' return types. So, simply speaking, it calls the top-level Query initially and then the root resolver for each type. The mechanism called "scalar coercion" comes into play here. Any values returned by a resolver are converted (based on the return type) in order to uphold the API contract. For example, a resolver returning the string "123" for a field with a number type attached to it will return Number("123") (i.e. a number). This is found under the execute.js file and function inside graphql-js. Lastly, the result is returned. Introspection system It's worth mentioning the introspection system. This is a mechanism used by the GraphQL API schema to allow clients to learn what types and operations are supported and in what format. Clients can query the __schema field in the GraphQL API, which is always available on the root Query type. Any interactive GraphQL UIs rely on sending an IntrospectionQuery request to the server, and build documentation and auto-completion with it. 
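To tie the schema-building and request-lifecycle steps together, here is a minimal sketch using graphql-js; the Post type, the posts field and the sample data are invented for illustration, and attaching the resolver by mutating the field directly is just one simple way to mirror the _typeMap structure described above. It builds a schema from schema notation, adds a resolver, and then calls parse, validate and execute explicitly instead of relying on a server framework to do it.

const { buildSchema, parse, validate, execute } = require("graphql")

// Hypothetical schema notation for the example
const schema = buildSchema(`
  type Post {
    title: String
    author: String
  }
  type Query {
    posts: [Post]
  }
`)

// Resolvers are not part of the schema language, so attach one to the field,
// which lives under the schema's type map (Query -> fields -> posts)
schema.getQueryType().getFields().posts.resolve = () => [
  { title: "Under the hood", author: "Craig" },
]

// 1. Parse: turn the raw query string into an AST document
const document = parse("query homepage { posts { title author } }")

// 2. Validate: check the AST against the schema; returns an array of errors
const errors = validate(schema, document)
if (errors.length > 0) throw new Error(errors.map(String).join("\n"))

// 3. Execute: walk the selection sets, calling resolvers down the tree
Promise.resolve(execute({ schema, document })).then((result) => {
  console.log(JSON.stringify(result))
  // {"data":{"posts":[{"title":"Under the hood","author":"Craig"}]}}
})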
Libraries As part of my research I covered many different libraries, so I thought it was worth giving a quick overview of the main ones in the JS ecosystem. graphql-js The reference implementation of the GraphQL spec, but also full of useful tools for building GraphQL servers, clients and tooling. It's a GitHub organisation with many mono-repositories. It performs the entire Query Lifecycle (including parsing schema notation). Schema: requires the library-specific GraphQLSchema instance. In order to be an executable schema it requires resolver functions. Example functions: buildClientSchema - takes the output of an introspection query and builds the schema out of it. buildASTSchema - once you have parsed the schema notation into an AST, this transforms it into a GraphQLSchema type. graphql-tools It's an abstraction on top of graphql-js. Houses lots of functionality, including generating a fully spec-supported schema and stitching multiple schemas together. Example functions: makeExecutableSchema - takes arguments: typeDefs - a "GraphQL schema language string" or array, and resolvers - an object or array of objects; returns a graphql-js GraphQLSchema instance. loadSchemaSync - point this to the source to load your schema from and it returns a GraphQLSchema. addResolversToSchema - takes a GraphQLSchema and resolvers, then returns an updated GraphQLSchema. apollo-server It's also an abstraction on graphql-js. Uses the graphql-tools library for building GraphQL servers. Introspection is disabled in production by default, but in non-production environments it uses introspection to expose a /graphql playground. Apollo Studio Not really a library, but I thought it worth a mention. It plugs into apollo-server and provides stats and information about your GraphQL server in real time.
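To illustrate the graphql-tools functions listed above, here is a hedged sketch of makeExecutableSchema in action; the hello type definition and resolver are made up for the example, and the scoped package name (@graphql-tools/schema) depends on which major version of graphql-tools you install.

const { makeExecutableSchema } = require("@graphql-tools/schema")
const { graphql } = require("graphql")

// typeDefs: a GraphQL schema language string (or an array of them)
const typeDefs = `
  type Query {
    hello: String
  }
`

// resolvers: a plain object (or array of objects) keyed by type and field name
const resolvers = {
  Query: {
    hello: () => "world",
  },
}

// Returns a regular graphql-js GraphQLSchema instance with resolvers attached,
// ready to be handed to apollo-server or executed directly
const schema = makeExecutableSchema({ typeDefs, resolvers })

graphql({ schema, source: "{ hello }" }).then((result) => {
  console.log(result.data) // { hello: 'world' }
})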
https://medium.com/@tabu-craig/under-the-hood-of-graphql-7980a169aa76
['Craig Taub']
2020-11-28 12:13:37.993000+00:00
['GraphQL', 'JavaScript', 'Nodejs', 'Apollo Server', 'Type Systems']
Hiring managers: How to make your UX team truly inclusive in 2021.
Benefits of inclusive hiring for both your team and business. Photo by Annie Spratt on Unsplash Imagine for a moment, that you spend your life constantly adapting everything from the way you brush your teeth to the way you socialize, work, and play. It takes a lot of time, energy, and planning to accomplish tasks that other people can do without a second thought. Having a disability makes you approach problems differently. It makes you think more creatively. It amplifies your other senses and skills to compensate for the ones that are not the same as your colleagues. You’re always thinking with accessibility in mind because you depend on accessibility to survive. This is just the tip of a massive iceberg of skills disabled designers possess. Skills that you can’t learn from a 1-hour webinar — because they’re rooted in life experience. Photo by Marcus Aurelius from Pexels Dear Leadership: You have the power to infuse accessibility into your design team with your next new hire. Inclusive hiring can boost your team when it is part of a greater strategy to make accessibility a priority in your design and development practices. Without a seat at the table, and space to be heard, there is no real impact for disabled UX designers. Listening to people who are disabled is a shift towards equity and a truer reflection of the world around us. Society is comprised of many different races and orientations —varying levels of ability intersect these underrepresented groups. Representation absolutely matters, especially in tech. “You cannot possibly be reaching the needs of your consumers when the makeup of your company is not reflective of the community you serve.” — Kimberly Bryant, Founder and CEO of Black Girls Code Silence can be very loud when it comes to designing for disability. I mentioned in a different post that you can draw a straight line from the lack of diversity and inclusiveness in some teams to the lack of empathy and accessibility in the designs they create. When you aren’t considering the needs of people that are different from you, the result is UX design that is exclusionary. Don’t get me wrong — I don’t believe that every company that has issues with accessibility does so willfully — however, when a lack of accessibility is brought to your attention there comes a point where you either decide to do something about it or continue with exclusion habits. Exclusion habits which Kat Holmes mentions in her book Mismatch — are perhaps the darkest dark pattern in UX. Exclusion habits enable whole teams to avoid accessibility responsibility by saying “Well I didn’t make the rules, and I can’t change them. This is just how things are done here.” “Exclusion habits stem from a belief that we can’t change aspects of society that were originally set into motion by someone other than ourselves.” — Kat Holmes, Mismatch: How Inclusion Shapes Design Teams can put a stop to exclusion habits by taking a real look at their teams and making conscious hiring decisions towards equity and inclusion. It’s pretty hard to ignore the needs of a group when someone from that group is sitting in the room and participating in the ideation process. Exclusion habits and a lack of progress related to inclusiveness harm a company’s credibility and shrinks their market reach considerably. People who are disabled are not alone, they are surrounded by friends and family who care about whether or not companies, in turn, care about their loved ones. Their brand loyalty shows where they spend their hard-earned money. 
There are 61 million adults in the US with disabilities, and according to the American Institutes for Research, that is a nearly 500 billion dollar market that you’re leaving on the table by not making your product accessible. Apathy and dark patterns result in a bait-and-switch experience. A team may appear to create UX design “for everyone”, but apathy and dark patterns around accessibility trick users who expect equity and present them with a mismatched experience compared to non-disabled people. People who are disabled feel that exclusion strongly when products, sites, and services block them from taking part in life. As UX designers (not just visual designers) it’s our job to uncover the user experience and learn from that. Accessibility is a facet of the user experience that we cannot ignore. We can create design that makes a difference by including people who really live these experiences in our teams and our UX design practices. We have the opportunity to stand out as companies that genuinely connect with our employees and customers by showing that we care about their needs through our behavior, not just our words. Edit: I misquoted the business market as $500 million, it is actually nearly a $500 billion market. This has been updated. (11/23/20-CL)
https://medium.com/access-bridge/the-real-unicorns-disabled-designers-bb4cbb7d3e28
['Christina Lall']
2021-01-07 18:22:07.477000+00:00
['Hiring', 'Business', 'UX', 'Diversity In Tech', 'Accessibility']
A Trip to GanpatiPule
Ganpatipule is one of the main tourist attractions of Maharashtra. It is a small town in the Ratnagiri district. Ganpatipule is famous for its Ganpati temple, its beach, and the local garden and museum. But let's talk about the famous Ganpati temple. You can reach Ganpatipule by air, train, or bus; the nearest airport and railway station are in Ratnagiri. The Ganpatipule temple is situated right on the beach and is about 400 years old. It is one of the region's prime attractions, drawing thousands of pilgrims every year who come to seek the blessings of Lord Ganesh. The idol of Lord Ganesha is believed to be a self-created monolith, discovered over 1,600 years ago. Another idol of Bappa, made of copper, depicts the god riding a lion, and a copper idol of Mooshak is placed near the entrance of the temple. Ganesh Chaturthi and Maghi Ganpati are the two grand festivals when millions of devotees visit the temple to seek the Lord's blessings. You can also visit the beach nearby; there are lots of resorts available where you can book a stay for a day or two, and the sunsets on the beach are a solace. Especially after this lockdown and pandemic, it would be a good idea to just hop on a plane or train and take a short vacation in Ganpatipule.
https://medium.com/@theanonymousgal/a-trip-to-ganpatipule-d1bbb5845897
['The Anonymous Gal']
2021-02-20 13:38:51.787000+00:00
['Travel', 'Travel Blo', 'Ganpatipule', 'Incredibleindia', 'Traveling']
Why Personalization Is Your Secret Marketing Weapon
When we talk about personalization in marketing, we’re not just talking about adding someone’s name to an email. Ok, yes, that does help. But it’s more about curating the user’s experience with your brand to make them feel included, considered, and truly a part of your brand’s marketing process. In the past, marketing was sort of a ‘blanket statement’ — ads said ‘Here’s our product, here’s what it does, buy it!’ But nowadays, people require a little more attention. Even though we live in a world that is largely digital and automated, people are still looking for a genuine connection with the companies that they choose to interact with. In fact, a whopping 90% of users view personalized marketing in a positive light when it comes to brands they want to buy from. And though that number might be shocking, it’s not surprising. Personalization allows users to ‘cut to the chase’ and see content that will appeal to their buying habits, demographic, etc., without all of the extra fluff that doesn’t apply to them. Getting straight to the point will make it more likely that a customer will start to build a relationship with you instead of glossing over and moving on to the next. USERS WANT TO KNOW THAT BRANDS ARE LISTENING. If you’re just getting started with your personalized marketing strategy, try starting with email campaigns, social media, or blog posts. The key to personalization in these mediums is speaking with your audience in a way that relates to them. Oftentimes in marketing, companies may send out blanket emails or social media ads advertising a product to an audience that can’t relate. Not only does this tactic hurt your brand on the outside, but it can hurt you on the inside, too — not targeting the correct audience can hurt your marketing budget and decrease your ROI. Emails in particular are the main culprits here (though social media is not free from blame!). 71% of consumers say that personalization plays a big factor in whether or not they open the email — which means not including a personalized subject line can result in a lot of unopened emails. Additionally, generic emails from companies, especially those that are unsolicited, can end up filtered out of Gmail’s primary box to the promotions folder, social folder, or worse — the dreaded spam folder never to be seen again. Personalization can also be as simple as using the right tone of voice and targeting the right audience. Consumers aren’t looking to be bombarded by digital ads and social media posts that don’t have anything to do with them — they are looking for ads that add value to their search. Defining your audience will also help you in the long run, because again, you don’t want to waste a big budget advertising to an audience who doesn’t have any use for your product or service. You’ll also want to speak in the tone of voice that your audience will relate to — and this is extremely important when it comes to social media and blog posts (and any type of marketing you’re doing, really). As much as people want to feel like they are being heard, they also want to feel like a brand is truly connecting with them. Yes, that means exactly what you think it means. Try talking to your audience, not at them. You’ll be surprised how your engagement stats change. 91% OF CONSUMERS PREFER COMPANIES THAT OFFER PERSONALIZED RECOMMENDATIONS. Speaking of stats, have you been checking yours? If you haven’t, consider starting. Data is an important part of marketing personalization. 
There are times where you may feel like you know your audience inside and out, but if they aren’t engaging with your content, try turning to your data to get specifics. Data and analytics are just as integral to personalization as the creative ideas are. They will help you identify which audiences you are connecting with, what types of content they are engaging with, and help you nail down the best times and what channels to use to reach out. Looking at your audience’s habits and behavior on your brand’s website and social media can help you create more of the content that matters to them, resulting in an audience that keeps coming back to your site for more information. Consumers will feel like you are considering them through your marketing, which can also result in better conversions. 91% (that’s nearly all!) of consumers actually prefer companies that provide them with relevant content and personalized recommendations. That’s an incredible amount of opportunity to miss out on. Take advantage of those analytics and seriously consider them when building your strategy. A PERSONALIZED MARKETING STRATEGY GIVES CONSUMERS A SENSE OF IDENTITY. Ok, ok — we know you’re curious about how this affects the relationship with your customer. The fact of the matter is that, yes, people know that a brand is probably in touch with hundreds of people each day. Still, no one wants to feel like a faceless person in a crowd. Using a personalized strategy helps your audience with their sense of identity — meaning that they feel as if you see them as a person instead of just a customer number. Take the time to really get to know your audience, what they need, and what they’re feeling. That includes feedback! You don’t just want to have superficial relationships with your consumers; you want to build brand loyalty. You want people to see your brand as their go-to so that they come back. Brand loyalty can bring in other customers as well; after all, good word spreads fast. But what if you want to build these relationships and you don’t know where to start? Simple — ask your audience what you want to know. It works (trust us)! Social Media Polls, Customer Surveys, and just generally asking for feedback is a great way to tap into your audience’s mind and get the information that you need. Throughout this post we’ve talked about how the majority of people prefer personalized experiences. 83% of consumers say that they would happily share information that would lead to a more personalized marketing experience. It never hurts to ask! There you have it! Personalization is probably one of the most useful tools in your marketing bag, so don’t let it fall by the wayside. Here at Creative Juice, we are avid users of personalized marketing because we believe that building great relationships are the first step to a successful marketing & branding strategy. If you are curious about how personalization can help you develop a better marketing strategy and build better relationships, contact us. We’ll be happy to help!
https://medium.com/@saiydah/why-personalization-is-your-secret-marketing-weapon-9a05a5e2ac8b
['Creative Juice']
2020-12-18 15:58:24.543000+00:00
['Advertising', 'Branding', 'Graphic Design', 'Agency', 'Marketing']
Biohacking my First Steps into Transhumanism.
The Operation. The procedure was simple. My hand was cleaned and massaged, and eventually a needle was inserted into my finger. No anaesthetic was used. This needle created the initial incision. Eventually, a second needle was inserted to enlarge the incision and create a pocket where the magnet would sit. This portion of the procedure was incredibly painful, and the sensations got progressively worse as the magnet was pushed into the large pocket that now existed within my finger. The insertion of the magnet was a particularly intense sensation. A second magnet was used to drag the inserted magnet deep into the pocket, and I was able to feel it ripple over the nerves inside my hand. Once this was done, the blood was cleaned off and a bandage placed on my finger. It was left to heal, with the recommendation to clean it twice a day with salt water. Before bed, I removed the bandage to allow the wound to develop a scab in the presence of my own bacteria. For the rest of the day, no sensation could be felt other than mild pain. Healing Process It took about 2 days before I was able to feel any magnetic field; the first one was my laptop charger, which produced a slight buzzing. During the first 2 weeks I was advised not to use my left hand in close proximity to any magnetic object, as it is important that the nerves reform and grow around the implant. What does it do? If I enter an area with a strong magnetic field, I am able to detect it. If it's a microwave or a motor, it's the smallest buzzing sensation; if I'm close to a hard drive, I feel it click and tick; and around high-power cables I am able to feel the small buzz surrounding them. Conclusion. As far as I can tell, my ability to sense magnetic fields has grown stronger, as I'm now able to detect vibrations from phone speakers, microwaves, and much more. The magnet isn't getting in the way of everyday life, although something to be aware of is that attempting pull-ups would be quite difficult, and the same goes for heavy lifting. Luckily my magnet crept to the left side of my finger. This has left the pad of the finger mostly clear, meaning I'm able to lift and carry things with great ease, although when I do, the sensitivity to vibrations from the magnet is reduced. In my opinion, this is worth it, as I retain more function. The ability to detect magnetic fields is an incredible feeling that can only be described as a tiny vibrating sensation within the finger. Laptop chargers, motors, and other devices produce such fields, and the implant has allowed me to detect where household appliances are wasting energy. For example, being able to feel a magnetic field coming from my flatmate's microwave clock about 7 inches (ca. 18 cm) away leads me to conclude that a large amount of energy is being wasted there.
https://medium.com/@owenharriman7/biohacking-my-first-steps-into-transhumanism-2f193790b4bd
['Owen Harriman']
2020-12-28 19:14:16.911000+00:00
['Biotechnology', 'Magnetic', 'Technology', 'Medicine', 'Transhumanism']
Signs Your Child Is Ready for a Toddler Bed: Transitioning Your Child From Crib To Bed
Signs Your Child Is Ready for a Toddler Bed: Transitioning Your Child From Crib To Bed I want out! That's the message your toddler will send — one way or another — when he's ready to wave goodbye to the crib and say hello to a big-kid bed. Your child might actually verbalize displeasure, or more likely, simply climb out of the crib. So, what needs to be done? First, resist the temptation to move him too early. Most experts recommend doing so around age 3. Unless your child is climbing out of his crib or needs more space than a crib can provide — his body is growing at an astounding rate — it's better to keep him in the crib, which allows him to feel safe. This way, your child can feel comfortable taking giant developmental leaps during the day but still regress to the security of his old crib at night. Moreover, until age 3, toddlers are very impulsive, and your child's difficulty in understanding and being able to follow directions or rules (like staying in bed all night) will make sleeping in a bed a real challenge. If you transition to a bed before age 3, you can plan on waking up to a little visitor next to your bed pretty much every night. When the time comes, however, you need to help your child transition smoothly to sleeping in a bed. For that, you need to follow certain steps. These are: Create a safe environment: Safety-proof your child's room and any adjacent areas he may be able to visit in the middle of the night. Secure windows, tops of stairs, and any stepstools that can be tripped over. Even better, you can install a safety gate at your child's door. You can even install a small night-light in his room to help him orient himself and avoid hurting himself. Pick the mattress: Go to the mattress store — or any other store that sells mattresses — and let your child help you choose the mattress or bed. With safety in mind, all you need is a twin-size mattress and box spring and some safety rails for the side. You should adjust the height of this new bed accordingly, as it will need to sit low on the floor for some time until your child gets used to it. Get some fun new sheets, some special pillowcases and you're set to go. Disassemble the crib (together): Once the new bed comes home, ask your child to help you take down the crib. This way, your child will feel part of the transition process and will also be able to say good-bye to the crib. Set up the bed: Put the bed in a corner of your child's room so that the head and side of the bed are flush against the wall for protection. Add a safety rail to the exposed side of the bed. Your child will feel safe this way, just as he did in his crib. Explain the rules of bedtime: If your child is verbal, before the first night of sleeping in the bed, go over the rules of bedtime with him. Tell him that he is a big boy now who needs to understand that when we go to sleep, we only wake up when the sun is nice and bright. Do your bedtime routine: During the first few nights your child is sleeping in his new bed, take an extra 10 minutes of reading time together to make him feel comfortable in his new environment. The idea here is to make your child feel safe. If your child seems excited about the new bed from the very start, you're one of those lucky people who have made this transition easily. Child Psychologist Reveals Baby Sleep Secret
https://medium.com/@mss-icon/signs-your-child-is-ready-for-a-toddler-bed-transitioning-your-child-from-crib-to-bed-e3786d4490a6
['Mss Icon']
2021-03-14 06:11:39.531000+00:00
['Babysleepproblems', 'Baby', 'Baby Boomers', 'Baby Products', 'Baby Bed']
API Lifecycle and Governance in the Enterprise: Plan Stage (Part 1 of 3)
As the saying goes, “Bad governance is like bad design; it makes life harder”. With that in mind, we opt for a design-first approach. Thus, choosing the right API strategy is directly related to the ability to attract the right people, have the correct process in place, and deploy the right technology to align with the company’s strategy. Figure 1: Balancing act between people, process and technology API (Application Programming Interface) Lifecycle and Governance has the scope of an API Management platform architecture overview or a high-level design. The architecture team built it with input from the product subject matter experts (SMEs) and IT specialists for technical accuracy and product specifics. The purpose of the API Lifecycle and Governance is to: Increase the understanding of the intended API Management platform by reducing the complexity of its interactions to a set of fundamental ones. Ensure that architectural issues, such as topology and integration between the components, are dealt with consistently across the environments, and common problems have a single solution that every environment can (re)use. Communicate the chosen architecture to all participants/stakeholders. Avoid repetitive documentation of generic structures and interactions. Provide the architecture framework within which the engineers can make design/implementation decisions that do not conflict with the architecture. Governance is intended as a template for the IT specialists to follow in building the API Management environments. The focus of this blog post is on the IBM API Connect Governance and Lifecycle and NOT the entire architecture associated with the wider solution. As mentioned above, the API Management platform is wider than any mobile application; however, the immediate focus is to assure the successful launch of the API application. Therefore, the focus of this blog post will be on supporting the Runtime governance and Design time governance. Runtime governance (for APIs that are already deployed): Access control (only subscribed clients) Control who can see published API products Control who can subscribe to API products — and subscription approval Rate limiting, capacity planning, and invoicing Life-cycle management and approvals Make sure API consumers migrate to the latest version in a timely manner Suspend API consumers Off-line APIs Global runtime policies (executed for all APIs of the catalog) Design time governance: API providers should not provide an API already in existence API consumers find existing APIs Endpoint vs. environment Impact analysis Versioning Planning an API Initiative Strategy and Governance Model APIs should be intended primarily for consumption by front-end systems, either directly or indirectly, as those are the ones for which an API layer will have the most benefit. Other back-ends can consume APIs from a technical perspective, but that is not their primary benefit or role. This has several benefits: Currently separated but similar APIs can be merged and supported centrally where it makes sense to do so. API discovery can be performed, where internal (and ultimately external) innovation projects can discover new APIs. Developing APIs before them being requested by specific projects may reduce delivery time for projects needing to use them. “Loops” in API provision, where one API is ultimately used to form another, should be kept to an absolute minimum. 
While there may be some exceptional cases, this can make the management of the APIs significantly more complex as we introduce dependencies on the lifecycle of one API and how it affects another. In this case, it may make sense to start introducing separate API Connect installations, or at least catalogs, to keep these partitioned. Long-term, APIs may be exposed externally to third parties and business partners outside the firewall. It's not something that is under consideration in the medium-term future for some companies, so we'll address this possibility only lightly in this blog post series. Why API Governance? In the general case, API Governance exists to provide control over API design, development, deployment, and management going forward. This is partly to bring technical benefit (and hence reduced costs at all phases of the API lifecycle), but also to provide greater benefit to the business in the use of API Connect. Specifically, we are aiming to: Highlight areas for improvement in the current approach to API development and management. Ensure alignment across companies and their partners on development, testing, production deployment, and maintenance of API Connect. Improve the quality, speed of development, and consistency of APIs going forward. Provide standards and guidelines for companies and their partners (e.g., IBM) to adhere to. Once signed off, all API design, implementation, and management should adhere to the guidelines going forward. Any deviations should be agreed and documented with the company's Technical Platform Owner for IBM API Connect. This blog post is focused on governance, lifecycle, and architecture; other topics, such as the project management processes that make sure APIs are built and rolled out appropriately, will be elaborated on in a separate blog post. Design Context The Design Context elaborated in this section describes the high-level context in which the decisions of the Governance were made. Describing the context means describing the circumstances that form the setting for the Governance design, and in terms of which the Lifecycle can be fully understood and assessed. Thus, context is described in terms of the highest-level goals and objectives that apply to the API Management platform as it is being designed, as well as the constraints and considerations that apply. API Management Context The system context diagram shows the setting that the API Management platform will be deployed into. The diagram shown below is separated into several key areas: API Consumers: These are entities that will consume APIs that are exposed on the platform. Users: Various groups of users will interact with the platform, such as App Developers to subscribe to APIs and API Developers to define the exposure of APIs. Security: System capabilities that are required to secure the platform. Deployment and Operations: Systems that support the monitoring and deployment activities for the platform. Hardware / Network Infrastructure: The core infrastructure that the platform will be built on. 
Support System: This corresponds to the peripheral systems that the platform will interact with, such as NTP and DNS. Core Backend System: These are entities that will provide the underlying Services that are exposed on the platform. Figure 2: System Context Diagram Key Objectives and Infrastructure Principles Companies may want to create a new API Management platform suitable for exposing and managing APIs to internal, partner, and public consumers. The focus for phase I will be to support the release of the mobile application, which is therefore the focus of this blog post. The key objectives of this platform are as follows: Security: Provide the capability to identify and authorize users of APIs. This may include traditional user credentials and token-based solutions (such as OAuth 2). Traffic Analytics: Provide insight into the API usage patterns across channels, devices, and partners. This information will be provided to the users. Management & Throttling of APIs: The ability to control the access granted based on the subscribed plans. API Lifecycle Management: The ability to control, using a lightweight governance model, the API exposure lifecycle. Discovery & Subscription to APIs: Allow API consumers the ability to discover and subscribe to APIs within a catalog. Continuous Availability: The platform must provide high availability in normal operations and be available during maintenance windows. Agility: The platform must support the requirements to create solutions in an agile methodology while providing adequate governance. Companies may have several infrastructure principles that need to be considered during the platform design. If the design of the platform deviates from these principles, then this should be documented within the environment section. High Availability is defined as: no single instance of a software component should be able to fail and cause the platform to become unavailable. For instance, each logical component must have redundancy. Within the Production environment, high availability should be provided within the Data Center. Products should be created with an Active / Active topology to support the continuous availability objective. Dedicated (or pinned) resources should be associated with any virtualized Production environments to assure a consistent performance profile. Not all Non-Production environments need to be highly available; however, the Cert environment should be, to provide an opportunity for High Availability testing. All Production hosted components for the solution should be located within the same Data Center and visible across sites. The Production environment should NOT share hardware with Non-Production environments. 
The Cert Non-Production environment contains sensitive information and therefore has security considerations similar to those of the Production environment. Roles within API Connect The existing API Connect documentation describes a RACI (Responsible, Accountable, Consulted, Informed) matrix for both incident management on the API Connect platform, from the platform down to an application, and for platform management. A responsibility assignment matrix, also known as a RACI matrix, describes the participation by various roles in completing tasks or deliverables for a project or business process. It is especially useful in clarifying roles and responsibilities in cross-functional/departmental projects and processes. A summary of the role names relevant to this document is given here; refer to the full RACI matrix for a more comprehensive list. Roles indicated above may not yet exist in the organization, but we are recommending their creation as and when the company transitions to an API Economy (internally shared and common APIs derived from business function rather than technical need). API Lifecycle This section describes the lifecycle that we recommend an API undertake through API Connect. Currently, the lifecycle that an API can take in API Connect is not fully exploited — versions of APIs are not kept distinct, APIs are not retired/deprecated, etc. IBM recommends that, for clarity and ease of management as the API portfolio grows, these APIs be managed according to a process like this one (this process should be owned by the Technical Architecture Board and amended over time as necessary). Figure 3: API Lifecycle Flow The API lifecycle consists of four main components related to API management: creating, running, managing, and securing APIs. Each one of these components is critical to the successful development, deployment, and ongoing management of APIs. The API lifecycle provides the foundation of an API strategy. Figure 4: Create. Run. Manage. Secure. An integrated solution that includes: Automated, visual, and coding options for creating APIs Automated discovery of system-of-record APIs Node.js and Java support for creating API implementations Integrated enterprise-grade clustering, management, and security for Node.js and Java Lifecycle and governance for APIs, Products, and Plans Advanced API usage analytics Customizable, self-service developer portal for publishing APIs Policy enforcement, security, and control Enterprise Focused Comprehensive API solution: End-to-end integrated experience across the API lifecycle — create, run, manage, secure, socialize, and analyze APIs through a single offering on-premises, in the cloud, or hybrid. Built-in assembly user experience and policies: Use a visual tool to compose API policy flows and built-in policies to secure, control, and optimize API traffic without writing custom code or logging on to the gateway. Intuitive interface: Modern user experience to reduce complexity, improve performance, and allow quicker creation, management, and enforcement of APIs. Developer Focused Create and run APIs: Rapidly create APIs, connect to data sources, and expose them as REST APIs with a model-driven approach. Run Node.js and Java runtimes with unified operations and management. APIs can also be SOAP web services. First-class developer experience: Enable developers to create and test APIs locally on their workstations in minutes and stage them to on-premises or cloud environments. Developer toolkit: Enable automated scripting and DevOps automation through a command-line environment for defining, managing, and deploying APIs. 
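To make the lifecycle and governance idea above a little more concrete, the sketch below models a simplified API product lifecycle as a small state machine. The state names and allowed transitions are assumptions for illustration (they loosely follow the draft, staged, published, deprecated, retired, archived progression commonly seen in API Connect catalogs) and are not a definitive representation of the product's behavior.

// Illustrative lifecycle states and the transitions a governance process might allow.
// State names are assumptions for the example, not an exact API Connect state model.
const allowedTransitions = {
  draft: ["staged"],
  staged: ["published", "retired"],
  published: ["deprecated"],
  deprecated: ["retired"],
  retired: ["archived"],
  archived: [],
}

function transition(product, nextState) {
  const legal = allowedTransitions[product.state] || []
  if (!legal.includes(nextState)) {
    // Governance gate: an illegal move is rejected (or routed for approval)
    throw new Error(`Cannot move ${product.name} from ${product.state} to ${nextState}`)
  }
  return { ...product, state: nextState, updatedAt: new Date().toISOString() }
}

// Example: stage and publish an API product, then deprecate it when a new version ships
let product = { name: "accounts-api:1.0.0", state: "draft" }
product = transition(product, "staged")
product = transition(product, "published")
product = transition(product, "deprecated")
console.log(product.state) // "deprecated"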
Conclusion In part one of three of this blog on "API Lifecycle and Governance in the Enterprise", we discussed the importance of good governance, with "Planning an API Initiative Strategy and Governance Model" and the "API Lifecycle" in mind. In part two, we will discuss a comprehensive API solution with an end-to-end integrated experience across the API lifecycle — create, run, manage, secure, socialize, and analyze APIs through a single offering on-premises, in the cloud, or hybrid. From this point forward, we will set the stage to introduce the API Design Guidelines.
https://medium.com/ibm-garage/api-lifecycle-and-governance-in-the-enterprise-plan-stage-part-1-of-3-d350b65080a2
['Ernese Norelus']
2019-09-17 03:45:54.277000+00:00
['API', 'Api Governance', 'Api Management', 'Design Process', 'Digital Transformation']
Metrics, Metrics, Teams. How to Set-Up the Ultimate Lead Generation Machine — Part 2
A comprehensive guide to help you take actions and optimize your Customer Acquisition Cost (CAC) [This article is part of a group of 4 dedicated to provide an insightful guide to online lead generation: “How to Set-Up the Ultimate Lead Generation Machine”. Part 1: CAC, Sales Volume & Business Optimum; Part 2: Metrics, Metrics, Teams; Part 3: Log, Build, Automate; Part 4: Arbitrage Tips & Tools] In Part 1, we explored the two main drivers of all lead-fueled business: CAC and Sales Volume, and discussed how to set your business targets to find your Gross Margin Optimum. Even if very detailed, the first article was a helicopter view of the lead generation problem, barely touching the surface of concrete, actionable insights for marketers, sales directors and other decision makers. In this second article, I intend to get much deeper into each step of the lead generation funnel, detail who is responsible for what within a regular organization, and introduce the main KPIs you should focus on. If there is one thing I’m 100% sure of in this industry, it is that the three factors of success are Metrics, Metrics and Metrics. I mean it, lead generation is all about key metrics! Metrics, Metrics, Metrics. Lead generation is all about Key Metrics. Here is what I will cover today: The breakdown of the lead generation user experience funnel and the naming and conventions used by professionals to describe each of its steps All lead generation key metrics you should know about (and how to compute them) How to organize a high performing lead generation team and which KPI each team should focus on. Are you up to it? I am, let’s get down to it. 1. The 8 Steps of the User Experience Funnel Let’s start from the start. The lead generation process is a funnel made of different steps along the consumer online buyer journey. From the ad displayed, viewed and eventually clicked by the prospect, to the final consumption of the product or service. The art of lead generation relies on how you can actually help people with a need, a question or a project. Like my grandma looking to remodel her sunroom for instance. Guide her through the different layers of the funnel as fast, safe and effectively as it can possibly be in order to secure a good deal aligned with what she is looking for in the first place. That is why you need to build a comprehensive understanding of your own funnel, using key metrics that are widely used among the Industry. Looking more closely into it, most user experience funnels are made of 8 steps: I think the slide seen above is quite self-explanatory. The online buyer journey starts in a Channel (Google Search, Facebook, Linkedin, Instagram, the email inbox or an online newspaper for instance). In this channel, the user browses content and is frequently exposed to online advertising. This is how Views (Step 1) are generated. If the ad audience targeting, copy writing, and imaging are all well executed, then the user will eventually click on it. Congrats, you just generated a Click (Step 2)! Keep in mind that for many technical reasons, all clicks do not end up becoming Visitors to your website. Online marketers usually see a discrepancy of roughly 5 to 10% between the number of clicks channels claim to have generated (i.e. that you have to pay for) versus the number of visitors your website analytics provider will count for you. 
When landing on your website, users will soon enough be exposed to: Free Content to help them self-qualify their intent and leave the page if not interested Calls To Action (CTA) like Newsletter subscription boxes, Lead Magnets (usually a piece of digital, downloadable content, such as a free PDF, report, eBook, white-paper, video, etc.), Forms and Toll-free Telephone Numbers. When users find what they are looking for on the website, and willingly share their contact information to start engaging with a brand, they become Conversions. However, not all conversions are equal. Some are purely useless. Others are not ready to be sent to the sales team. Finally, some are good and worth working on: what the industry calls Marketing Qualified Leads (MQL). What is the difference between Conversions (aka Captured Leads) and Marketing Qualified Leads (MQL) anyway? Obviously fake conversions: wrong phone number, email address or zip code, silly personal data (name = Mickey, last name = Mouse), irrelevant answers to mandatory open field questions ("blablabla"). These leads are called Scrub Leads. Duplicates: leads you already have in your database. Conversions with mandatory fields missing: theoretically this should never happen if you use automated field validators, and if they are implemented properly. But you know, errors happen every day in the technical world and sometimes you end up logging conversions that miss some crucial information, preventing them from being considered MQL. These are Scrub Leads too. Conversions without Compliance: users browsing with an Adblock sometimes prevent the landing page from logging proper user consent (TCPA / GDPR language). This usually happens when you use a third-party compliance service. From a purely legal standpoint, you don't want to take the risk of calling back these leads. Conversions that need to mature: some conversions need more time to mature through the lead nurturing journey before being put in contact with the brand's sales force. For example, a white-paper download is a Conversion, but the intent is usually very low: at this point the user is probably not ready to buy anything from the brand. You will have to send a couple of (automated) emails to understand A) if this is an interesting lead for you and B) if the user is now OK to start the sales process. If the answer to these two questions is yes, then you can consider it an MQL. Disclaimer: leads needing to mature are very e-commerce centric. Most telesales-based brands skip this step, and try to connect as quickly as they can when they get access to the user's phone number.
They consider that the intent at this point is consistent enough to give it a try. Now, the sales team has just received an MQL to start working on. Obviously, the first thing to do is to reach out and contact the consumer, usually by phone. If someone picks up the phone, then most professional outbound call software will consider the call a success and the MQL a Contacted Lead. If the user is not reachable after a given number of attempts, then the sales process will stop and the lead will stay an MQL. Keep in mind that the Contact Rate depends both on the quality of the intent (i.e. MQL quality) and on the sales force's callback process (mostly the time between the moment a lead is flagged as MQL in your system and the moment the lead receives a call). From Contacted to Sales Qualified Leads (SQL): once your sales representative has the prospect over the phone, he/she will try to quickly assess if there is a real business opportunity lying behind the inquiry. To do so, he/she will have to answer three questions: Does the contact information match the person over the phone? (obvious, but critical) Does the person have a real need / question regarding the type of product / service offered by the brand? Is the person ready to consider buying such a product / service? Yes, Yes, Yes? Then this contacted lead becomes an SQL! SQLs are probably what you had in mind when you started reading this article. What your team is highly focused on. And yes! SQLs are a cornerstone of lead generation optimization. Last but not least, some SQLs will finally be transformed into sales, and become customers. Why then split the "Sales" step into Gross Sale & Net Sale? There is nothing mandatory here, but this split is usually very useful when tied to business operations. Let me give you two examples: education brands differentiate Enroll (deposit) vs Start. Home improvement brands like home security providers or bathroom contractors differentiate Sale (deposit) vs Installed (full payment). So, as you can easily imagine, there is another drop-off between Gross and Net Sales that you want to monitor and optimize. Now that we have divided the buyer experience into mutually exclusive steps (meaning that a prospect cannot stand on more than one step at a time), with clear naming, we can start computing key metrics and measuring the drop-off between steps to understand where to start the optimization work! 2. The Lead Generation Funnel Key Metrics (KPIs) Each step of the funnel can be measured in Volume, Rate and Cost. In this section, I will present the main metric for each step, and how they interact to build the metric you need. There are no successful companies, only successful teams. Keep in mind that all key metrics are interlinked and that, combined, they will lead you to the next stage of the journey, and eventually to your final destination: a sale. This is why I used the multiplication symbol between all of them on the slide seen above. I believe that there are no successful companies, only successful teams. In a typical lead generation organization, each team will take care of running, monitoring, and optimizing a few parts of the funnel: A) The Traffic Team Dolead's Traffic Team working from our EMEA HQ in Paris (Library Room) The Marketing Team, more precisely the Online Traffic Team, is in charge of generating high volumes of high-converting clicks on the landing page, at the lowest average cost per click possible.
We already explained here why generating high click volumes with a low CPC is a very challenging target. Auction-based channels won't let you be successful without strong determination, extensive testing, and a good set of hacks. The traffic team will focus first on Reach, meaning generating Hits, the number of times ads are displayed. Views (Reach): Volume = #Hits (number of impressions) Rate = %Impression Share (the percentage of total possible impressions you captured) Cost = Not to be computed, as most channels are CPC-based. Still, if you buy display, video or email traffic, you may have to pay per impression (Cost Per Thousand, Cost Per Mille, Cost Per View for video advertising) Then, the team will have to answer the following questions: Is the audience qualified enough? Is it a well-performing audience for this campaign? Is the displayed ad performing well? Clear enough? Attractive enough? Visible enough? Clicks: Volume = #Clicks (number of clicks) Rate = %CTR (Click Through Rate) Cost = $CPC (Cost Per Click) Clicks = Reach x CTR Cost (Marketing Budget) = Reach x CTR x CPC If you want to take a closer look at today's main CPC-based channel and its Auction Model, you can watch the video by Google AdWords' Chief Economist Hal Varian, explaining how CPC, Bids and CTR are interlinked. B) The Website Team (Marketing Operations / Tech Team) There are two ways to create a well-performing website. One is to ask your internal or outsourced Tech Team to build and optimize it. The other is to let the Marketing Team leverage an external landing page and A/B testing software, like AB Tasty, Unbounce or Instapage. The second option is much more agile, but less customizable. Dolead's Tech Team working from our EMEA HQ in Paris The goal here is to convert clicks into actionable MQLs you can transfer to the Sales Team: Conversions: Volume = #Conversions (number of raw conversions logged into your system) Rate = %CVR (Conversion Rate) Cost = $CPA (Cost Per Acquisition, Cost Per Conversion) Conversions = Reach x CTR x CVR CPA = CPC / CVR Now, you need to convert conversions into Marketing Qualified Leads. Remember, some conversions will turn out to be scrub leads. Here is where you usually need the Tech Team to set up rules that automatically exclude fake/bad leads. Marketing Qualified Leads (MQL): Volume = #MQL (number of "good" leads captured) Rate = %MQL (MQL Rate) Cost = $PPL (Price Per Lead = Cost Per Lead = Cost Per MQL) MQL = Reach x CTR x CVR x MQL Rate PPL = CPA / MQL Rate The PPL is usually the price you pay to an external lead vendor when you decide to outsource part of your lead generation efforts. C) The Sales Team Last in line, first to reach out to new potential customers. The Sales Team carries a heavy burden on its shoulders. Now that the marketing budget has been deployed and the leads have been qualified by the Marketing and Tech Teams, they have to convert all these efforts into sales, into revenue and into return on investment. Dolead's Sales Team working from our EMEA HQ in Paris (Canopy Meeting Room) The Sales Team follows a three-step process: Contact, Qualify, Sell. Contacted Volume = #Contacted (number of contacted MQLs) Rate = %Contacted (Contact Rate) Cost = You don't really need this metric.
But you can compute it if need be (Price/Cost Per Contacted) Sales Qualified Leads (SQL) Volume = #SQL (number of Sales Qualified Leads) Rate = %SQL (SQL Rate, Approval Rate) Cost = $PPSQL (Price Per SQL, Per Approval) Sale (Gross & Net) Volume = #Sales (number of Sales) Rate = %SR (Sales Rate) Cost = $CAC (Customer Acquisition Cost) 3. When the Dust Settles: What Matters Most You need 3 teams to run and optimize your lead generation funnels: Traffic, Marketing Operations and Sales. These teams have very different skill sets and know-how. To simplify, if you had to give clear guidelines to each of your teams, you could focus on 3 main metrics: Cost Per Click (CPC), landing page Conversion Rate (CVR) and Sales Rate (SR). Teams are everything within an organization. Because all KPIs are interlinked, you need to make sure your teams are also interlinked, and that they learn to work together as one, even if they focus on different targets. Organizing your teams that way will help you better serve your prospects, optimize your lead generation efforts, and monitor the drop-offs between all key metrics. At the end of the journey, this will also allow you to optimize your CAC: as you can see on the slide above, there is another way to compute CAC leveraging each team's KPI. CAC = CPC / (CVR x SR) Conclusion After reading all this, I do hope that you have a comprehensive understanding of how lead generation works, that you know all the metrics you need to monitor to start optimizing your lead flow, and that you have a better vision of your ideal team organization. Now, I hear you thinking "OK, this looks very powerful. But moving forward, can I build my own ultimate lead generation machine?" The answer is: Yes, you can! And without a single line of code. We will cover this in my next article. [This article is part of a group of 4 dedicated to providing an insightful guide to online lead generation: "How to Set-Up the Ultimate Lead Generation Machine". Part 1: CAC, Sales Volume & Business Optimum; Part 2: Metrics, Metrics, Teams; Part 3: Log, Build, Automate; Part 4: Arbitrage Tips & Tools]
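To make the KPI relationships above concrete, here is a minimal Python sketch (not from the original article; every rate and cost below is a made-up placeholder) that walks the funnel from impressions to sales and derives CPA, PPL and CAC. It also checks the identity CAC = CPC / (CVR x SR), reading SR as the overall conversion-to-sale rate.

```python
# Minimal sketch of the lead generation funnel KPIs described in this article.
# All input figures are hypothetical placeholders, not real campaign data.

def funnel_kpis(hits, ctr, cvr, mql_rate, contact_rate, sql_rate, close_rate, cpc):
    """Walk the funnel from impressions to sales and derive the main cost KPIs."""
    clicks = hits * ctr                 # Clicks = Reach x CTR
    conversions = clicks * cvr          # raw conversions logged on the landing page
    mql = conversions * mql_rate        # Marketing Qualified Leads
    contacted = mql * contact_rate      # leads the sales team actually reached
    sql = contacted * sql_rate          # Sales Qualified Leads
    sales = sql * close_rate            # sales (gross, before any drop-off to net)

    budget = clicks * cpc               # marketing spend = Clicks x CPC
    cpa = cpc / cvr                     # Cost Per Acquisition (per conversion)
    ppl = cpa / mql_rate                # Price Per Lead (per MQL)
    cac = budget / sales                # Customer Acquisition Cost
    return {"clicks": clicks, "MQL": mql, "SQL": sql, "sales": sales,
            "CPA": cpa, "PPL": ppl, "CAC": cac}

kpis = funnel_kpis(hits=1_000_000, ctr=0.02, cvr=0.10, mql_rate=0.80,
                   contact_rate=0.60, sql_rate=0.50, close_rate=0.25, cpc=1.50)
print(kpis)

# Sanity check of CAC = CPC / (CVR x SR), where SR is taken here as the overall
# conversion-to-sale rate (MQL rate x contact rate x SQL rate x close rate).
sr = 0.80 * 0.60 * 0.50 * 0.25
assert abs(kpis["CAC"] - 1.50 / (0.10 * sr)) < 1e-9
```

With these placeholder figures the sketch yields a CAC of 250 per sale, and lowering any single rate in the chain raises it, which is exactly why each team's KPI matters.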
https://medium.com/online-marketing-and-entrepreneurship/metrics-metrics-metrics-how-to-set-up-the-ultimate-lead-generation-machine-part-2-3b92514034db
['Arthur Saint-Père']
2020-05-07 14:45:12.287000+00:00
['Entrepreneurship', 'Lead Generation', 'Data Science', 'Sales', 'Online Marketing']
Alien Worlds Metaversal Survey
Alien Worlds Metaversal Survey Metaversal Operations, the Federation and fellow community members want to know what you think! Newly added to the Explorers' Station are Metaversal Surveys. Surveys will be posted frequently. Find the Metaversal surveys here. During the first week, Players who complete a survey while it is still live will be chosen at random to win an NFT Game Card or, in some cases, the highly desired Alien Worlds NFT Game Card pack (one each). Recent surveys include: Thank you for your feedback! Portal to Alien Worlds Mining Using your WAX Cloud Wallet, log in at http://play.alienworlds.io/ Read Recent Medium Articles Visit the Explorers' Station Alien Worlds is the #1 Blockchain Game! With more than 3,333,681 player accounts and more than 273,038 active daily, Alien Worlds has quickly climbed the DappRadar ranking charts to become the #1 Blockchain Game. The Federation would like to thank all Explorers for making this possible. About the Alien Worlds Social Metaverse What would you do if you could create anything in the world? Join the Alien Worlds simulation of Earth's economy using the Trilium (TLM) game token. Seek your fortune and thrive in the Trilium and NFT Social Metaverse. Get started by going through the Wormhole at alienworlds.io and start by Mining at https://play.alienworlds.io/ Welcome to Alien Worlds! #create #thrive #Metaverse #NFTs #TLM
https://medium.com/@alienworlds/alien-worlds-metaversal-survey-81a9af8c8a3d
['Alien Worlds']
2021-09-09 14:25:15.588000+00:00
['Nft', 'Surveys', 'Ethereum', 'Bitcoin', 'Game']
Baby Spoon & Fork Training Set (2pk) Fun Giraffe Design | Soft Silicone BPA Free | Infant & Toddler Utensils for Mealtime and Self Feeding (Iron & Blue)
Whether you're using the ezpz happy mat or Munchkin suction bowls, we know how difficult it is to wean your little one off of mom-assisted feeding time. That's why we created our Ali+Oli Spoon & Fork training set. Your little one will love our oval fork design that makes picking up food easy, and our uniquely contoured spoon that makes scooping up softer foods a breeze. Our infant spoons and forks encourage self-feeding and are a great addition to your silicone toddler mealtime plates. We have made a soft ergonomic grip that encourages your little one to hold the spoon properly. Our adorable giraffe design brings joy to feeding time and kids love it. We've created some wonderful color combinations to go with almost any plates for babies. So give them a try and watch how quickly your little one picks up on self-feeding. 🍓 GREAT FOR YOUR LITTLE ONE to build confidence when learning about self-feeding. 🍋 UNIQUE OVAL FORK DESIGN makes picking up solid foods fun & easy for your baby. 🍊 AN EASY ERGONOMIC GRIP utensil for babies that encourages proper hand placement while feeding 🍉 A PERFECT SPOON ANGLE to help your baby learn hand-to-mouth coordination while eating 🍇 WITH A LIFETIME GUARANTEE no questions asked. We stand behind all of our baby products. Price: Source Link: Baby Spoon & Fork Training Set (2pk) Fun Giraffe Design | Soft Silicone BPA Free | Infant & Toddler Utensils for Mealtime and Self Feeding (Iron & Blue)
https://medium.com/@ivaper/baby-spoon-fork-training-set-2pk-fun-giraffe-design-soft-silicone-bpa-free-infant-toddler-193186da2830
['Best Bacare', 'Baproduct Top World']
2019-10-22 07:11:55.198000+00:00
['Blue', 'Baby', '2pk']
How do you know if you are working on the most valuable item?
We create value in software development by building the right thing, building it well, at the right time. The "right thing" implies identifying the most valuable work items, "building it well" covers the quality of what we produce, and "the right time" means getting it into the hands of our customer at the right time. We need to wrestle this equation like a skipper wrestles the boat in turbulent waters. As the waves batter our delivery boat, the line between knowing what to work on next, balancing quality and knowing when to start becomes increasingly blurred. As delivery leads, we need to deal with boat-breaking conditions, needing to perfectly balance the functions of timing, priority and the long-term sustainability of the system. So, how can we navigate these choppy waters? Let's model this process working backwards, starting with the result — the delivered value. The delivered value To deliver this value, we have a specific build capacity (e.g. long-standing project or product teams). This capacity will fluctuate, but for simplicity, let's assume for now that it is fixed and stable. Delivering the value using a build capacity Different types of work place demands on this capacity: features (product enhancements) — investments driven by a hypothesis of user needs regulatory requirements — needs that are imposed by industry regulatory bodies cost-saving needs — operational cost reductions technical improvements — technical investments in the platforms and products used to satisfy all of the above And these items are identified by multiple sources, some closer to the team, some very distant: product management business outcome leads user feedback stakeholders regulatory institutions the delivery team itself We can shape this demand as a funnel, which will look something like this: And here lies the first of the challenges we face — there will always be more work that needs doing than our available capacity. Demand outstripping the build capacity The need for prioritisation Given that the demand outstrips the capacity, we need to develop a prioritisation mechanism. But how do we do it? The idea of being able to prioritise seems a beautiful activity, a way of creating order in a volatile situation. The sea of options on what to work on is vast, and finding the optimal solution for picking the right item to build is compounded by factors such as: items have different values the solution space spans from problems well understood, to more complicated ones where expertise or analysis helps, to unknown or even unknowable ones (*) some items can have expiration dates (think of seasonal features that customers want in time for Halloween, or Christmas) From the collection of options, we need to find the right item to work on From the collection of possible options, we need to find the right items, find those nuggets of value, at the right time. "The problem with any prioritization decision is [it is] a decision to service one job and delay another." — Don Reinertsen We are continually trading cycle time for other things of value, and we are facing tough decisions — leave a feature out and release earlier, or wait to build that feature and release later. We have two options — say "not now" or expand the time allocated to build Budgeting — slicing the capacity By now, we have established that we need to tackle the challenging problem of prioritising our work, focusing on priority and timing. But still, how do we do this? How do we work out the priority and timing?
Given that not all work is the same (different value, different solution space, various expiration dates, etc.), we cannot apply a one-size-fits-all approach to prioritisation. Many aspects are outside our control, such as the value of an item (we can attempt to calculate it, but in reality, we don't know how much our customers will value a particular feature). Or our current levels (emphasis on current) of understanding of the problem space (of domain and technology) — we only know what we know now; we cannot magic up more knowledge in an instant. What is in our control is the way we allocate our capacity. Let's look at the capacity bit in more detail. Delivering the value using a build capacity In reality, it looks more like the sketch below. If plotted against time, the capacity fluctuates. Holidays, attrition, hiring and the rate of interruptions caused by multiple factors (often poor management) all contribute to capacity fluctuations. Capacity fluctuates in time What we can do with this capacity is to slice it into budgets, and allocate each slice to a different type of work. The number of slices and their allotted percentages depend on the context and, of course, this can in itself be challenging to achieve, but here is a heuristic that we can apply: allocate a percentage to "just do", no-regrets work that obviously needs doing and is universally agreed by everyone as must-do soon allocate a percentage to long-term investments and, allocate a portion for regular development By applying this approach, we can reduce the problem space and subsequently reduce the analysis time. Capacity allocated in classes of problem It is tempting to create sophisticated models to solve the budget allocations, and we need to be wary of introducing significant errors hidden by the apparent sophistication of such models. We can start by establishing the budgets at a macro level, agreeing on the types of slices, the capacity allocated to each portion and the review mechanism for this allocation. Last but not least, it is essential to decide on qualitative measures that we can use in the review process — expected signs of success or possible signs of failure that we can foresee at the get-go. How to prioritise within budgets? Once we have established the percentages for each budget, the ordering of items within the slices might still be a problem. However, given that we have classified the issues at a higher level and split them into categories, we can now apply different prioritisation solutions for each of these categories. The order of items in the "just do" slice should hopefully be self-evident (otherwise they would not be fit for this slice). For the long-term investments, given their nature, the need to prioritise at a more granular level should be reduced. For instance, if we want to improve our ability to release code faster, we will need to break the problem into smaller chunks and keep at it until we achieve our desired outcome. The order in which we tackle these chunks might not be that important. For the regular development slice, the gold standard is calculating the cost of delay, which is the opportunity cost of not doing something. Expressed as a rate, money per unit of time (e.g. £/month), it represents the foregone revenue or foregone cost-saving. If we can calculate the actual cost of delay, it is fantastic news. We should then use this value as an input into prioritisation.
Also, if the duration is available as well (with decent confidence levels), then Cost of Delay/Duration (also known as CD3) provides a weighted shortest job first mechanism. Calculating such a mechanism needs further exploration — I believe that a forecasting approach is better suited (given that both the cost of delay and the duration are often ranges), something to explore in a dedicated blog post on the topic. If we cannot calculate these values in a reasonable amount of time with reasonable confidence levels, then there is still hope. We can split the items into different categories that model the profiles of cost of delay. Cost of delay profiles. Note: the long-term investment profile was intentionally omitted, given that I propose that long-term investments should always have a place in a team's budget. By classifying the items into one of these profiles, the answers to prioritisation should become more self-evident. We can now at least see what these items contribute to and, ultimately, how they can contribute to our overall objectives. Selection, while still complicated, should now be simpler. Visualising the Regular Development slice using the Cost of Delay profiles Conclusions Just as a boat battered by powerful waves fights for the skipper's attention and course correction, our delivery machine fights for our attention to decide what to work on. Knowing what to build, and in what order, is one of the most challenging and important problems to get right in delivery. It is the equivalent of avoiding the mistake of pointing the boat in the wrong direction, or of not building a strong enough boat for the journey ahead. Sequencing the work by taking into account priority and timing is a wholly grey area. To achieve it, we need to decompose the granularity of the problem — budgeting and cost of delay profiles are among the means to achieve this. As always, though, looking one-dimensionally to solve a problem is not enough. Strategy, road-mapping techniques, complexity, and outcomes-based development all influence this solution space. But that is for another write-up. Fair winds and following seas.
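As an illustration of the CD3 idea above, here is a small Python sketch of my own; the backlog items, cost-of-delay figures and durations are invented, and in practice both inputs would be ranges rather than the single-point estimates used here.

```python
# Illustrative only: items and numbers are invented, not taken from the article.
from dataclasses import dataclass

@dataclass
class WorkItem:
    name: str
    cost_of_delay: float   # value forgone per week of delay, e.g. £/week
    duration: float        # estimated build time in weeks

    @property
    def cd3(self) -> float:
        """Cost of Delay Divided by Duration: a weighted shortest job first score."""
        return self.cost_of_delay / self.duration

backlog = [
    WorkItem("Checkout redesign",     cost_of_delay=8_000, duration=4),
    WorkItem("Seasonal promo banner", cost_of_delay=3_000, duration=1),
    WorkItem("Reporting automation",  cost_of_delay=2_000, duration=2),
]

# Highest CD3 first: short items with a high cost of delay float to the top.
for item in sorted(backlog, key=lambda w: w.cd3, reverse=True):
    print(f"{item.name:<22} CD3 = {item.cd3:,.0f} per week")
```

Because both inputs are usually ranges, a forecasting or simulation treatment of CD3, as hinted at above, is often more honest than a single deterministic score.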
https://peterpito.medium.com/how-do-you-know-if-you-are-working-on-the-most-valuable-item-422fdecc27aa
['Peter Pito']
2020-11-30 11:56:57.937000+00:00
['Software Development', 'Prioritization', 'Cost Of Delay', 'Sequencing', 'Agile']
AYS Daily Digest 23/12/20: Fire and evictions at the Bosnian camp of Lipa
AYS Daily Digest 23/12/20: Fire and evictions at the Bosnian camp of Lipa Ocean Viking back at sea / 2 young people die in Italy / Field reports from Calais / Returned to Afghanistan - ruins, insecurity, and hardship / Solidarity grassroots initiatives, and more… Are You Syrious? Follow Dec 24, 2020 · 13 min read Lipa fire — photo credit Klix and SOS Balkanroute FEATURED — Eviction and a fire breaks out at the Lipa refugee camp The camp was due to close as it hosted people in inhumane and squalid conditions. However, the eviction has not been coordinated with the appropriate re-location of the people who lived at Lipa, leaving 1000+ people homeless, without any provisional solution from the international organisations who have been given the mandate to run camps across the country. LIPA CAMP FIRE: A CHRONOLOGY OF THE CATASTROPHE — see video Following months of sub-human conditions at the Lipa camp in the Una-Sana Canton in Bosnia and Herzegovina, near the Croatian border, things took a foreseeably ugly turn. Here is the hour-to-hour account of everything that went wrong last night and today, resulting in over a thousand new homeless migrants in the area. Yesterday around 4 pm, IOM staff tried to move 46 people in three vans from Lipa camp to the old, hard-structure Bira camp in the nearby city of Bihać that was cleared out a few months ago following pressure from local authorities. When the IOM vans arrived at the gate of Bira ex-factory-turned-camp in Bihać, a group of 50 locals and police officers intercepted them, blocking the entrance to Bira, so the people had to be returned to the Lipa. Thirty minutes later, the municipality sent a firefighter brigade that helped block the gate of the camp. Following that incident, last night the IOM started informing their partners and other organisations that the long-awaited closure of Lipa camp is set for today in the morning. A late evening meeting was held with the key stakeholders to organise the evacuation. This morning around 8 am, IOM started taking people out of tents and asking them to form lines inside the camp. Some organisations were not allowed in the camp. It seems DRC staff were present inside, but they evacuated together with IOM after the fire broke out around 11 am. It is interesting to note that in Bihać, where Bira is located, at around 10 am people were already saying that Lipa would soon be burnt to the ground, and there were a lot of conflicting rumours about who might be responsible for it. For example, anti-migrants groups on Facebook alleged some people working at the camp in official capacity might have something to do with the fire, in an attempt to put pressure on Bihać authorities to open a local radio station Bira camp. On the other hand, many locals are convinced that the migrants burnt down the camp themselves, in hopes of being transferred to Bira. The head of IOM in Bosnia published on his Twitter account that a group of former Lipa residents set the camp on fire after most of the people had already been evacuated. When the fire broke out at 11 am, police surrounded the camp and tried to remove the people from the perimeter, but at the same time they didn’t allow them to walk towards the city of Bihać. After an hour, special police also arrived at the site. Following the news of the fire, today at noon, there was a protest of locals in front of Bira camp, opposing the reinstalment of the camp inside the old factory, once again with the presence of the fire brigade that was blocking the entrance. 
At 1 pm, the police and firefighters blocked the streets towards the city, obstructing the passage for 300 people who were trying to walk to Bihać. In the afternoon, demoralized people started putting up small makeshift tents in front of Lipa, after it had been completely burnt to the ground, with no food, water or any basic facilities. The central Bosnian government is now putting the pressure on cantonal and city authorities to open Bira camp, but the mayor of Bihać is refusing to, in response to pressure from the citizens who don’t want the camp in the city anymore. As a result of this ping-pong, over a thousand residents of Lipa camp are left to sleep rough, with almost no support and no shelter to protect them from the winter temperatures. This humanitarian catastrophe might be stopped by reopening Bira camp, but there seems to be no political will to do so. We’ll keep reporting from the ground. Lipa camp was to be closed on Wednesday and moved to another location, but officials said that since its tents and other facilities were almost entirely destroyed in the fire it cannot immediately be moved to another spot. It is good to remind everyone that the fact the Lipa tents were not suitable to be called accommodation or to be considered a permanent solution was obvious and known since the begining. However, it was also obvious that a strong dialogue, lobbying, planning and a lot of work was going to be needed in order to gain trust and support from the political side in order to be able to lodge everyone who needed accommodation. Not much was done on that part from the UN agencies, apart from last minute (last months) public warnings, but with their vast experience, surely they knew that occasional statements in the media would not bring about the solution. The UN also called again on the Bosnian authorities to make available new locations, preferably outside the Una, Sana and Sarajevo Cantons, for people stranded outdoors in temperatures below freezing. The United Nations repeated that capacities and funds for solutions were available. However, a solution will not be easy to find. The authorities of Bosnia’s other entity, Republika Srpska, have said they do not want any camps on RS territory, while political leaders on territory where Bosniaks predominate share the same feeling. For more videos documenting the fire, see here and here
https://medium.com/are-you-syrious/ays-daily-digest-23-12-2020-7219e359af5a
['Are You Syrious']
2020-12-24 14:58:05.980000+00:00
['France', 'Bosnia', 'Digest', 'Greece', 'Frontex']
GoldStone V1.2.0 released: supports BTC and easily manages all chain assets with a single mnemonic phrase
GoldStone V1.2.0 released: supports BTC and easily manages all chain assets with a single mnemonic phrase On August 7, GoldStone V1.2.0 was released. Let's see what surprises the new version brings. Three major improvements: ▪️Added support for BTC; GoldStone now supports BTC, ETC, ETH and the management of all ERC20 token assets; ▪️Multi-chain asset management fully follows the BIP44 guidelines: a single mnemonic phrase can manage the assets of every chain, making GoldStone a true multi-chain wallet. ▪️Added market data for the FCoin exchange; GoldStone can now display market quotations from Binance, FCoin, Huobi, OKEx, Bitfinex and others. Two experience improvements: ▪️You can now create and manage sub-accounts within a wallet account, making it easier to store your assets across different addresses for greater privacy. ▪️The wallet-switching function has moved to "Settings - Wallet Management" and adopts an attractive card format that displays more detailed wallet information; the total asset value of each wallet is visible at a glance, in a cleaner and friendlier interface. Detailed introduction of the key improvements 1. GoldStone becomes a wallet that can hold BTC, ETC, ETH and all ERC20 tokens. The GoldStone Wallet supports all standard ERC20 tokens on the Ethereum mainnet, and we conservatively estimate that GoldStone already supports more than 80,000 tokens. The next version will support BCH and LTC, followed by the EOS public chain. 2. GoldStone's development follows the common blockchain standards BIP32, BIP39, BIP44 and ERC681, which means that any wallet compatible with these specifications can be smoothly imported into, or exported from, GoldStone. 3. Our engineers have overcome the technical difficulties of Bitcoin and Ethereum to enable a single mnemonic phrase to manage multi-chain assets, while balancing security and product experience. 4. We have implemented market data support for FCoin, Huobi, OKEx, Bitfinex, Upbit and others. Users can open "Market Quotation", search for the tokens they want to follow, and view real-time quotations from the exchanges above. About the GoldStone download 1. We recommend that Android users use one of the following download methods: download through the GoldStone official website, or download the Android app from the Google Play store. 2. The iOS version will be online soon, please look forward to it. Contact Us Contact us in any of the following ways if you have questions or feedback: Website:https://www.goldstone.io/ Twitter:https://twitter.com/goldstoneio Facebook:https://www.facebook.com/GoldStoneio Medium:https://medium.com/@goldstoneio
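To illustrate the BIP44 convention the release notes refer to, here is a small Python sketch; it only formats derivation-path strings using coin types from the public SLIP-44 registry, performs no key derivation, and is not GoldStone's actual code.

```python
# BIP44 path convention: m / purpose' / coin_type' / account' / change / address_index
# Coin types below come from the public SLIP-44 registry. This is an illustration only,
# not GoldStone's implementation, and no cryptographic key derivation happens here.

SLIP44_COIN_TYPES = {"BTC": 0, "ETH": 60, "ETC": 61}

def bip44_path(coin: str, account: int = 0, change: int = 0, index: int = 0) -> str:
    """Build the BIP44 derivation path a compliant wallet would use for one address."""
    return f"m/44'/{SLIP44_COIN_TYPES[coin]}'/{account}'/{change}/{index}"

for coin in SLIP44_COIN_TYPES:
    print(coin, bip44_path(coin))
# BTC m/44'/0'/0'/0/0
# ETH m/44'/60'/0'/0/0
# ETC m/44'/61'/0'/0/0
```

Because every chain's keys hang off the same master seed at different coin_type branches, one mnemonic phrase is enough to recover BTC, ETH, ETC and ERC20 balances in any BIP39/BIP44-compatible wallet.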
https://medium.com/goldstoneio/goldstone-v1-2-0-released-support-for-btc-a-mnemonic-chain-wallet-83a67814fc62
[]
2018-09-04 11:26:24.487000+00:00
['Blockchain', 'Bitcoin', 'Erc20', 'Ethereum']
Letting Go of Diagnoses Improved My Mental Health
Letting Go of Diagnoses Improved My Mental Health And it just might work for you, too. 2017 was a really bad year for me. I rang in the New Year in a psychiatric facility, and though my time there was brief, it led to a circus that included a new psychiatrist and a whole bunch of new medication. Along with that circus came a bunch of diagnoses I hadn't ever heard of before, which included: Borderline Personality Disorder (BPD) Bipolar Disorder Seasonal Affective Disorder (SAD) Post-Traumatic Stress Disorder (PTSD) For years, I was treated with Lithium, Seroquel, Zyprexa, Xanax, Ativan, Klonopin, Lamictal, Abilify, Doxapine, Trazodone, Latuda, and others. At any given time, I was on four to six different antipsychotic, antidepressant, mood-stabilizing, and anxiety-reducing medications at once. For a few brief years, I was a shell of myself hiding behind the cocktail I was prescribed. Sadly, I let this go on for so long because I was allowing my diagnoses to define me. Behind every thought, word, and action that comprised my being, there were my mental illnesses. On any given day, I had a panic attack (PTSD), dissociated frequently (BPD, PTSD), had some kind of a tantrum due to my unstable sense of self (BPD), and had an intense mood swing that brought me back up on my feet and feeling like I was on top of the world (Bipolar Disorder). After three and a half painstaking years, I found myself with a new psychiatrist whose prescriptions were minimally invasive and whose thoughts on my prior diagnoses were a bit different from what I was used to. She firmly believed that my BPD would improve over time and that my condition should first be treated with therapy as the main course of action — she assured me that medication should be a boost and not the go-to solution for my conditions. I found a great therapist thanks to her, and with my new lifestyle as a person who was no longer over-medicated, I began to slowly but surely thrive. Even my new therapist barely mentioned the names of my disorders to me when we were combing through the tangled messes that my maladies often left me with. She didn't think of me as a diseased or disabled person — she just thought of me as a person. Being treated so differently than I had been by the team of doctors linked to the hospital was utterly refreshing and revitalizing. Now, after years of feeling like a zombie, I was clear again. I still relied on medication to help prevent me from tipping into psychosis and other problematic states, and especially relied on it to help me through my night terrors and other sleep problems, but now I wasn't over-medicated. Slowly but surely, I began to see myself as more than my diagnoses and to feel a greater sense of command over my body now that I wasn't numb from the consequences of being treated like a sickly person. The search for mental wellness is not always easy. I know this firsthand. Now that I have a better team of mental health professionals around me, I feel a sense of support that goes beyond medication. I feel that I suffered for so many years for a good reason and that the suffering I felt at the beginning of my wellness journey was very important in being able to lead a happy life now. Is my life perfect now that I've stopped overanalyzing my diagnoses? No. In fact, some of those very words — like BPD — saved my life.
Without my Borderline diagnosis, I would have never discovered Dialectical Behavioral Therapy (DBT), perhaps the greatest tool I have for everything from emotional regulation to distress tolerance. Diagnoses are important and do not need to be overanalyzed to the point of becoming a limitation. They tell professionals how to get us the best help possible based on decades of research. If you or someone you know is struggling with a diagnosis, you are not alone. There is a way of acknowledging an illness without allowing it to consume us. I believe that over time, finding wellness becomes easier and we must learn graciously from the trial and error that comes with medication, therapy, and other tools to live our best lives.
https://medium.com/invisible-illness/letting-go-of-diagnoses-improved-my-mental-health-d1a04756c600
['H. M. Johnson']
2020-12-29 18:15:19.624000+00:00
['Mental Health', 'Wellness', 'Mental Illness', 'Medication', 'Psychology']
AI and the Law: Setting the Stage
While there is reasonable hope that superhuman killer robots won’t catch us anytime soon, narrower types of AI-based technologies have started changing our daily lives: AI applications are rolled out at an accelerated pace in schools, homes, and hospitals, with digital leaders such as high tech, telecom, and financial services among the early adopters. AI promises enormous benefits for the social good and can improve human well-being, safety, and productivity, as anecdotal evidence suggests. But it also poses significant risks for workers, developers, firms, and governments alike, and we as a society are only beginning to understand the ethical, legal, and regulatory challenges associated with AI, as well as develop appropriate governance models and responses. The Revolution by Fonytas, licensed under the Creative Commons Attribution-Share Alike 4.0 International license. Having the privilege to contribute to some of the conversations and initiatives in this thematic context, I plan to share a series of observations, reflections, and points of view over the course of the summer with a focus on the governance of AI. In this opening post, I share some initial thoughts regarding the role of law in the age of AI. Guiding themes and questions I hope to explore, here and over time, include the following: What can we expect from the legal system as we deal with both the risks and benefits of AI-based applications? How can (and should) the law approach the multi-faceted AI phenomenon? How can we prioritize among the many emerging legal and regulatory issues, and what tools are available in the toolbox of lawmakers and regulators? How might the law deal with the (potentially distributed) nature of AI applications? More fundamentally, what is the relevance of a law vis-à-vis a powerful technology such as AI? What can we learn from past cycles of technological innovation as we approach these questions? How does law interact with other forms of governance? How important is the role of law in a time where AI starts to embrace the law itself? How can we build a learning legal system and measure progress over time? I hope this Medium series serves as a starting point for a lively debate across disciplines, boundaries, and geographies. To be sure, what I am going to share in these articles is very much in beta and subject to revision and new insight, and I’m looking forward to hearing and learning from all of you. Let’s begin with some initial observations. Lawmakers and regulators need to look at AI not as a homogenous technology, but a set of techniques and methods that will be deployed in specific and increasingly diversified applications. There is currently no generally agreed-upon definition of AI. What is important to understand from a technical perspective is that AI is not a single, homogenous technology, but a rich set of subdisciplines, methods, and tools that bring together areas such as speech recognition, computer vision, machine translation, reasoning, attention and memory, robotics and control, etc. These techniques are used in a broad range of applications, spanning areas as diverse as health diagnostics, educational tutoring, autonomous driving, or sentencing in the criminal justice context, to name just a few areas of great societal importance. 
From a legal and regulatory perspective, the term AI is often used to describe a quality that cuts across some of these applications: the degree of autonomy of such systems that impact human behavior and evolve dynamically in ways that are at times even surprising to their developers. Either way, whether using a more technical or phenomenological definition, the justification and timing of any legal or regulatory intervention as well as the selection of governance instruments will require a careful contextual analysis in order to be technically workable and avoid both overgeneralization as well as unintended consequences. Given the breadth and scope of application, AI-based technologies are expected to trigger a myriad of legal and regulatory issues not only at the intersections of data and algorithms, but also of infrastructures and humans. As a growing number of increasingly impactful AI technologies make their way out of research labs and turn into industry applications, legal and regulatory systems will be confronted with a multitude of issues of different levels of complexity that need to be addressed. Both lawmakers and regulators as well as other actors will be affected by the pressure that AI-based applications place on the legal system (here as a response system), including courts, law enforcement, and lawyers, which highlights the importance of knowledge transfer and education (more on this point below). Given the (relative) speed of development, scale, and potential impact of AI development and deployment, lawmakers and regulators will have to prioritize among the issues to be addressed in order to ensure the quality of legal processes and outcomes — and to avoid unintended consequences of interventions. Trending issues that seem to have a relatively high priority include questions around bias and discrimination of AI-based applications, security vulnerabilities, privacy implications of such highly interconnected systems, conceptions of ownership and intellectual property rights over AI creative works, and issues related to liability of AI systems, with intermediary liability perhaps at the forefront. While an analytical framework to categorize these legal questions is currently missing, one might consider a layered model such as a version of the interop “cake model” developed elsewhere in order to map and cluster these emerging issues. Gesture Recognition by Comixboy, licensed under the Creative Commons Attribution 2.5 Generic license. When considering (or anticipating) possible responses by the law vis-à-vis AI innovation, it might be helpful to differentiate between application-specific and cross-cutting legal and regulatory issues. As noted, AI-based technologies will affect almost all areas of society. From a legal and regulatory perspective, it is important to understand that new applications and systems driven by AI will not evolve and be deployed in a vacuum. In fact, many areas where AI is expected to have the biggest impact are already heavily regulated industries — consider the transportation, health, and finance sectors. Many of the emerging legal issues around specific AI applications will need to be explored in these “sectoral” contexts. In these areas, the legal system is likely to follow traditional response patterns when dealing with technological innovation, with a default on the application of existing norms to the new phenomenon and, where necessary, gradual reform of existing laws. 
Take the recently approved German regulation of self-driving cars as an example, which came in the form of an amendment to the existing Road Traffic Act. In parallel, a set of cross-cutting issues is emerging, which will likely be more challenging to deal with and might require more substantive innovation within the legal system itself. Consider for instance questions about appropriate levels of interoperability in the AI ecosystem at the technical, data, and platform layers as well as among many different players, issues related to diversity and inclusion, and evolving notions of the transparency, accountability, explainability, and fairness of AI systems. Information asymmetries and high degrees of uncertainty pose particular difficulty to the design of appropriate legal and regulatory responses to AI innovations — and require learning systems. AI-based applications — which are typically perceived as “black boxes” — affect a significant number of people, yet there are nonetheless relatively few people who develop and understand AI-based technologies. This information asymmetry also exists between the technical AI experts on the one hand, and actors in the legal and regulatory systems on the other hand, who are both involved in the design of appropriate legal and regulatory regimes, which points to a significant educational and translational challenge. Further, even technical experts may disagree on certain issues the law will need to address — for instance, to what extent a given AI system can or should be explained with respect to individual decisions made by such systems. These conditions of uncertainty in terms of available knowledge about AI technology are amplified by normative uncertainties: people and societies will need time to build consensus among values, ethics, and social norm baselines that can guide future legislation and regulation, the latter two of which also have to manage value trade-offs. Together, lawmakers and regulators have to deal with a tech environment characterized by uncertainty and complexity, paired with business dynamics that seem to reward time-to-market at all cost, highlighting the importance of creating highly adaptive and responsive legal systems that can be adjusted as new insights become available. This is not a trivial institutional challenge for the legal system and will likely require new instruments for learning and feedback-loops, beyond traditional sunset clauses and periodic reviews. Approaches such as regulation 2.0, which relies on dynamic, real-time, and data-driven accountability models, might provide interesting starting points. The responses to a variety of legal and regulatory issues across different areas of distributed applications will likely result in a complex set of sector-specific norms, which are likely to vary across jurisdictions. Different legal and regulatory regimes aimed at governing the same phenomenon are of course not new and are closely linked to the idea of jurisdiction. In fact, the competition among jurisdictions and their respective regimes is often said to have positive effects by serving as a source of learning and potentially a force for a “race to the top.” However, discrepancies among legal regimes can also create barriers when harnessing the full benefits of the new technology. Examples include not only differences in law across nation states or federal and/or state jurisdictions, but also normative differences among different sectors. 
Consider, for example, the different approaches to privacy and data protection in the US vs. Europe and the implications for data transfers, an autonomous vehicle crossing state boundaries, or barriers to sharing data for public health research across sectors due to diverging privacy standards. These differences might affect the application as well as the development of AI tech itself. For instance, it is argued that the relatively lax privacy standards in China have contributed to its role as a leader in facial recognition technology. In the age of AI, the creation of appropriate levels of legal interoperability — the working together of legal norms across different bodies and hierarchy of norms and among jurisdictions — is likely to become a key topic when designing next-generation laws and regulations. Law and regulation may constrain behavior yet also act as enablers and levelers — and are powerful tools as we aim for the development of AI for social good. In debates about the relationship between digital technology and the law, the legal system and regulation are often characterized as an impediment to innovation, as a body of norms that tells people what not to do. Such a characterization of law is inadequate and unhelpful, as some of my previous research argues. In fact, law serves several different functions, among them the role of an enabler and a leveler. The emerging debate about the “regulation of AI” will benefit from a more nuanced understanding of the functions of law and its interplay with innovation. Not only has the law already played an enabling role in the development of a growing AI ecosystem — consider the role of IP (such as patents and trade secrets) and contract law when looking at the business models of the big AI companies, or the importance of immigration law when considering the quest for talent — but law will also set the ground for the market entry of many AI-based applications, including autonomous vehicles, the use of AI-based technology in schools, the health sector, smart cities, and the like. Similarly, law’s performance in the AI context is not only about managing its risk, but is also about principled ways to unleash its full benefits, particularly for the social good — which might require managing adequate levels of openness of the AI ecosystem over time. In order to serve these functions, law needs to overcome its negative reputation in large parts of the tech community, and legal scholars and practitioners play an important educational and translational role in this respect. Innovation by Boegh, Creative Commons Attribution 2.0 Generic license. Law is one important approach to the governance of AI-based technologies. But lawmakers and regulators have to consider the full potential of available instruments in the governance toolbox. Over the past two decades of debate about the regulation of distributed technologies with global impact, rough consensus has emerged in the scholarly community that a governance approach is often the most promising conceptual starting point when looking for appropriate “rules of the game” for a new technology, spanning a diverse set of norms, control mechanisms, and distributed actors that characterize the post-regulatory state. At a fundamental level, a governance approach to AI-based technologies embraces and activates a variety of modes of regulation, including technology, social norms, markets and law, and combines these instruments with a blended governance framework. 
(The idea of combining different forms of regulation beyond law is not new and, as applied to the information environment, is deeply anchored in the Chicago-school and was popularized by Lawrence Lessig.) From this ‘blended governance’ perspective, the main challenge is to identify and activate the most efficient, effective, and legitimate modalities for any given issue, and to successfully orchestrate the interplay among them. A series of advanced regulatory models that have been developed over the past decades (such as the active matrix theory, polycentric governance, hybrid regulation, and mesh regulation, among others) can provide conceptual guidance on how such blended approaches might be designed and applied across multiple layers of governance. From a process perspective, AI governance will require distributed multi-stakeholder involvement, typically bringing together civil society, government, the private sector, and the technical and academic community — collaborating across the different phases of a governance lifecycle. Again, lessons regarding the promise and limitations of multi-stakeholder approaches can be drawn from other areas, including Internet governance, nanotechnology regulation, or gene drive governance, to name just a few. In a world of advanced AI technologies and new governance approaches towards them, the law, the rule of law, and human rights remain critical bodies of norms. The previous paragraph introduced a broader governance perspective when it comes to the “regulation” (broadly defined) of issues associated with AI-based applications. It characterized the law as only one, albeit important, instrument among others. Critics argue that in such a “regulatory paradigm,” law is typically reduced to a neutral instrument for social engineering in view of certain policy goals and can be replaced or mixed with other tools depending on its effectiveness and efficiency. A relational conception of law, however, sees it neither as instrumentalist nor autonomous. Rather, such a conception highlights the normativity of law as an institutional order that guides individuals, corporations, governments, and other actors in society, ultimately aiming (according to one prominent school of thought) for justice, legal certainty, and purposiveness. Such a normative conception of law (or at least a version of it), which takes seriously the autonomy of the individual human actor, seems particularly relevant and valuable as a perspective in the age of AI, where technology starts to make decisions that were previously left to the individual human driver, news reader, voter, judge, etc. A relational conception of law also sees the interaction of law and technology as co-constitutive, both in terms of design and usage — opening the door for a more productive and forward-looking conversation about the governance of AI systems. As one starting point for such a dialogue, consider the notion of society-in-the-loop. Recent initiatives such as the IEEE Global Initiative on Ethically Aligned Design further illustrate how fundamental norms embedded in law might guide the creation and design of AI in the future, and how human rights might serve a source of AI ethics when aiming for the social good, at least in the Western hemisphere. As AI applies to the legal system itself, however, the rule of law might have to be re-imagined and the law re-coded in the longer run. 
The rise of AI leads not only to questions about the ways in which the legal system can or should regulate it in its various manifestations, but also the application of AI-based technologies to law itself. Examples of this include the use of AI that supports the (human) application of law, for instance to improve governmental efficiency and effectiveness when it comes to the allocation of resources, or to aid auditing and law enforcement functions. More than simply offering support, emerging AI systems may also increasingly guide decisions regarding the application of law. “Adjudication by algorithms” is likely to play a role in areas where risk-based forecasts are central to the application of law. Finally, the future relationship between AI and the law is likely to become even more deeply intertwined, as demonstrated by the idea of embedding legal norms (and even human rights, see above) into AI systems by design. Implementations of such approaches might take different forms, including “hardwiring” autonomous systems in such ways that they obey the law, or by creating AI oversight programs (“AI guardians”) to watch over operational ones. Finally, AI-based technologies are likely to be involved in the future creation of law, for instance through “rule-making by robots,” where machine learning meets agent-based modeling, or the vision of an AI-based “legal singularity.” At least some of these scenarios might eventually require novel approaches and a reimagination of the role of law in its many formal and procedural aspects in order to translate them into the world of AI, and as such, some of today’s laws will need to be re-coded. Thanks to the Special Projects Berkman Klein Center summer interns for research assistance and support.
https://medium.com/berkman-klein-center/ai-and-the-law-setting-the-stage-48516fda1b11
['Urs Gasser']
2017-06-26 21:42:41.642000+00:00
['Governance And Tech', 'Algorithms', 'Law', 'Artificial Intelligence', 'Data']
How to Password Protect USB Flash Drive without BitLocker
As a portable storage device, a USB flash drive is commonly used for data backup and transfer. That same portability, however, makes the drive easy to lose, which can lead to data leakage. For better security and privacy, people usually password protect a USB flash drive with BitLocker Drive Encryption. In this article, we walk you through three methods to password protect a USB flash drive without BitLocker.
● Method 1: Encrypt the whole USB flash drive
Encrypting the entire drive is the most effective way to password protect a USB flash drive. BitLocker Drive Encryption can encrypt whole drives, but so can several other tools, such as VeraCrypt and TrueCrypt. The following guide uses VeraCrypt as an example.
Step 1: Download VeraCrypt and install it on your Windows computer. Then insert your USB drive into your PC and launch the tool.
Step 2: On the VeraCrypt window, click the Create Volume button.
Step 3: Select the Encrypt a non-system partition/drive option and click Next.
Step 4: Select the Standard VeraCrypt volume option and click Next.
Step 5: Click the Select Device button. In the new window, select the USB flash drive and click OK, then click Next.
Step 6: Choose a volume creation mode and click Next. If you already have data stored on the USB drive, choose the Encrypt partition in place option; otherwise, choose the other one.
Step 7: Choose an encryption and hash algorithm and click Next. If you are unsure, it is better to leave the default settings.
Step 8: Click Next.
Step 9: Type and confirm a password and click Next.
Step 10: Wait until VeraCrypt reports that the USB drive has been encrypted successfully.
● Method 2: Encrypt all the files on the USB flash drive
If you are worried that encrypting the whole USB drive may slow down read and write speeds, you can instead encrypt only the files or folders stored on it. WinRAR is a file archiver for Windows that also lets users set passwords on files and archives. The following steps show how to use WinRAR to put a password on the contents of a USB flash drive.
Step 1: Download WinRAR and install it on your Windows computer. Then insert your USB drive into your PC and launch the tool.
Step 2: In the WinRAR interface, navigate to the files or folders on the USB flash drive.
Step 3: Select the files or folders that you want to encrypt and click the Add option in the top left corner of the interface. (You can select all the files and folders at the same time.)
Step 4: In the new window, click the Set password button. Before setting a password, you can choose the archive format and the location of the archive file.
Step 5: Type a password and check the box next to Encrypt file names, then click OK.
Step 6: Click OK. An archive file protected by the password is created. You can then remove the original files or folders and keep only the password-protected archive on the USB flash drive. In this way, even without BitLocker, you can password protect a USB flash drive successfully.
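If you script your backups, the same idea behind Method 2 can be driven from the command line with WinRAR's console tool. The line below is only a sketch: the path to Rar.exe, the archive name, and the source folder are assumptions that will differ on your machine, and the -hp switch prompts you for a password before creating the archive.
"C:\Program Files\WinRAR\Rar.exe" a -hp E:\secret.rar D:\backup\*
Here a adds files to a new archive; the plain -p switch would encrypt only the file data, while -hp also encrypts the file names stored inside the archive, which is what the Encrypt file names checkbox does in the graphical interface.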
● Method 3: Use a USB flash drive with built-in protection
For even more security and privacy, you can use a USB flash drive that has hardware protection built in. This kind of pen drive uses independent hardware encryption, so it cannot be read by a computer without the password even when it is plugged in, and it is almost impossible to crack. If you need to store or transfer extremely important data, it is highly recommended.
External tips: How to encrypt a USB flash drive with BitLocker
As the Windows built-in encryption tool, BitLocker Drive Encryption is the first choice for password protecting a USB flash drive. For more convenience and efficiency, you can use a third-party tool such as iSunshare BitLocker Genius for Windows to encrypt the USB flash drive with BitLocker. It provides a more intuitive, user-friendly interface with detailed instructions for every step, which makes it easier to encrypt a USB flash drive with BitLocker. Here are the detailed steps.
Step 1: Download the tool and install it on your PC. Then insert the USB drive into the PC and launch the tool.
Step 2: On the interface, right-click the pen drive and select the Turn on BitLocker option.
Step 3: Enter and confirm a password, then click the save to file button to back up the recovery key as a .txt file.
Step 4: Choose where to save the new recovery key file and click Save.
Step 5: Click the Encrypt button to start encrypting. Wait until a window pops up confirming that the encryption was successful.
Step 6: Right-click the drive and select the Lock Drive option. The pen drive is now password protected with BitLocker.
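If you would rather not install a third-party front end at all, the BitLocker PowerShell cmdlets that ship with the Pro and Enterprise editions of Windows can do the same job from an elevated prompt. The sketch below is a minimal example under stated assumptions, not a full walkthrough: it assumes the flash drive is mounted as E: and that BitLocker To Go is available on your edition of Windows.
# Run in an elevated PowerShell session
$pw = Read-Host -AsSecureString "Choose a BitLocker password"   # prompt so the password is never typed in plain text
Enable-BitLocker -MountPoint "E:" -EncryptionMethod Aes256 -PasswordProtector -Password $pw
Get-BitLockerVolume -MountPoint "E:"        # shows encryption progress and protection status
Lock-BitLocker -MountPoint "E:" -ForceDismount   # lock the drive once encryption has finished
On another Windows machine you can unlock it again with Unlock-BitLocker -MountPoint "E:" -Password $pw, or simply enter the password when File Explorer prompts for it.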
https://medium.com/@a18320355510/how-to-password-protect-usb-flash-drive-without-bitlocker-e9239655dbd
[]
2020-11-18 01:30:25.587000+00:00
['Usb Flash Drives', 'Bitlocker', 'Encryption']
I Almost Cast a Dead Man’s Ballot
The poll worker started to hand me my ballot this morning, but then she looked down at her records and pulled it right out of my hands. “I’m sorry, but I can’t find you in the rolls. Are you sure you’re registered?” I wasn’t nervous yet. “Of course, Mary” I said, handing her back my driver’s license. “I voted right here in the primaries. You were two chairs over then. Remember?” She shrugged, body language telling me she obviously can’t remember one voter out of hundreds from months ago. As she double checked, my mind raced back to the last time I had stepped foot inside my Michigan township hall. The folding tables now littered with the accoutrements of democracy had groaned then with potluck and steam trays. The church ladies all showed up the weekend before Labor Day to memorialize my dad with three-bean salad, chicken casserole, and heavenly hash. I was a wreck. No appetite. Pasted-on smile. I picked at my styrofoam plate, waved goodbye to extended family, and drove to the church to collect flowers and my dad’s ashes. Didn’t know how I’d make it through the next few days. Mary, who lives just a few houses down the street, brought me back to the present. “I’m sorry, but you’re definitely not registered to vote.” My face must have shown my shock. A young woman in chic business wear glided over. “I’ll just check it on the computer in the office,” she said, smoothing my ID out of Mary’s hand and walking back into a room I know is a kitchen. It holds giant coffee percolators like your grandparents might have owned in the 1960s. Mary ran after the polling supervisor, whispering. They were gone much longer than I thought reasonable, and my anxiety mounted to anger. I’m one of the few Democrats in this village. I’m the ONLY Democrat I know about. The reason I thought Mary would remember me from the primaries is that she had to hunt down a Democratic ballot for me. “What the hell is going on?” I thought to myself. “Somebody must have pulled a dirty trick!” Then the supervisor slipped back out of the kitchen, professional and soothing. “It’s all right, sir; I’ve taken care of it. You can vote. Thanks for showing up today.” I raised one eyebrow, a skill I’ve been working on since I was a snotty 16 year old. “But I don’t understand. I know I was registered. I voted in the primaries. How can …” “It’s OK, sir,” she soothed. “Michigan allows same-day registration, so I’ve re-registered you. You can take your ballot now.” “Mary?” I asked looking over to where my neighbor had sat back down. She opened her mouth, closed it firmly shut, then sighed and spoke. “Jim, it was your dad. You guys have exactly the same name and address. That’s why I started to hand you a ballot. But when he passed? Somebody struck you from the rolls instead of him. It was just a mistake.” A gay, Bernie Sanders progressive in Trump country How a radical progressive like me came to live in rural Republican Michigan is a boring story, so I won’t tell it, but caring for my terminally ill father plays a role. In 2016, I was still living in Detroit, where a friend and I chose to spend Election Night in a bar on the Wayne State campus, sure we’d be celebrating Clinton’s victory with fellow liberals. Trump’s elevation to power shocked me. Shook me. Changed me. It didn’t happen overnight. My friend, a naturalized citizen who’s lived in strong-man autocracies and fragile democracies, tried to warn me. “Things are going to get really bad,” he said. “Then they’re going to get worse. 
Gay people like you and brown people like me have the most to lose. You’d better start making plans. Think you can get an Irish passport?” I scoffed and scolded my friend, lecturing him that horrible night on the stability of American democracy. What a fool I was. Trumpism ate at my soul I’ve seen things in the last four years I thought I could never happen again in the US: I’ve watched white supremacists and explicit racists march openly in the streets, praised and supported by the president and his party’s power structure. I’ve watched foes of racism and fascism vilified and demonized as un-American terrorists. I’ve watched open homophobia and transphobia normalized and made socially acceptable. I’ve watched civil rights divisions in the federal government act on Orwellian re-tasking to deny civil rights to LGBTQ people. I’ve watched officers of the United States turn away brown asylum seekers illegally and treat them like animals. I’ve watched officers of the United States rip children from the arms of parents, with close to 600 children fated to probably never see their parents again — because nobody bothered to keep records. I’ve watched the president mock simple precautions against a deadly pandemic, effectively encouraging his followers not to wear masks or socially distance even though we know for certain such measures could save hundreds of thousands of lives. I’ve watched the president’s followers accept his lies with unquestioning loyalty. I’ve watched my neighbors do it. I’ve watched the president and his political party work to subvert democracy by making voting difficult, by suing in court to have ballots thrown away, and by suing to stop ballots from being counted. I’ve watched genuine if flawed liberal democracy begin to rot on the vine. Will marginalized people ever feel safe again? My immigrant friend and I will not be watching returns together tonight. He’s moved across the country. We’ve each voted. We’re each holding our breath. We’re each frightened but clinging to optimism. One advantage he holds over me, however, is multiple passports. He’s already let me know he’s not kidding: if Trump wins today’s election, he’s gone. He’d rather take his chances elsewhere than risk his brown skin and “suspect” ethnic origin in Trump’s America. I don’t have that choice. My American passport is the only one I’m entitled to. If Trump wins again, I won’t feel safe, but I’ll have to tough it out and keep my head down. Will any marginalized people feel safe? In four years, Trump has unleashed white supremacy, racism, xenophobia, and anti-LGBTQ, anti-woman nonsense, transforming the United States into a no-go zone for members of minorities. I almost didn’t get to vote this morning The only reason a clerical error did not disenfranchise me is that Michigan voters passed Proposition 3 in the 2018 midterm elections, allowing election-day registration. That seems like a no-brainer idea. Why shouldn’t we empower as many voters as possible? But Trump and his people fought Prop 3 tooth and nail. It passed on a surge of progressive voter engagement many say was part of a popular revolt against the president. So, here I sit, typing away in my rural Michigan kitchen in the middle of Trump country, and I grind my teeth. I did not cast my dead father’s ballot. I succeeded in casting my own, but I fear that may not be enough. Maybe by this time tomorrow, I’ll know if I’m fated to become even more of a stranger in a strange land. I pray healing can begin instead.
https://medium.com/prismnpen/i-almost-cast-a-dead-mans-ballot-b763969d6743
['James Finn']
2020-11-03 19:12:40.425000+00:00
['LGBTQ', 'Politics', 'Election 2020', 'Equality', 'Creative Non Fiction']
As Technology Rises, it’s Easy to Ignore the Destruction it Causes for Many Students Worldwide
This is a pre-class assignment due Sunday December 6th @11:59pm in preparation for Ethan & myself to lead class Tuesday. Especially in the age of a global pandemic when nearly everything has moved online without much of a choice, it seems that while some are thriving with these changes, others have never struggled more. Now, we not only rely on the internet for essentials such as virtual doctors appointments and social connectivity, but for students, our entire education has transitioned online. Whether it be classes, completing/submitting homework and classwork, communicating with teachers/students or viewing grades — lacking an access to technology would inevitably halt success. And while the people in power are definitely working towards fixing this problem, we are still failing. Even in late October 2020 when remote learning had occurred for over 6 months, 77,000 NYC students were without internet or tablets. If this fact does not prove that inequality is fueled by the rise in technology, then I don’t know what will. Now, if I were to plan a mini lesson with the mindset of technology/education/inequality, this is what it would look like. First, students would read the two articles listed below (the first one discusses the broader issues of remote learning and gives statistics about virtual schooling, and the second one is more specific to this crisis in our own city): #1: https://www.theedadvocate.org/the-absence-of-internet-at-home-is-a-problem-for-some-students/ #2: https://www.pix11.com/news/back-to-school/77-000-nyc-students-struggle-without-tablets-internet-for-remote-learning After reading these two articles, I would have students first react to it (by writing a critical response), then find 1 graph/image they think relates to the topic (potentially aids understanding), and third, they would answer the two questions below in class discussion: #1: What did inequality in education look like before we began relying on the internet for the entirety of our schooling? #2: After recognizing the inequality sparked by the pandemic, what are some changes we can make that would allow education to be more accessible to everyone (if there is anything)?
https://medium.com/@md986q/as-technology-rises-its-easy-to-ignore-the-destruction-it-causes-for-many-students-worldwide-fb68e24b6faf
['Maya Degnemark']
2020-12-05 20:57:09.212000+00:00
['Inequality', 'Students', 'Schools', 'Technology', 'Education']
The “Black Sheep” Condition
Don’t let your dreams be dreams.
https://medium.com/@jatdilok/%E0%B8%AA%E0%B8%A0%E0%B8%B2%E0%B8%A7%E0%B8%B0-%E0%B9%81%E0%B8%81%E0%B8%B0%E0%B8%94%E0%B8%B3-f54205ee0ce1
['Jatdilok Raksutee']
2020-12-24 17:27:59.236000+00:00
['Thailand', 'Blog', 'Thai']
November Origin Rewards: Celebrating Singles Day
We are always looking for ways to show our appreciation to our loyal community that believed in us from day one and helped build the global presence Origin has today. It’s why we launched Origin Rewards, and why we design new campaigns each month so that we can reward every user that is creating value for our network in different ways. Over ten thousand users all around the world have enrolled in Origin Rewards and became token holders, and more are receiving their stake of the Origin network each day. This month, we are pleased to partner with imToken, one of the largest Ethereum wallets in the world, and Slife, which runs multiple Chinese marketplaces such as the vacation rentals platform Ru Cheng, in bringing a special Single’s Day rewards event to our users in China. This partnership is a significant step as all three companies work together to increase the adoption of blockchain-powered marketplaces. We believe the only way to succeed in China is to work with the right local partners, like imToken and SLife, who have deep market knowledge and share common values in relentlessly pursuing excellence when it comes to product and customer experience. How to earn OGN in November You can purchase featured listings this month to earn free OGN, receive DAI rebates and also win a mystery goodie bag worth 200CNY if you’re one of the first 50 buyers to complete a transaction! In order to start earning OGN, you will be asked to create a basic profile with your profile photo, name and email address. You can then further strengthen your profile by verifying your social accounts like Github, LinkedIn, Kakao, Facebook, Website, Airbnb, Twitter and Google. You will be rewarded 10 OGN for verifying each of these accounts. You need to verify at least 3 social accounts to progress to the next level. We encourage filling out your profile because the more complete your profile is, the higher your chances are of having someone else transacting with you on the Origin marketplace! An annual membership for unlimited vacation rentals in China A 7 day Mobike pass A subscription service that sends you mystery boxes containing food and beverages monthly Please note that Origin Rewards is currently available in most countries in the world. One notable exception is the United States and countries that are currently under U.S. sanctions like North Korea and Cuba. These restrictions go against our ideals and we’re truly sorry for anyone who is unable to participate as a result. Any tokens earned in November will be distributed after the first week of December (subject to terms and conditions). Explore other featured listings and learn more about our Singles Day event here. Learn more about Origin:
https://medium.com/originprotocol/november-origin-rewards-celebrating-singles-day-4517e087aa3c
['Anna Wang']
2020-01-17 19:39:29.313000+00:00
['Ethereum', 'Blockchain', 'Rewards', 'Cryptocurrency', 'Marketplaces']
DATABASE CONNECTION USING EXPRESS AND SEQUELIZE: PART 2
DATABASE CONNECTION USING EXPRESS AND SEQUELIZE: PART 2
PART 2A: USING EXPRESS GENERATOR
STEP 1
Open a command prompt.
STEP 2
Create a folder for the node project in the location of your choice. Make a directory with the name of the project:
$ mkdir Article1
Change directory into the folder you created:
$ cd Article1
Run $ code . on the terminal to open the project in your code editor.
STEP 3
We run $ npx express-generator to generate the express server.
Alternatively, we can install the generator globally for all our projects:
$ npm install -g express-generator
The project is created with a series of directories (javascripts, stylesheets, routes, views) and files such as package.json, app.js and a few others. The generator also prints the next step to be executed (the command to install the dependencies).
STEP 4
We run:
$ npm install
Let's view the package.json file that was created. All the dependencies needed for the project are listed here automatically. A directory named node_modules is created in the project folder and all the dependencies are installed into it. The project setup is ready.
We open app.js and see what is inside. All the dependencies are loaded with the require method.
The modules used in app.js:
* body-parser: middleware for handling raw, text, JSON and URL-encoded form data (especially POST data).
* cookie-parser: parses the Cookie header and populates req.cookies with an object keyed by the cookie names.
* jade: a template engine. It combines data with a template to produce HTML.
* express: a web application framework that provides a set of features for web and mobile applications.
* morgan: an HTTP request logger middleware for node.js.
Now we run our application from the command line:
$ npm start
The server starts and listens on port 3002. Open the browser and load http://localhost:3002. You will get the output as shown below.
PART 2B: SEQUELIZE
Sequelize is an abstraction layer over raw SQL that enables us to use javascript to interact with our database.
STEP 1
On the command line we install sequelize:
$ npm i --save sequelize
STEP 2
We install mysql2, the driver Sequelize uses to talk to a MySQL database:
$ npm i mysql2
STEP 3
We install sequelize-cli as a dev dependency in order to take advantage of Sequelize's command-line tooling:
$ npx npm install --save-dev sequelize-cli
STEP 4
To bootstrap an empty project we run init on the command line:
$ npx sequelize-cli init
It will create the following folders:
1. config: contains the config file, which tells the CLI (Command Line Interface) how to connect to the database.
2. models: contains the models for your project.
3. migrations: contains all migration files.
4. seeders: contains all seed files.
STEP 5
We create the MySQL database with this command:
$ npx sequelize-cli db:create
The database it creates can be seen in phpMyAdmin on localhost.
STEP 6
We create a model file, which Sequelize will use to organize the information, and a migration, which Sequelize will use to set up the tables in our database. The model files are created based on the schema given.
We create a User model from the command line:
$ npx sequelize-cli model:generate --name User --attributes firstName:string,lastName:string,email:string,password:string
We create a Task model from the command line:
$ npx sequelize-cli model:generate --name Task --attributes title:string,userId:integer
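A note on STEP 4 and STEP 5 above: db:create only succeeds if config/config.json, the file generated by the init command, holds valid credentials for your local MySQL server. A minimal development entry might look like the sketch below; the username, password and database name shown here are placeholder values for this walkthrough and should be replaced with your own.
{
  "development": {
    "username": "root",
    "password": null,
    "database": "article1_development",
    "host": "127.0.0.1",
    "dialect": "mysql"
  }
}
The generated file also contains test and production blocks of the same shape, which the CLI uses when NODE_ENV is set accordingly.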
STEP 7
To create the User and Task tables in the database we run:
$ npx sequelize-cli db:migrate
STEP 8
To undo the last migration we run:
$ npx sequelize-cli db:migrate:undo
STEP 9
To manage sample data we use seeders. Seed files describe data changes that can be used to populate database tables:
$ npx sequelize-cli seed:generate --name user
$ npx sequelize-cli seed:generate --name task
This creates seed files in the seeders folder. They follow the same semantics as the migration files.
STEP 10
We run the command below for the seed files to be committed to the database:
$ npx sequelize-cli db:seed:all
STEP 11
To undo the seeds we run:
$ npx sequelize-cli db:seed:undo
STEP 12
In the models folder we locate user.js and task.js and create their associations.
User association:
'use strict';
const { Model } = require('sequelize');
module.exports = (sequelize, DataTypes) => {
  class User extends Model {
    /**
     * Helper method for defining associations.
     * This method is not a part of the Sequelize lifecycle.
     * The `models/index` file will call this method automatically.
     */
    // static associate(models) {
    //   // define association here
    // }
  }
  User.associate = function (models) {
    User.hasMany(models.Task);
  };
  User.init({
    firstName: DataTypes.STRING,
    lastName: DataTypes.STRING,
    email: DataTypes.STRING,
    password: DataTypes.STRING
  }, {
    sequelize,
    modelName: 'User',
  });
  return User;
};
Task association:
'use strict';
const { Model } = require('sequelize');
module.exports = (sequelize, DataTypes) => {
  class Task extends Model {
    /**
     * Helper method for defining associations.
     * This method is not a part of the Sequelize lifecycle.
     * The `models/index` file will call this method automatically.
     */
    // static associate(models) {
    //   // define association here
    // }
  }
  Task.associate = function (models) {
    Task.belongsTo(models.User);
  };
  Task.init({
    title: DataTypes.STRING,
    userId: DataTypes.INTEGER
  }, {
    sequelize,
    modelName: 'Task',
  });
  return Task;
};
STEP 13
We run the commands from STEP 7 and STEP 10 again so the tables and seed data pick up the changes.
STEP 14
We create the task route and make changes to the user route.
User route:
var express = require('express');
var router = express.Router();
const controller = require('../controllers/user.controller');
router.get('/:id', controller.getUser);
router.post('/', controller.createUser);
router.put('/:id', controller.updateUser);
router.delete('/:id', controller.deleteUser);
module.exports = router;
Task route:
const express = require('express');
const router = express.Router();
const taskController = require('../controllers/task.controller');
router.get('/', taskController.getTask);
router.get('/user/:id', taskController.getTasks);
router.post('/create/:id', taskController.createTask);
router.put('/:id', taskController.updateTask);
router.delete('/:id', taskController.deleteTask);
module.exports = router;
The routes send each request to a specific action in the controller and determine the path by which those actions are reached.
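One detail the steps above do not show is wiring these routers into the generated app.js; without that, Express never reaches them. A minimal sketch is below; it assumes the two route files are saved as routes/users.js and routes/tasks.js and that we want them mounted at /users and /tasks. The file names and mount paths are choices for this example, not something the generator produces for you.
// in app.js, next to the routes express-generator already registers
var usersRouter = require('./routes/users');
var tasksRouter = require('./routes/tasks');

app.use('/users', usersRouter);
app.use('/tasks', tasksRouter);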
STEP 15
We create the controllers for user and task. A controller handles the incoming request, catches errors and sends a response back to the client.
User controller:
const models = require('../models/index');

async function getUser(req, res) {
  const userId = req.params.id;
  const user = await models.User.findOne({ where: { id: userId }, attributes: ['firstName', 'lastName'] });
  res.json(user);
}

async function createUser(req, res) {
  const data = req.body;
  let msg;
  const checkUser = await models.User.findOne({ where: { email: data.email } });
  if (checkUser) {
    msg = 'Sorry, you already have an account';
  } else {
    await models.User.create({ firstName: data.firstName, lastName: data.lastName, email: data.email, password: data.password });
    msg = 'Account successfully created';
  }
  res.json(msg);
}

async function updateUser(req, res) {
  const userId = req.params.id;
  const data = req.body;
  await models.User.update({ firstName: data.firstName, lastName: data.lastName, email: data.email, password: data.password }, { where: { id: userId } });
  res.json({ msg: 'User updated successfully' });
}

async function deleteUser(req, res) {
  await models.User.destroy({ where: { id: req.params.id } });
  res.json({ msg: 'User deleted' });
}

module.exports = { getUser, createUser, updateUser, deleteUser };

Task controller:
const models = require('../models/index');

async function getTask(req, res) {
  const tasks = await models.Task.findAll({ include: [models.User] });
  res.json(tasks);
}

async function getTasks(req, res) {
  const userId = req.params.id;
  const tasks = await models.Task.findAndCountAll({ where: { userId: userId } });
  res.json(tasks);
}

async function createTask(req, res) {
  const userId = req.params.id;
  const data = req.body;
  const task = await models.Task.create({ title: data.title, userId: userId });
  res.json(task);
}

async function updateTask(req, res) {
  const taskId = req.params.id;
  const data = req.body;
  await models.Task.update(data, { where: { id: taskId } });
  res.json('Update Successful');
}

async function deleteTask(req, res) {
  const taskId = req.params.id;
  await models.Task.destroy({ where: { id: taskId } });
  res.send('deleted');
}

module.exports = { getTask, getTasks, createTask, updateTask, deleteTask };

* GET requests help us FIND records.
* POST requests help us SAVE records.
* PUT requests help us UPDATE saved records.
* DELETE requests help us DELETE saved records.
We run our server with $ npm start, or we can install Nodemon with $ npm install --save-dev nodemon so that the server restarts automatically whenever we change a file. We can then exercise all of the actions using POSTMAN.
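To sanity-check the whole chain (route, controller, model), you can send a request with POSTMAN or curl once the server is running. The example below assumes the user router is mounted at /users, as in the app.js sketch earlier, and that the server is listening on port 3002 as shown in Part 2A; adjust the path and port to match your setup. The quoting is Unix-style (Git Bash works on Windows), and in POSTMAN you would simply paste the same JSON into the request body.
$ curl -X POST http://localhost:3002/users -H "Content-Type: application/json" -d '{"firstName":"Ada","lastName":"Lovelace","email":"ada@example.com","password":"secret"}'
A successful call answers with "Account successfully created"; sending the same email twice returns "Sorry, you already have an account", which confirms the duplicate check in createUser is working.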
https://medium.com/@olawoyelydiatosin/database-connection-using-express-and-sequelize-part-2-e8743af79b7e
['Lydia Tosin']
2020-11-24 11:44:25.743000+00:00
['Nodejs', 'Sequelize', 'Expressjs', 'Sequelize Cli']
It’s All In The Fun
An Acrostic Poem Photo by Braydon Anderson on Unsplash The challenge isn’t always as easy as it seems Hard makes it much more fun Attack it with your heart, your soul, and mind That will win you kudos every time What’s that you say so quickly Always up for a challenge entertainingly Sitting on the sidelines won’t help you Fortune always favors the bold Unless you get in and give it your best shot Never will you find your finest day as a poet
https://medium.com/passive-asset/its-all-in-the-fun-b93fc65bd1d5
['Janny S Heart']
2020-12-16 19:50:07.049000+00:00
['Illumination', 'Creativity', 'Poetry On Medium', 'Fun', 'Poetry']
The future of our food
At the beginning of the 20th century, the human population of planet Earth was of 2 billion people. With the breakthrough advances of the century in technology and science, the human population of the planet grew by 5 billion people. (Staniford) Due to the fast pace at which population was growing at the time, many scientists from the 20th century predicted global hunger and social unrest by the early 21st century. Nevertheless, we were able to prevent famine with the so-called Green Revolution of the 20th century. (Borlaug) This Revolution consisted of better agricultural methods for irrigation and more technologies being applied to our traditional forms of agriculture together with the larger usage of land for the production of food. The improvements in agriculture efficiency led to much more agricultural production than what was produced at the beginning of the century. Thanks to the Green Revolution, our world has been able to feed most of the increasing human populations and to produce a food surplus in some countries at the same time. However, the Green Revolution is reaching its maximum production capability today in the 21st century, while the world population continues to increase. Our world population is estimated to reach 9 billion people in 50 years. (Staniford) Our traditional food production is estimated to grow at a much smaller pace, than that of our population. (Borlaug) In order to prevent world hunger and global social unrest, our world’s scientist, farmers, politicians, corporations, and entrepreneurs are looking into new technologies to produce more and better food. One of the technologies that are being considered as a solution is Genetic Engineering. Genetic modification of food could have the capacity of solving our world’s hunger problems in the not so distant future by modifying crops to grow faster, stronger and with better nutritional values. Genetic Modification of crops could give our societies the capacity for feeding all of its members. By doing so, our world is able to maintain social systems and maybe even lowering the differences between social classes with more availability of food. Farmers and agricultural corporations would be able to give a desired plant certain traits that it wouldn’t naturally have. Such traits would come from the DNA of other organisms and would allow the crop to grow in more places, and variable conditions. (Borlaug) With this method, we could design crops that will never grow naturally in the arid lands of Africa, where hunger seems to be an increasing problem. By using genetic modification to improve our crops we could have tastier foods with much more nutritional value, and that are able to grow in non-traditional weather conditions (Phillips) Because crops would be able to grow in harder conditions, with more nutritious value, in bigger quantities, and with more natural advantages, more food and better would be then produced. Even though genetically modified foods possess all of these positive qualities, they are not liked by everyone. Some consider genetically modified foods as a danger towards life itself. Those who dislike this type of crops have previously argued that they are not as nutritional as natural ones, that they are not as healthy and that they might become weeds in the near future. Nevertheless, previous experiments by the FDA have shown that these foods are as nutritional as organic crops. 
(Phillips) Actually, most of the fears against genetically modified crops seem to come from speculation because as Phillips points out in his article, genetic modified foods and crops have already been widely introduced in markets like the United States, Brazil, and Argentina. Genetic modification of food has actually proved to be positive environmentally and economically. These crops are being designed by people who have studied the environment and who know how to get the best of its kind of the desired species. Due to the fact that genetic engineering has the capacity of increasing amount and quality of crops, even with changing soil and climate conditions, it seems to be becoming the future of agriculture and food production for our world. Genetic modification of crops can actually be beneficial in more ways than just food production. Besides having the capability of producing better crops, the consumption of pesticides and insecticides could be reduced with genetic engineering. By making crops that produce their own pesticides and insecticides that kill undesired organisms, the use of petrochemical products by farmers would hugely decrease. (Phillips) Therefore the national energy security of countries that choose to use the method would also improve. Even less water could be used when growing these crops since they can be designed with genes from other organisms that would allow them to be more effective. In countries like Brazil and the United States, genetic modification of crops could increase in a more effective way the production of better Biofuels. By genetically modifying crops to be of desirable sizes and with the most sugars as possible, more efficient Biofuels could be produced. Other genetically engineered plants which have been designed and could have been used for the benefit of humanity, but that were not commercialized are able to stop or slow growth of bacteria, others can produce opiate identical to that the human brain produces, others can produce serum albumin, which is used for fluid replacement. (Bray) Largely due to the bad reputation and public opinion on genetically modified crops, these crops are not yet economically feasible. It is in fact because of the industry’s market structure that the industry is largely controlled by a monopoly. The U.S. based genetic engineering company, Monsanto, manufactures 90% of the world genetically modified crops. This industry might represent an economical danger if not open to more companies that could work with farmers to maintain an economical structure similar to our current one. (Greenpeace) Other European countries represent 10% of the industry but can’t currently grow with all of the unnecessary regulations that the European Union has imposed on them. Since genetically modified crops have already proved to be beneficial, and not- dangerous in countries likes: United States, Canada, Brazil, and Argentina, it seems unnecessary, and even dumb to impose so many regulations on the industry. The only way we could have a better structure for the genetic modification of the food industry in our market systems would be if regulations were lowered and more firms would be included. With the higher competition of firms and less economic regulations on the industry, these beneficial and convenient crops for food could be produced. Genetically modified crops should be viewed as a form of technology which would enable us to effectively produce better food. 
As Phillips points out, genetic modification of food is not even a new technology; it is only an extension of the practices that are already done by traditional agriculture. It actually is even better than traditional methods because we are able to have more accurate and beneficial results. Although it is logical that some fear exists on the method, because of misinformation since many technologies have been feared through history. Even today some technologies like the photographic camera are considered dangerous by tribes and communities who take an irrational approach to analyze them. The longer it takes for us to realize the genetic modification of food is just a tool, the larger the world hunger problem will grow. Even the genetic modification industry problems could be fixed with a larger acceptance of the method, allowing more companies to get in the market. As these types of food become more important, more institutions like the EPA could even become involved with the production of such food, making it even more accurate and safe for everyone. If we do not support and genetic modification of food, world hunger and malnutrition will only continue to grow as the population grows because our traditional methods simply could not keep up with the demand. Our world’s population will continue to increase rapidly; meanwhile, most of our farmable land is already being used, our climate is changing and our soils are depleting, reasons for which traditional agriculture will not be able to keep up with the growth.3.3 million Children a year are already dying of malnutrition, and this number will only increase with the continuous population growth.
https://medium.com/silibrain/the-future-of-our-food-2330f8e7d6f5
['Roberto Baldizon']
2018-09-30 03:46:23.404000+00:00
['Future', 'Food', 'World', 'Industry', 'Genetics']
AION
Introduction: Inner battles and timewaves By Dov Sabastien Christmas What a strange main title. There is a good reason as it's named after…Nevermind. Let me tell you a story. Soon a book will be published that dives into the details, but for now, this should do. My story not only moves through time but is influenced by the physics of time itself. It draws upon events from the past and what’s to come in the future. It's a story of a classic battle fought by powerful adversaries. It’s also a battle that happens inside us as we combat our demons and fears, hopes and aspirations, and our innate drive to protect those who we love and protect. Over the next months, years and centuries (both past and future), I will share as much as I can of the story, not conventionally told. Today I am in Boston, Massachusetts, and I live across the river from MIT in the Back Bay community. I understand it's an exclusive and expensive part of Boston to live in. I wouldn't know specifically as I have only recently arrived here and I am living here with my very good friend, Cara. We live together and are working on some interesting theoretical physics. We have discovered that time can be hacked but I am getting way ahead of myself. Suffice to say it's important to close that loophole. That we reinstate the quantum time variable within the closed timeline curves. I am not sure how we can fix the time-wave disturbances or even reverse the damages done this far, but we must try. Speaking of damage I would like to say immediately, regarding the power outages after the accident at MIT, I am so sorry to those who suffered an injury or loss. This is partly why I am writing this blog. As a medium for healing and to be transparent. Though I can’t give away ‘the plot’ so to speak, at least not yet. So please enjoy as I add to this whenever I can get around to it.
https://medium.com/@dovsabastien/aion-88a8a4f75bfd
['Dov Sabastien']
2020-12-22 23:37:18.651000+00:00
['Time Travel', 'Physics', 'Boston', 'History', 'MIT']
Ultimate Test Matches
Ultimate Test Matches A new competition format for international competition. AFDF and Japan Ultimate Logos. WFDF recently announced a new sanctioned competition format for international ultimate: the test match. The WFDF press release tells us that the idea was proposed by the AFDF, which wanted to host two showcase games between the Australian and Japanese World Games teams and have those games recognised as sanctioned WFDF competitions. The proposal itself is very clear that these test matches should be full-strength games and not just warm-ups for bigger tournaments. So even though Australia and Japan will be fielding their World Games squads, the two test matches will be standalone winner-takes-all competitions. What Can We Hope To See From Test Matches? Test matches in other sports, like Rugby and Cricket, are well-established events steeped in tradition. It will take a long time to build up the same spectacle in ultimate, but jumping on the Great Britain vs Australia rivalry bandwagon (as perpetuated by The Ashes) could make for an excellent event in the ultimate world. Certainly the USA and Canada could have extremely exciting test matches on a regular basis, much more easily than GB and Australia could. In fact, such events have happened in the past as team USA and team Canada have warmed up for Worlds and the World Games. The main problem, of course, with the test match format is the logistics of getting a full-strength national team to another country to play one game of ultimate. We will more likely see short series of test matches (like the inaugural Australia vs Japan two-test series), but this is still a large undertaking for an amateur sport. That being said, test matches do present an excellent opportunity for players to represent their countries and get more international playing experience.
https://medium.com/this-ultimate-life/ultimate-test-matches-fd78cbed93a9
['Luke Burgess-Yeo']
2017-09-18 10:01:42.993000+00:00
['Ultimate Frisbee', 'Sports', 'International', 'Japan', 'Australia']
Will COVID-19 grow China’s influence on the world stage?
Will COVID-19 grow China’s influence on the world stage? This appeared in The Millennial Source China’s rise over recent decades has been both feared and welcomed. Think pieces on whether China’s increased global prominence might lead to a resurgence of authoritarianism or the eradication of liberal norms on the international stage can be as found as easily as those arguing over whether Beijing’s “ peaceful rise “ will be a boon for global prosperity. A sense of ambivalence may be at the heart of many of these opinions. Much of the West has lamented China’s one-party communist state as the antithesis of what open, free societies around the globe should emulate and arguments persist over whether the country’s economic and technological prowess could be a model for developing nations. Although China is consistently ranked among the worst countries in the world in regards to all kinds of human freedoms — be they freedom of religion, freedom of the Internet, human rights, personal expression and access to information — the country has also taken immense strides economically and has lifted billions out of poverty in recent decades. With the economic and political fallout caused by the coronavirus set to be a force on both a national and international level in the coming months, if not years, the virus’ impact on the dominant powers in both the East and the West is likely to leave an enduring mark. With so much unknown in regards to the future of the virus, it’s nearly impossible to fully gauge the impact it will have when all is said and done, leaving a whole series of questions yet to be answered. Which country will gain access to the world’s first coronavirus vaccine? What effect will the virus have on the United States presidential election? Will there be a second wave of infections? Superpower disagreements In the wake of COVID-19, China and the US have sparred over issues ranging from how the virus originated, what the actual number of cases within each country are and which, if either, country has taken the adequate steps in the face of the pandemic. Over the past several months, Trump has publicly questioned whether Beijing purposely hid the true impact of the virus within its borders, claimed that China downplayed the threat when it first emerged and suggested that the virus could have leaked from a Chinese lab. According to the World Health Organization and the National Institute of Allergies and Infectious Diseases, an agency of the US Department of Health, there is no evidence that the virus leaked from a lab. China’s Foreign Ministry released a point-by-point rebuttal in response to assertions by the Trump administration which it has called inaccurate. Examples of US allegations addressed in the piece include the claim that the true origin of the virus is still being investigated by Chinese scientists. China replied by stating that government officials “provided timely information to the world in an open, transparent and responsible manner” and that any delay in public response was due to the need to “take time to study and understand” what was happening. According to the Associated Press, Chinese officials took six days to publicly warn the country of a likely pandemic. While partisans on both sides of the debate claim to have the facts on their side, analysts argue that both countries are losing credibility on the global stage amid the fighting. 
According to James Green, a former US government official who currently hosts Georgetown University’s “U.S.-China Dialogue Podcast,” both countries need to focus on what matters during an outbreak, namely, working together to see that it’s brought to an end. “Policymakers should not let geostrategic competition and the shifting international landscape cloud their thinking,” argued Green. “Now is the time to focus on what’s necessary to protect lives and restart the economy-not on dangerous distractions,” he added. China’s economic sway China’s international influence has drastically increased in recent decades. In addition to becoming a global economic powerhouse, China has made moves to expand its economic presence in emerging markets via a policy known as the Belt and Road Initiative (BRI). Since 2013, BRI has brought Chinese investment projects to over 60 countries, mostly by way of large-scale infrastructure projects. By 2027, economists at Morgan Stanley predict that these investments could total over US$1 trillion. While many developing countries stand to benefit from new infrastructure and the subsequent economic growth it would bring, observers warn that China’s investments could drastically increase the country’s global sway, both geo-politically and financially. According to Italian economist Michele Geraci, the former Italian undersecretary of state in Italy’s Ministry of Economic Development, COVID-19 could have a positive effect on China’s BRI outreach. “The type of industries that are involved in the Belt and Road, such as infrastructure, trains and roads, port development, are not things done indoors like small companies or service industries. So social distancing is easily achieved,” he argued. Geraci further believes that opening rifts with China economically would not be smart policy after the pandemic. “China supplies a lot of goods for European and American companies. If there were to be a breakdown of the economic relationship between the US, for example, and China, it would be US companies that suffer most,” he said. The COVID-19 wild card The fluidity of the coronavirus crisis gives weight to the idea that the global order could be affected by the virus in the long run. China may get a vaccine first and distribute it to the rest of the world, thereby increasing international perceptions that China has taken over the global leadership mantle from the US. There are also an assortment of political concerns surrounding the virus, especially in the US. If a viable treatment is found before the US presidential election, it could assuage economic uncertainty and impact American voters’ views of how Trump has handled the virus. For Stewart Patrick, a foreign policy expert at the Council on Foreign Relations, a US think tank, much of what happens may depend on what the US does. “The game [of international influence] is afoot, now as much as ever,” Patrick says. “Decades ago, after World War II, the United States wrote the rules to the game. Today, it whines from the sidelines. To compete once more-to say nothing of winning-the country needs to get back on the field, round up like minded teammates, and play hardball.” Have a tip or story? Get in touch with our reporters at tips@themilsource.com
https://themillennialsource.medium.com/will-covid-19-grow-chinas-influence-on-the-world-stage-9d9bd092a2c0
['The Millennial Source']
2020-05-25 13:39:23.235000+00:00
['Politics', 'USA', 'World', 'Covid 19', 'China']
AutoAI for Data Scientists: From Beginner to Expert
Data science is a required practice for organizations accelerating their journeys to AI. Businesses are keen on hiring the right talent, acquiring the right tools and evolving the discipline. When it comes to data science projects there are two major problems: 1) There are not enough data scientists. 2) It takes too much time for any data scientist to get to a usable, tuned model. Solving the lack of data scientists' problems requires investment in our employees in terms of time and training. We can’t expect these people to just keep on learning for a year before they can be productive. We need to reach a stage where people know enough to start contributing immediately while continuing to improve their skills. As far as the second problem is concerned, taking too much time getting to a usable and tuned model, we need tools to help us optimize our data scientists' productivity. There are some tasks that are relatively mundane that could be automated, leaving the more challenging and interesting parts to the data scientist. Intelligent automation in data science and AI empowers everyone Enter AutoAI. It recently won the AIconics best innovation in intelligence automation award. Let’s talk about how it addresses our problems. AIconic Award for AutoAI Currently, AutoAI addresses problems related to classification and prediction (regression). These types of problems are at the core of many data science initiatives. If you are an experienced data scientist, you know how to solve them. With AutoAI in Watson Studio, you can quickly see the leaderboard of the various pipelines which help accelerate the model selection. If you are learning data science you can learn how these functions are used. AutoAI processing and leaderboard At the highest level, creating a model involves taking some data, passing it through a machine learning algorithm, and getting a resulting model. Well, it’s not always that simple. Let’s say you have your data as a comma-delimited file (.csv). To start with, all the attributes are character strings. We need to identify all the fields that are numeric and convert them into integer, decimal or floating-point numbers. You also have to consider dealing with missing values and normalization. The character fields also have to be converted to numeric values. Typically, we are talking about categorization. For example, gender, type of payment, and so on. We must admit that this is not the most exciting part of creating a model. Being able to automate this part makes expert data scientists more efficient and helps more junior data scientists avoid mistakes while address the pre-processing of the data even if they are still learning about what needs to be done. See for yourselves: You can try the AutoAI tutorial on the IBM Cloud for free. Which algorithm to use? Do we use a decision tree? An ensemble? There are so many to choose from. Which one is the best for the type of data and problem we have? Curated models available through AutoAI We also have to contend with feature engineering and hyper-parameter tuning. Which new features should you create? Based on what? This takes experience to select the right mix. As for hyper-parameter tuning, this can be tricky. You could end up with a model that works great on training data but not so much on new data. You could also end up with a less than optimal model. AutoAI addresses all those issues and allows you to make an educated decision on which model performs best. 
Your decision is assisted with evaluation measures such as Root Mean Squared Error (RMSE), Mean Absolute Error (MAE), and others on both the training and testing data (including cross-validation). You can even see the details of how feature engineering was done and the feature importance. This is especially a key part for a beginner to start learning about data science. For expert data scientists, you can validate or adjust some of your assumptions here. Model evaluation Once you decide on the model to use, you can save and deploy it into an IBM Watson Machine Learning service so people can score their data through a simple REST API. Saving an AutoAI model A Perfect Blend of Open-source and IBM Technology Ah, this is a proprietary solution! Not at all! Instead of saving the model to an IBM Watson Machine Learning service, you can save it as a notebook. This way, you can generate the model yourself and decide where to save and deploy it. Since it is a notebook, you can modify it for any reason, may that be adding some transformation or make it fit datasets with additional attributes. And of course, you can use this with an open-source or Watson Studio based tool. Generated notebook One side benefit of generating a notebook could be for education and training. It is always instructive to see how things are done, and beginner data scientists may see some transformations they did not think about for this or other projects. This becomes learning by example. IBM is committed to leading and empowering the open-source community and data science is, of course, no exception! Giving you more time to innovate by minimizing mundane or repetitive tasks We stated that two important problems we want to solve are to make the beginner data scientist productive as soon as possible and remove some burdens from the experienced data scientist so they can be more productive. With AutoAI Experiments, we remove the burden of having to deal with all the details of preparing the data. This way, a beginner data scientist does not need to know all the intricacies of data preparation right away and the experienced data scientist does not need to spend her time on mundane tasks so she can focus on higher-value tasks. Since AutoAI can select the more appropriate model for classification or regression, automate feature engineering and hyper-parameters tuning, and provide measurements on the quality of models, data scientists can focus on the evaluation and selection of the model instead of the mechanics of creating one. Overall, AutoAI democratizes data science and AI — data preparation, model development and selection, execution and deployment. This addresses the shortage of data scientists and gets to a solution faster. By accelerating the data science lifecycle with AutoAI, businesses can focus more on high value-added work and innovative solutions. This is why we are focused on sharing the best practices and playbook in AI. The Future of Work Webinar in data science will be more exciting and dynamic I predict. Ready to learn more about AutoAI? Check out this Website where we built an AutoAI playlist of videos, product tours, and hands-on lab. Or, join us at our live 3-part Virtual Data Science Camp Fall Edition starting on October 31, 2019. You can view the Summer Edition of this popular 3-part series here. If you are interested in other IBM Watson Studio-related webinar, please read the following blog.
https://medium.com/ibm-watson/autoai-for-data-scientists-from-beginner-to-expert-cc6a93bb5c3b
['Jacques Roy']
2020-04-13 15:05:44.089000+00:00
['Machine Learning', 'Watson Studio', 'AI', 'Editorials', 'Data Science']
Are the Palestinian People threatening Israel’s health?
CARTOON: CARLOS LATUFF Are the Palestinian People threatening Israel’s health? The global spread of Covid-19 caused movements restrictions and full-on lockdowns in many places all over the world, including the area of Israel and Palestine. As an occupying force, Israel had always controlled every entry and exit in the West Bank and last week closed every area under Palestinian administration to allegedly curb the spread of the infected. It may be clear that Israel is using the pandemic to quicken the process of annexation of Palestinian land: as a matter of fact, Israeli settlers are allowed to attack Palestinian civilians, interfering with health authorities attempts to fight the contagion. The goal of these strict limitations could be the safeguard of public safety, but it would mean that the measures would be applied to Israeli citizens as well, since the virus infects both populations alike. That is not the case since Israeli settled in the West Bank, despite living only a couple hundred meters from the Palestinian communities in forced lockdown, are not subjected to the same rules. The imposition of these guidelines is a glimpse into what would have happened if the “Trump peace deal” had been implemented. A violent series of attacks towards Palestinians farmers is underway: one example is Bethlehem, where Palestinians are facing a harsh quarantine and thousands of trees belonging to them were cut down. Israel has also started to build in the immediate vicinity of Nablus, a blatant attempt to create ghettos and bantustans for Palestinians. Gaza, that has already registered a small number of covid-19 cases, caused stark preoccupations regarding the readiness of its healthcare system, already put under stress by the embargo forced by Israel. More than 5000 Palestinians, including women and children, are detained as we speak in Israeli prisons, known for being old, dirty, crowded and underserved from a hygiene perspective. Cells are claustrophobic, unclean and without windows, and on top of everything ex-prisoners testified that psychological and physical tortures are commonplace in there. In conclusion, the health of prisoners is completely overlooked. Palestinians are trapped and fighting on two sides: one battle is fought against the pandemic, the other against the brutal Israeli occupation, in which these people are the perfect scapegoat to blame for the Covid-19 spread in Israel. They are considered a threat to the health and lives of Jewish citizens, presented as a sort of “fifth column” and considered illegal aliens. The health sector has been since ’48 one of the most representative for Palestinians (despite many racist incidents happening in this system as well). 17% of doctors in Israel is Palestinian, and many nurses, pharmacists, technicians and ecological operators are as well. While in Israel a big part of the population is being tested, only 5% of Palestinians were guaranteed Covid-19 testing. On one side the Israeli government is promoting health precautions methods among Jewish citizens, on the other it’s not doing the same for Palestinians. Instructions haven’t been translated into Arab for weeks, and no investment has been made available to support the health infrastructures in Palestinians cities and villages. Palestinian people, like other native and discriminated people, are at a structural disadvantage when it comes to healthcare; this situation combined with a deadly pandemic can lead to devastating results. 
The average distance between Palestinian communities and the nearest hospital is almost double that for Israelis, and the quality of healthcare facilities in the Palestinian territories is lacking: as a result, many people there suffer from chronic diseases such as diabetes, high blood pressure and heart problems. For Palestinian Bedouins the threat is even greater. Around 150,000 Bedouins live in 40 villages not recognized as legitimate by the state and as such are not provided with running water, sanitary services or healthcare infrastructure. Despite calls for action by Bedouin authorities, the state has refused to do anything with regard to testing, social distancing enforcement and access to hospitals or medical care. Judging by these facts, the Israeli response to Covid-19 reveals an indifference towards Palestinian lives. The spread of the coronavirus among Palestinians could also give Israel an opportunity to intensify its control over them, perhaps even to the point of isolating them politically and physically. In the last few days in Jaffa, for example, Israeli police attacked Palestinian residents for allegedly violating social distancing norms, even going as far as throwing stun grenades. This could be only the beginning: if the pandemic seriously hits Palestinian villages and cities, we could see their inhabitants rise up against the lack of healthcare, and the Israeli state militarize even further in response to the protests. This emergency could leave the Palestinian population even more controlled, punished and militarily oppressed than it already is.
https://medium.com/@threeworldpills/are-the-palestinian-people-threatening-israels-health-2b052e398a3a
[]
2020-04-20 09:25:58.244000+00:00
['Virus', 'Coronavirus', 'Palestine', 'Israel']
These past months, I’ve learnt to prioritize reflection.
These past months, I’ve learnt to prioritize reflection. 👉🏿Why I do things. 👉🏿Reassessing convictions and notions I have held dear over the years. 👉🏿Where I’m at. 👉🏿Where I desire to be. 👉🏿 How far I have come. It’s not altogether perfect and it may never be. Heck, my life is not perfect 🤣 but I’m learning to trust God. I am learning that it is okay to not have everything figured out and just starting out anyways. (Sue Hastings, I love you🤣) Learning that it is okay to make mistakes rather than wait for perfect timings and situations. Learning to prioritize my convictions over what is expected of me. Learning to trust and believe the best of myself. Learning to enjoy the journey even as I reach for the desired destination. I didn’t get all the things I planned to do in Jan — May in the way I wanted them done but documenting what I have done with myself now and the seemingly little steps I have taken, there are things to be thankful for and to be celebrated that could have been overlooked easily. Today, I celebrate my strength and how far I’ve come. Today, I kick off posting my thoughts and reflection and I hope one or two things gets to resonate with you as you follow my journey, walking with God and seeing things through his eyes. June, 10x better. 🤍
https://medium.com/@mgkvkhnbr/these-past-months-ive-learnt-to-prioritize-reflection-21229a0d4edb
['Mitchelle Dorcas']
2021-06-01 06:14:21.691000+00:00
['Thoughts', 'Journaling', 'Walking With God', 'Writing', 'Reflections']
How to Talk About Your Ex in a New Relationship
The difference between vulnerability and victimhood After a relationship ends, we can struggle to find happiness in the ending. And all the hackneyed relationship advice will tell you to avoid the subject with new romantic interests. This advice certainly holds true for the first few dates. But eventually, everyone knows the topic will come up in some shape or form. Intimacy is built by revealing small truths. What is more important is not when you talk about your past relationships. It is how. Because that “how” can either reveal your vulnerability or your victimhood. And there is a difference. Vulnerability reaches for another and says — I am imperfect just like you. Vulnerability is the chips and dings that give a vintage chair its character. Victimhood communicates — my imperfections have broken me. I am a chair no one can sit on now. Perhaps you have not figured out yet how to talk about your heartache without sounding broken? Here are some ways to have that tough conversation. “True love always makes a man better, no matter what woman inspires it.” ― Alexandre Dumas Red pen the drama Ever watch a baby first learning how to walk? The baby falls and is startled. The first thing the baby does is look to its caregivers to see how he should react. If the parents run to the baby alarmed and fearful, the baby gets scared and starts crying. But if the parents smile and say, “Oopsie! Better get up and try again…” The baby laughs, gets up, and tries again. You might be tempted to add dramatic details to that painful breakup, but it is far better to describe the past in neutral terms. Be the baby who falls, giggles, and gets back up. Make a gratitude list. You are probably sick of hearing about the importance of gratitude, but studies show that it does rewire your brain for happiness. Start by making a list of the ways your past relationship made you stronger. It could be something as mundane as your ex improved your tennis serve or more life-changing like they redefined what you truly want in a partner. Take responsibility for the ending. I missed a huge red flag with my ex-boyfriend — he never once talked about his ex-girlfriends with any sense of responsibility. He referred to his last girlfriend, not by her name. He just called her “the horrible woman.” And all his other relationships ended through no fault of his own. You have probably heard platitudinous advice of — never bash your ex in a new relationship. But you can talk about your ex’s flaws as long as you describe them within the context of where you screwed up too and what you learned from it. Covering up the pain will only draw attention to it. Imagine you are giving a big presentation, and you are a tiny bit nervous. So nervous that before your talk, you spill your coffee on your lap. Just great, you think. Now I have to give my presentation with a gigantic crotch stain in the shape of an errant amoeba. So you strategically hang your hands in front of the stain while you give your presentation. Your audience is unaware that you have a coffee stain, so they just think you are a pervert groping yourself. Keeping your hands over your big crotch stain is a lot like not talking about a past hurt. You can try to cover it up, but the stain is still there. Now, what you could have done before your presentation is made a joke about your klutziness, pointed to the stain landing fortuitously on your crotch, got a big laugh from everyone, and moved on to your presentation. You just took your momentous screw up and made it humorous. 
You showed you were vulnerable. And nothing is more exquisitely human and undeniably sexy than vulnerability. You can turn any accident into a happy one.
https://psiloveyou.xyz/how-to-talk-about-your-ex-in-a-new-relationship-1b766fc293e9
['Carlyn Beccia']
2020-12-14 16:53:39.495000+00:00
['Self Improvement', 'Breakups', 'Life Lessons', 'Relationships', 'Love']
New in Request Invoicing: Support for FTM, RDN, INDA and improved UI
Hello Request enthusiasts, Happy holidays! We wish you all health and success as we approach the end of 2020 and the start of a new year. In the last few months over $500,000 worth of cryptocurrency has been processed through Request Invoicing. An obvious trend is that five currencies were the most used: DAI, USDC, USDT, ETH and BTC. We've taken this feedback into account and updated the currency selection UI* to make it easier to select these tokens to get paid in. *Don't worry, the other tokens are still available via the dropdown menu at the end of the list. Speaking of other tokens, we are happy to announce that we now support $FTM, $INDA and $RDN. As the adoption of cryptocurrency as a means of payment continues to grow, we look forward to supporting many new currencies in the future. Is there a token you'd like to get paid in that is not currently available? Let us know at hello@request.network. Also in this release: 🔢 Added the invoice number to the dashboard view 📧 Improved the email update flow 💬 Updated the copy and messaging 📛 Changed the PDF receipt file name 🔥 18 bug fixes and performance updates Ready to start getting paid in cryptocurrency? Sign up for Request Invoicing here. There has never been a better time to take advantage of cryptocurrency for your business. Still on the fence? Grab a time in my calendar and let's chat.
https://blog.request.network/new-in-request-invoicing-support-for-ftm-rdn-inda-and-improved-ui-dca1db285b13
[]
2020-12-21 15:00:23.753000+00:00
['Payments', 'Cryptocurrency', 'Invoice', 'Crypto', 'Blockchain']
Theory U, Prototyping: Integrating Past, Present, and Emerging Future
By integrating my past and my present I can offer a unique service to my stakeholders because I now act, not from what I need from the future, but from what the future needs from me. HB Six Prototyping Principles I listened carefully when Otto Scharmer explained the 6 Prototyping Principles in the third live session of “u.lab. Leading From the Emerging Future” 2016, which was about the right side of the U — Crystallizing and Prototyping. Crystallize Vision and Intention. You have to be aware of your connection to Source and lean into what is beginning to emerge from that connection. “Staying with” means to have patience, because you shouldn’t expect results immediately, allowing yourself to wait and not jumping to premature actions. Core Teams. Five people can change the world as Margaret Mead said. However, they are not connected only around their heads; they are connected to the energy of the heart. A core team is really the people who are holding the whole. There has to be a certain level of energy and commitment that you can feel together. There has to be competence, the members have to have knowledge of the subject they are going to work on. In u.lab you organize around what is emerging — what wasn’t known before. At some junction some people leave, others try to join. 0.8 Iterate, Iterate, Iterate. Move into action very quickly, learning by doing, learning from your mistakes. The core team is going to make mistakes and learn from them together. The feedback that you are getting leads you to the next step of your journey. Hold the space. There has to be a space (physical or virtual) where you meet and get together once in a while to experiment safely on your prototype. There is a core team and there is an extended team around that — a network team. Listening to the universe. It is not following your intuition. It is paying attention to all the voices that aren’t you, to anyone that you are interacting with. It is not your expectations, although sometimes, there are inner voices that we can notice. Generally, in U work, and especially in prototyping work, you have to go into the particulars over the acquisition of the general. Integrating the intelligence of Head, Heart, and Hand. Open Mind, Heart, and Will are moving from operating inside your own boundaries and reaching out to that intelligence that is outside of those boundaries. “The foundation for all prototyping is to connect your intention with the intelligence of the heart. You have to place that feature in your heart first.” Watching one of those live sessions is quite an experience in itself. Afterward, you go about developing your own prototype while trying to juggle all of those variables. Caracas u.lab Hub. Watching the Prototyping session 2016. The Leader’s Need for Integration One of the interviews that resonated the most with me when I was studying my first u.lab was Isabel Guerrero’s, who was in a life transition at the time, after having been VP of the World Bank for more than twenty years. She portraits in words the leader’s need for integrating the different parts of him/herself, underlying two key aspects: first, integration is a life journey, second, it is not a joyful voyage because to experience the satisfaction associated with change and growth, he/she has to face fear and pain. Here is an excerpt. I think life is about integrating different parts of yourself. And eventually finding your authentic self. And that authentic self has a lot of dimensions. The spiritual is one. 
For me, integrating the feminine and the masculine has been another. The masculine gave me the results, and the feminine the strength, the appreciation of beauty and the caring of others. Life is a long journey where you actually integrate different parts. To become a leader and be part of a transformation, you have to go through an integration. That is the leadership voyage for me. Like Gandhi said, "You have to be the change that you want to see in the world", even though sometimes it seems difficult and you have to step out of your comfort zone and there is pain — because the first reaction people have when they leave the familiar, is sadness or fear. The first reaction is well, maybe I shouldn't be here. And no, that's the door. That's the door through which you have to walk if you really want that new. Don't get afraid of the pain. Don't get afraid of the unknown. Step out of your comfort zone so you can continue growing. And this is a lifelong thing. Integrating My Past and Present With u.lab The Unintegrated Me Burned mySelf Out I developed a corporate career mainly in Finance, Change Leadership, and Systems implementation, areas that demanded a lot of effort on my part because my brain was not wired for them. Late in my life, I discovered that I had to work twice as hard as most of my colleagues because I had chosen a profession that required Analytical, Concrete, Convergent, and Sequential types of thinking, whereas I am a natural Creative, Abstract, Divergent, and Holistic thinker. Nevertheless, that was my responsibility and I was successful at my job, doing it to the best of my ability. However, sustaining that effort for 25 years eventually sucked all the energy out of me, spiritually and physically, because the person who was facing the day-to-day responsibilities of my job was not my real Self, who remained hidden in the shadows. At the end of my career, at age 55, I was emotionally depleted and physically sick due to the stress I had endured for so long. Slowly but surely, I had burned myself out. By contrast, my family life had been an oasis of love and peace. It was that space of love and understanding which eventually allowed me to recover and to start again in another field, but that is another story. Finding Meaning I have a product! I thought to myself as I was leaving the house of my coach and friend Santiago José Porras. He had been a key factor in my putting together a 48-hour part-time course with virtual support that integrated "Theory U" and the "Well-Being Theory" of Positive Psychology. I had visualized the prototype during u.lab 2015. However, I didn't complete it within the duration of the course. As a matter of fact, that prototype took me a whole year to develop, from the time I came up with the idea to the moment the final product came about. I could hardly believe my journey up to that moment. After ten years in search of meaning in my life, I completed a diploma course in Positive Psychology in 2013. Afterward, I obtained a certificate as an Ontological Coach after finishing a 7-month program in August of 2015. Nevertheless, I sensed that my training had not finished yet, and due to a serendipitous course of events, I stumbled upon u.lab: Leading from the Emerging Future, a course from MIT's Sloan School of Management that its creator, Otto Scharmer, had turned into a MOOC in January of that year. It was love at first sight. At 65 years of age, this had been my longest study streak since I got my MBA in 1980. What kind of force pushed me to do that?
For the first time in my life, I loved what I was doing and I felt passionate about it, and that feeling kept me riding the wave. That effort was my maneuver to shift my life's focus to the areas about which I felt most passionate. Awareness What? u.lab is a highly experiential course that equips change agents with Awareness Based Systems Change methods and tools to bring about change, first in the leader within you, and second in the organization or institution of which you are part. At u.lab, I felt for the first time in my life that I was doing what I had been wired for. I flowed so effortlessly through the U process that I could not believe it. I was being myself for the first time in my professional life, so much so that now I smile when I picture an imaginary conversation between my past colleagues and me: — Helio, what do you do for a living? — Awareness Based Systems Change!, and after a silence, — Awareness what? Integrating mySelf Thanks to a Coaching Circle I wanted to develop these new capacities to their full potential, but I did not realize that I was dismissing my corporate past as a bad dream — leaving out 25 years of professional experience. My intention for the program that I created later was to facilitate it for individuals, workgroups in corporations, and social organizations. New ideas came to me all the time and I kept juggling them in my mind, motivating me to continue advancing in that direction. I thought I had all the pieces and that all I had to do was to order them coherently. However, I had the feeling that something was missing, but I could not figure out what it was. La Estancia Coaching Circle I continued working on my project after the course ended, constantly facing my Voices of Judgement, Cynicism, and Fear, but fortunately, I had a space of trust and understanding in which I met with people facing challenges like my own. That space was the Coaching Circle, which continued to meet up to six months after the course had ended. In one of those meetings, I had my AHA! Moment. We were doing a Canvas analysis for one of the participants and, without my realizing it, my corporate past resurfaced in the form of a business strategy for that startup; it revealed itself clearly to me, emerging right before my eyes. At that moment, I realized that the element missing from my project was my 25 years of business experience. I had been so impressed with these new capacities that somehow I had neglected to integrate my experience leading change in organizations into it. At that moment I realized that for 25 years I had unknowingly been developing my Analytical, Concrete, Convergent, and Sequential thinking capacities. The time to integrate had finally arrived. The "safe space" held by the Coaching Circle allowed mySelf to slowly emerge, letting go of the old trauma associated with my past burnout, keeping the essence of my experience, blending it with my new competencies, and letting that blend mature until its time had come. When I realized that I was integrating my passion and knowledge with my past experience, a whole new world of possibilities opened up before me. I felt energetic and optimistic about what I was doing because, independently of my country's ongoing crisis, I somehow believe that my future is in my hands. By integrating my past and my present I can offer a unique service to my stakeholders because I now act, not from what I need from the future, but from what the future needs from me.
The Road Less Traveled U School for Transformation Annual Cycle. Image: Presencing Institute After 5 years of immersing myself in the year-round Theory U cycle, first as a curious learner, and later as a facilitator, change-maker, and passionate activist, I can testify that developing prototypes might be a long and winding road. Nonetheless, every minute that you invest in that process is worth it because it is going to direct you to a path of meaning and achievement, as Robert Frost said, “…I shall be telling this with a sigh Somewhere ages and ages hence: Two roads diverged in a wood, and I — I took the one less traveled by, And that has made all the difference”. You are invited to register in the life-changing program, “u.lab. Leading from the Emerging Future”, it is free and self-paced. In addition, if you are interested in deepening and enriching the u.lab learning and growth experience for yourself or your team, please contact me at helio.borges.consulting@gmail.com to have a generative conversation. This is a two articles series, the second one, “Theory U, Co-Evolving: The Prototype is U” will soon be published.
https://medium.com/presencing-institute-blog/theory-u-prototyping-integrating-past-present-and-emerging-future-370bd1383aef
['Helio Borges']
2020-11-25 14:41:20.554000+00:00
['Design Thinking', 'Theory U', 'Prototyping', 'Innovation', 'Otto Scharmer']
Review of Medium’s New Publication Features and Writer Profile Redesign
A few months ago, Medium announced that they were making Medium “more expressive” and that several new features would be implemented. Chief among these changes was the launching of controls around color, headers, type, and branding so that you can personalize your space on Medium. As part of this, Medium invited a number of writers to sign up for a private (and then public) beta test which allowed writers to control the layout and visual appearance of their publication. I opted to test 2 of my publications in the early private beta program — Digital Marketing Lab and Escaping The 9–5. After testing the new features for about 4 months, and having already given feedback to Medium, I wanted to write a review summarizing what the changes actually consisted of and whether I preferred the new publication and writer profile redesign.
https://medium.com/blogging-guide/review-of-mediums-new-publication-features-and-writer-profile-redesign-bd71824e670
['Medium Formatting']
2020-12-08 04:08:07.021000+00:00
['Blogging Tips', 'Publication', 'Medium Partner Program', 'Writing Tips', 'Medium']
Legalising LGBT rights in the UK and beyond
Legalising LGBT rights in the UK and beyond It's not all about marriage equality. The queer emancipation movement has had more dealings with the courts than the decriminalisation of homosexuality and the legalisation of same-sex marriage alone. After decriminalisation came the equal age of consent. It took a 1997 ruling by the European Commission of Human Rights to confirm that the UK's unequal age of consent was a violation of the European Convention on Human Rights, a wrong which wasn't corrected until 2000. In 2002, the UK parliament amended the adoption laws of England and Wales, removing the provision that only married couples could adopt. While not specifically a queer-rights issue, the change still allowed LGBT individuals and couples to apply for adoption for the first time. Scotland introduced similar legislation in 2009, and the U.S. has had related laws in every state except Mississippi since 2011, but elsewhere the issue remains contentious. The British military has allowed LGB troops to serve openly since 2000, although it didn't prohibit discrimination on grounds of sexual orientation for another ten years. Trans personnel are also allowed to serve openly, a matter which has still not been resolved in the U.S. military. In 2008, representatives from all three major branches of the British military — army, navy, and air force — were allowed to march in uniform at London Pride for the first time. All three services now actively recruit at Pride events. Section 28, a notorious provision inserted into the Local Government Act 1986 by the Local Government Act 1988 which banned the "promotion" of homosexuality in schools, was repealed in Scotland in 2000 and in England and Wales in 2003. Under Section 28, a teacher couldn't so much as comfort a child confused about their sexuality by saying it was okay to be gay. In 2009, Conservative leader David Cameron formally apologised for the introduction of the law by his party, calling it a "mistake" and "offensive." Some LGBT discrimination protection has been law in the UK since 1999. In 2003, workplace discrimination was prohibited. In 2007, the Sexual Orientation Regulations forbade discrimination in the provision of goods and services. In October 2007, the government announced plans to introduce a bill prohibiting the incitement of violence based on sexual orientation or gender identity. This was realised in the Equality Act 2010, which overwrote the array of earlier laws providing legal protections to queerfolk. Great strides have been made in recent decades, but there is much still to be done. The laws pertaining to transfolk are still murky in places (and generally took longer to implement than for LGBfolk, but that's a whole other blog post), and those pertaining to minors are still not perfect (the British government is currently looking into the matter of "conversion therapy," a recent import from the U.S.). Nonetheless, in a recent Gallup survey, LGBT residents voted the UK one of the top ten places to live.
https://medium.com/gay-old-times/legalising-lgbt-rights-in-the-uk-and-beyond-2a7a738f3893
['Kate Aaron']
2020-08-08 21:07:11.430000+00:00
['Gay Rights', 'Queer', 'LGBT', 'LGBTQ', 'LGBT Rights']
The most effective method to Rest soundly And Wake Up Brimming with Energy
An in-depth look at mastering your sleep. Photograph by Alena Ganzhela on Unsplash One of those days again: your alarm goes off, you open your eyes, and all you want is to snuggle back into your sheets and sleep a few more minutes. But duty calls, so you get up, wash your face, and grab your first cup of coffee to make up for the lack of recovery from the night. What follows are long stretches of working on autopilot just to get through the morning before you really feel awake.
https://medium.com/@guro_5784/the-most-effective-method-to-rest-soundly-and-wake-up-brimming-with-energy-d6dceba051ed
[]
2020-12-12 23:40:03.477000+00:00
['The Most Important Thing', 'Off', 'Effective', 'Coffee']
iOS Generic Search
In this tutorial we are going to implement a search that returns search results that will be used as filters. It can be very useful when narrowing or filtering down a list of content by applying multiple filters (chips). This is a common pattern seen when implementing Search. … Let’s dive right in… The idea is to show a list of filters when the user starts typing in the search bar. A filter will consist of some matching text and a category. When the user taps on a search result we will create another chip as a filter. We will keep track of these with a SearchResult. A SearchResult contains the information we need later when applying them as filters to our list. protocol SearchResult { var searchText: String { get set } var matchingText: String { get set } var keyPath: AnyKeyPath { get set } var categoryName: String { get } init(searchText: String, matchingText: String, keyPath: AnyKeyPath) } One little annoying thing is we have to wrap our SearchResult with another protocol. Otherwise later we get this compile-time error when we go to implement our concrete search: Protocol ‘SearchResult’ can only be used as a generic constraint because it has Self or associated type requirements protocol UniqueSearchResult: Hashable, SearchResult { } Let's back our Generic Search with a protocol. This makes testing easier. protocol SearchDefinition { func linearSearch<Content, SearchResult: UniqueSearchResult>(content: [Content], searchString: String, keyPaths: [AnyKeyPath], resultType: SearchResult.Type, completion: @escaping (_ results: [SearchResult]) -> Void) } Here is the implementation of our SearchDefinition. We pass in an array of Content we want to be searched, the search string, the properties we want to search on that Content, and the result type. struct GenericSearch: SearchDefinition { func linearSearch<Content, SearchResult: UniqueSearchResult>(content: [Content], searchString: String, keyPaths: [AnyKeyPath], resultType: SearchResult.Type, completion: @escaping ([SearchResult]) -> Void) { guard searchString.count > 0 else { completion([]) return } let searchStringLowercased = searchString.lowercased() var results: Set<SearchResult> = Set<SearchResult>() content.forEach { itemToSearch in keyPaths.forEach { prop in if let itemToSearch = itemToSearch[keyPath: prop] as? String, itemToSearch.lowercased().contains(searchStringLowercased) { let result = SearchResult.init(searchText: searchString, matchingText: itemToSearch, keyPath: prop) results.insert(result) } } } var resultsArray = Array(results) resultsArray.sort { guard let first = $0.matchingText.lowercased().range(of: searchStringLowercased)?.lowerBound, let second = $1.matchingText.lowercased().range(of: searchStringLowercased)?.lowerBound else { return true } return first < second } completion(resultsArray) } } Now that we have our Generic Search we can create specific searches that specify the type of Content we want to perform the search on and the properties to search on that Content. In this example we will use a list of Items that you can purchase: struct Item { var name: String var category: Category var price: Double var stores: [Store] } struct Store { var name: String var categories: [Category] } enum Category: String { case toys case clothes case electronics case housing case jewelry } Next we will wrap our generic search with a concrete search class. This will define our result type as well as all the keypaths we want to search within our content.
protocol SearchService: class { func search<Content>(content: [Content], with string: String, completion: @escaping (_ results: [SearchResult]) -> Void) } class ItemSearch: SearchService { private lazy var search: GenericSearch = { GenericSearch() }() func search<Content>(content: [Content], with string: String, completion: @escaping (_ results: [SearchResult]) -> Void) { search.linearSearch(content: content, searchString: string, keyPaths: [\Item.name, \Item.category.rawValue], resultType: ItemSearchResult.self) { results in completion(results) } } } struct ItemSearchResult: UniqueSearchResult { var searchText: String var matchingText: String var keyPath: AnyKeyPath var categoryName: String { switch keyPath { case \Item.name: return "Item" case \Item.category.rawValue: return "Category" default: return "" } } init(searchText: String, matchingText: String, keyPath: AnyKeyPath) { self.searchText = searchText self.matchingText = matchingText self.keyPath = keyPath } } Now let’s use our Search: class SearchViewController: UITableViewController { private var searchController: UISearchController! private let search = ItemSearch() weak var dataSource: SearchDataSource? weak var delegate: SearchResultsDelegate? ... } extension SearchViewController: UISearchResultsUpdating { func updateSearchResults(for searchController: UISearchController) { guard let searchString = searchController.searchBar.text, let dataSource = self.dataSource else { return } self.search.search(content: dataSource.items, with: searchString, completion: { [weak self] results in /// Update Chips and apply filters aka results self.delegate.reload(searchResults: results) }) } } This all works great! If and only if the properties we are including in our keyPaths are Strings. If we want to search inside an array or another type that contains an array we need to do a little more work. Lets update our SearchResult by replacing our KeyPath with a recursive enum: indirect enum SearchPath { case path(currentLevel: AnyKeyPath, nestedLevel: SearchPath?) var currentLevel: AnyKeyPath { switch self { case .path(let currentLevel, _): return currentLevel } } var nestedLevel: SearchPath? { switch self { case .path(_, let nestedLevel): return nestedLevel } } } extension SearchPath: Hashable { static func == (lhs: SearchPath, rhs: SearchPath) -> Bool { return lhs.currentLevel == rhs.currentLevel && lhs.nestedLevel == rhs.nestedLevel } } protocol SearchResult { var searchText: String { get set } var matchingText: String { get set } var searchPath: SearchPath { get set } var categoryName: String { get } init(searchText: String, matchingText: String, searchPath: SearchPath) } Above we used an indirect enum which allows us to have nested key paths! Shout out to my co-worker Ben Hakes for suggesting the indirect enum. 
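As a quick aside that is not in the original post, here is a minimal sketch of how a flat path and a nested path might be constructed by hand, assuming the Item and Store types defined earlier:

// A flat path: search the item's own name.
let nameOnly: SearchPath = .path(currentLevel: \Item.name, nestedLevel: nil)

// A nested path: for each item, walk into its [Store] array and search each store's name.
let storeNames: SearchPath = .path(
    currentLevel: \Item.stores,
    nestedLevel: .path(currentLevel: \Store.name, nestedLevel: nil)
)

The outer currentLevel points at the collection property, and nestedLevel describes what to look at inside each element of that collection.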
Now we can update our Generic Search with our new SearchPath: protocol SearchDefinition { func search<Content, SearchResult: UniqueSearchResult>(content: [Content], searchString: String, searchPaths: [SearchPath], resultType: SearchResult.Type, completion: @escaping (_ results: [SearchResult]) -> Void) } struct GenericSearch: SearchDefinition { func search<Content, SearchResult: UniqueSearchResult>(content: [Content], searchString: String, searchPaths: [SearchPath], resultType: SearchResult.Type, completion: @escaping ([SearchResult]) -> Void) { if searchString == "" { completion([]) return } let searchStringLowercased = searchString.lowercased() var results: Set<SearchResult> = Set<SearchResult>() content.forEach { itemToSearch in searchPaths.forEach { prop in self.search(itemToSearch: itemToSearch, searchString: searchString, originalSearchPath: prop, searchPath: prop, results: &results) } } var resultsArray = Array(results) resultsArray.sort { guard let first = $0.matchingText.lowercased().range(of: searchStringLowercased)?.lowerBound, let second = $1.matchingText.lowercased().range(of: searchStringLowercased)?.lowerBound else { return true } return first < second } completion(resultsArray) } private func search<Content, SearchResult: UniqueSearchResult> (itemToSearch: Content, searchString: String, originalSearchPath: SearchPath, searchPath: SearchPath, results: inout Set<SearchResult>) { let searchStringLowercased = searchString.lowercased() guard let nestedLevel = searchPath.nestedLevel else { if let itemToSearch = itemToSearch[keyPath: searchPath.currentLevel] as? String, itemToSearch.lowercased().contains(searchStringLowercased){ let result = SearchResult.init(searchText: searchString, matchingText: itemToSearch, searchPath: originalSearchPath) results.insert(result) } return } guard let nextItems = itemToSearch[keyPath: searchPath.currentLevel] as? [Any] else { return } nextItems.forEach { nextItemToSearch in return search(itemToSearch: nextItemToSearch, searchString: searchString, originalSearchPath: originalSearchPath, searchPath: nestedLevel, results: &results) } } } We will need new filter logic to handle the nested SearchPaths when applying our results (chips): protocol FilterService: class { func apply<Content>(filter: SearchResult, to content: [Content]) -> [Content] } class TextFilter: FilterService { typealias SearchFilter = SearchResult func apply<Content>(filter: SearchFilter, to content: [Content]) -> [Content] { let filteredContent = content.filter { item in return applyFilter(path: filter.searchPath, value: filter.matchingText.lowercased(), itemToSearch: item) } return filteredContent } private func applyFilter<Content>(path: SearchPath, value: String, itemToSearch: Content) -> Bool { guard let nestedLevel = path.nestedLevel else { if let itemToSearch = itemToSearch[keyPath: path.currentLevel] as? String, itemToSearch.lowercased() == value { return true } return false } guard let nextItems = itemToSearch[keyPath: path.currentLevel] as? [Any] else { return false } return nextItems.contains { nextItemToSearch -> Bool in return applyFilter(path: nestedLevel, value: value, itemToSearch: nextItemToSearch) } } } Now we handle nested structures like ‘Item’ above. We have everything we need to search through stores. 
We can update our ItemSearch to the following: class ItemSearch: SearchService { private lazy var search: GenericSearch = { GenericSearch() }() func search<Content>(content: [Content], with string: String, completion: @escaping (_ results: [SearchResult]) -> Void) { self.search.search(content: content, searchString: string, searchPaths: [.path(currentLevel: \Item.name, nestedLevel: nil), .path(currentLevel: \Item.category.rawValue, nestedLevel: nil), .path(currentLevel: \Item.stores, nestedLevel: .path(currentLevel: \Store.name, nestedLevel: nil))], resultType: ItemSearchResult.self) { results in completion(results) } } } As you can see in the above example we can access collections via our new SearchPath enum: .path(currentLevel: AnyKeyPath, nestedLevel: SearchPath?) And that's it! We now have a generic, reusable Search that can be used to create multiple filters on a list. Simply wrap the Generic Search with your own concrete struct or class and define the search paths and SearchResult for your data. Please feel free to take this and modify it to fit your needs in your next project.
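To see how the pieces might fit together end to end, here is a hypothetical usage sketch (not part of the original article). It assumes the Item, Store, Category, ItemSearch and TextFilter types defined above, and that ItemSearchResult has been migrated to the new SearchPath-based initializer:

// Sample data to search over.
let inventory = [
    Item(name: "Wooden train", category: .toys, price: 24.99,
         stores: [Store(name: "Happy Kids", categories: [.toys])]),
    Item(name: "Silver necklace", category: .jewelry, price: 89.00,
         stores: [Store(name: "Happy Gems", categories: [.jewelry])])
]

let itemSearch = ItemSearch()

// Typing "happy" matches both items, because each is sold in a store whose name contains "happy".
itemSearch.search(content: inventory, with: "happy") { results in
    results.forEach { print($0.categoryName, "-", $0.matchingText) }

    // Tapping a result would turn it into a chip; applying that chip narrows the list.
    if let chip = results.first {
        let filtered = TextFilter().apply(filter: chip, to: inventory)
        print("Items matching the chip:", filtered.count)
    }
}

In a real view controller this closure would live in updateSearchResults(for:), as shown earlier, with the results rendered as chips above the table view.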
https://medium.com/swlh/ios-generic-search-5e3c29589708
['Daniel Sedam']
2020-10-27 18:45:25.981000+00:00
['iOS', 'Generics', 'Swift', 'Keypath', 'Search']
What Would an Eternal Afterlife Feel Like?
What Would an Eternal Afterlife Feel Like? Using mathematics to gain some insight into the afterlife. Certain people who argue against the existence of an eternal afterlife like to argue that it would feel like “hell” or that we cannot even imagine it. I disagree. There are ways that we can guess what it would be like, and to find a clue to this question, all one needs is a little bit of calculus. While it might be hard to imagine certain aspects of an afterlife, such as what it would literally feel like, if we do not have bodies, the eternity question is fairly easy to address. Consider our own lifetime to start. As a child, summer vacation used to last forever. Or at least it felt like it did to me. By the time high school rolled around, it did not feel anywhere near as long. As we get older, it seems that the days, weeks, months, and even years just fly by, and on longer timescales, this sense feeling has been identified as being fairly common as people age (Scientific American). This change in view makes sense as we experience longer periods of time. If we can extend this feeling to eternity, we can actually estimate how long eternity would seem. Of course, the actual result depends on a few assumptions. But consider our one lifetime as a start. Suppose each equivalent amount of time from then on feels like 90% of the last amount. As an example, suppose we lived to 100 and the first 100 years of our afterlife felt like 90 years, our second 100 felt like 81 years, and so on. Even though we would still be living forever, the sum 100 + 90 + 81 + … + 0.9^n * 100 + … has a finite sum. This type of series is called a geometric series, and the infinite sum turns out to be 1000, or 10 lifetimes. The exact length depends on the fraction by which each unit of time feels shortened. There are other progressions which are also finite, and there are progressions which do not converge to a finite sum. But if we can extend our experiences in life to our experience in an afterlife, it does seem like eternity would not feel like it would last forever. Therefore any argument that rejects an eternal afterlife based on the notion of a torturous never ending existence can be seen as being too limited to hold water. One would first have to show that our experience in the afterlife is not modeled by this diminishing perceived-time phenomenon. There is one other complication to this analysis. The discussion in Scientific American suggests that shorter periods of time still feel like they go by at their normal pace. So each minute, hour, day, and even year, in the afterlife would, if our perceptions remain the same as they do here in this life, go by at roughly the same pace. We’d experience the passing of each year in roughly the same way, but it would seem like overall, decades, centuries, even eons, would go by in a blink of an eye, until even insurmountable periods of time seemed like a near instant. Does Heaven Have Scotch? On an almost entirely unrelated area of discussion, I hope that if there is an afterlife, it has the things that I enjoy. I once had a rather interesting discussion with someone who was on Facebook, trying to proselytize, and explain how great heaven was and why we should work to reach it. So I asked him, “does heaven have scotch?” Apparently in his mind, heaven doesn’t have alcohol at all! I was quite disappointed. He said that heaven does have roads of gold. That’s nice. I can’t drink gold. I want scotch. 
So I told him “thank you, but no thank you.” It’s interesting what some people consider important in the afterlife, vs what some people find absolutely trivial.
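For readers who want to check the arithmetic behind the "10 lifetimes" figure above: the perceived durations form a geometric series with first term a = 100 years and common ratio r = 0.9, so the standard closed form for a geometric series gives

$$\sum_{n=0}^{\infty} 100 \cdot (0.9)^n = \frac{100}{1 - 0.9} = 1000 \text{ perceived years} = 10 \text{ lifetimes}.$$

More generally, for any ratio 0 < r < 1 the perceived total a/(1 − r) is finite, while for r = 1 (no perceived shortening at all) the sum diverges, which matches the article's remark that some progressions converge to a finite sum and others do not.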
https://medium.com/the-spiritual-anthropologist/what-would-an-eternal-afterlife-feel-like-b387f4791161
['Daniel Goldman']
2019-06-03 11:19:31.926000+00:00
['Religion', 'Anthropology', 'Culture', 'Philosophy']
I would read thousands of articles like this; I so much enjoy neologisms and linguistic research.
I would read thousands of articles like this; I so much enjoy neologisms and linguistic research. I personally love all the terms and idioms that came out of Zoom or out of the practice of holding social events online. In Italian we got aperivideo, which replaced aperitivo (the happy hour turned into a happy videoconference). Another really cool thing is the words that came into common usage during the pandemic and became part of humorous idioms. For instance, in the first Italian lockdown people were obliged to stay at home — leaving was punishable as a crime. If you needed to move around, you had to fill out a form declaring you were moving for one of the reasons the government accepted as justifications. After that period, people got used to saying things like "Do you have a self-certification?" whenever someone is explaining something utterly complex, implying that you need to check whether they are "authorized" to do it. I love how the words you hear every day in the mass media become part of pop culture. Thank you for sharing; I'll follow to find out the word of the year!
https://medium.com/@gettingto/i-would-read-thousands-articles-like-this-i-so-much-enjoy-neologisms-and-linguistic-research-bd3716b52390
[]
2020-12-11 07:36:10.382000+00:00
['Words', 'Pop Culture', 'Linguistics', 'Covid Diaries']
All You Need to Know about Water Mitigation — Brian Marshall Florida
Floods or any water-related disaster can cause knock-on effects. It not only damages your home or business but also takes a deep toll on your emotional wellbeing. We often do not keep the little things in check and rely on temporary fixes for the long term, until irreversible damage is done and regret hits us hard. Why not err on the side of caution and be prepared for such unpredictable calamities? In the spirit of 'prevention is better than cure', let us go over the basics. Floods, ruptured pipes, leaky pipes, or leaking roofs can cause immediate and drastic damage. The first step is to assess that damage. This is where water mitigation comes in. Water mitigation refers to the repairing and restoring of the damaged property. There are excellent water mitigation services in Tampa that provide you with high-end experts. Not to mention, they help you understand the situation meticulously. What Is Water Mitigation? Water mitigation is the process of making repairs after water damage and preventing the potential for further damage. It reduces and prevents water-related damage such as leaky roofs or plumbing. It looks after the maintenance of the property and restores the damaged property with proven techniques. Water mitigation services evaluate the damage and plan the course of action accordingly. They use different techniques and procedures for different categories of water problems. With leaky pipes, bacterial growth may begin in no time, and bacterial growth causes severe problems that can lead to disease. There are various categories of water damage, which can slowly spread over time. Hence, turn to services in Tampa that prioritize your needs above all else. Trained professionals do mitigations with professional equipment What Causes Water Damage? Water damage can occur for numerous reasons. To name a few: plumbing leaks, broken water pipes or heaters, clogged toilets, leaking roofs and many more. Foundation cracks and moisture behind the walls are other common causes of water damage. If not treated properly, these causes may lead to severe damage to your property. Not to mention, hazardous molds can grow, creating a breeding ground for diseases and infections. Whatever the reason, water mitigation comes to your rescue. Why Is Water Mitigation Important? Damp walls can cause big trouble if not treated in time From floors to bathroom tiles to leaky pipes or roofs, water mitigation holds the cure for all of it. These services provide you with guidance from experts and help you fix the problem. They clean, disinfect, repair, and restore your property, making it a safer and better place to live. Water mitigation services primarily focus on transforming the impaired property into a better and safer one. They use industrial extractors, wood floor drying systems, sub-floor drying systems, and air movers to expedite the repair process. Also, these services use facilities such as desiccant dehumidification, freeze-drying, and industrial-strength dryers and blowers to effectively remove damp from walls. Greetings everyone! This is Roy Marshall, owner of Total Remediation LLC, situated in Tampa, Florida. America is a land of progressive and evolving resources. Using caution along with proficient services, we can stay safe. You may face destructive problems such as leaking faucets, sewage, leaky pipes, or roofs at any time. Thus, it is better to stay protected and ensure the safety of your loved ones. 
Avail yourself of services that provide the best water mitigation facilities, ensuring the elimination of any potential for water-related problems. Water mitigation services enable you to transform a damaged or worn-out property into a pristine and beautiful place you can call yours!
https://medium.com/@marshall-royd/all-you-need-to-know-about-water-mitigation-brian-marshall-florida-36e70d362248
['Roy Marshall']
2020-12-16 12:22:28.041000+00:00
['Water', 'Roy Marshall Tampa', 'Cleaning', 'Mitigation', 'Water Damage Restoration']
Art can help you cope with the pandemic. — Arpita Sharma
I don’t know about you all, but I’ve been struggling a lot recently — feeling more lethargic, lost, and void of meaning. As someone who still has a job, a home, and whose family is relatively healthy given everything, I feel pretty privileged (knock on wood). Even so, this pandemic has taken a major toll on my mental health and really restricted my ability to do the things that gave me joy and helped me be more resilient. One thing that has gone right for me though is my effort to continuing to make art and trying to connect with an art community during this pandemic. At the beginning of this year, NPR shared an article called Feeling Artsy? Here’s How Making Art Helps Your Brain. I’ve been thinking about this piece a lot recently as the pandemic continues to disrupt our lives and makes it difficult to connect, explore, and expand our worlds. I wanted to share some of the key benefits of creating art from the article. Hopefully, it will encourage you to do something a little more creative in your spare time. 1) Art helps you imagine a more hopeful future. Girija Kaimal is an Associate Professor at Drexel University and a researcher in art therapy, leading art sessions with members of the military suffering from traumatic brain injury and caregivers of cancer patients. She wrote a recent piece for the Journal of the American Art Therapy Association theorizing that art-making helps us navigate problems that might arise in the future. She argues that the act of imagination is actually an act of survival. It is preparing us to imagine possibilities and hopefully survives those possibilities. I see this play out often with the folks who come to my art classes. They feel more hopeful and creative when they spend their evenings in a community with others creating something. The process of creation opens up their imagination to new ideas or approaches to the world. 2) Art makes you happier. Kaimal and a team of researchers also discovered that making art may benefit people dealing with addictive behaviors, eating disorders, and mood disorders. They measured blood flow to the brain’s reward center, the medial prefrontal cortex, in 26 participants as they completed three art activities: coloring in a mandala, doodling, and drawing freely on a blank sheet of paper. They found an increase in blood flow to this part of the brain when the participants were making art. The research suggests that art may activate reward pathways in the brain that can benefit them. I have personally experienced this in my own life. I started painting during a period where I was struggling with how to managing life expectations and disappointments. Art allowed me to feel happier and better manage those challenges. 3. No matter your skill level, creating art makes you less anxious about life. A 2016 research paper found that 45 minutes of creating art in a studio significantly lowered stress levels. The paper also showed that there were no differences in health outcomes between people who identify as experienced artists and people who don’t. So that means that no matter your skill level, you’ll be able to feel all the good things that come with making art. A struggle I hear from many people I try to convince to take up art is the fear and anxiety around not being good at it. One of the great things about this article is that it reinforces that you don’t have to be greatly skilled at art in order to enjoy it and reduce your stress levels. 
If you read this and want to try painting, check out my other article where I share my top 5 youtube art tutorial channels. If you are interested in a community of hobby artists, come to my art class on Wednesday nights. I’d love to have you.
https://medium.com/@arpitasharma/art-can-help-you-cope-with-the-pandemic-arpita-sharma-16d35e2aa787
['Arpita Sharma']
2020-12-27 16:07:07.088000+00:00
['Art', 'Creative Process', 'Creativity', 'Life']
Help For The Unpoetic Man
Most men can’t write good poetry. We know it’s true. So I have created a royalty-free valentine poem that you can use as your own. Free of charge. Just fill in her name at the beginning, and you’re good to go. My darling ___________ Your facial arrangement is more than acceptable. Normal men are intimidated by your beauty. Fortunately, I am not normal. Your eyes pierce my heart with nuclear force, melting the flesh off my skeleton, leaving a crispy, barely beating heart flopping about, easily squashed by passing vehicles or pedestrians. Your voice is more melodious than a thousand songbirds trapped in a cave by unrelenting winter gales before they could fly south. I would bottle the sound if the sound was willing to be put in a bottle and sealed with a synthetic cork. Alas, this is not currently possible with today’s bottling technology. Perhaps a digital recorder would suffice to capture the music of your words, but only if a 96 kHz sampling rate was used. Your character is as flawless as a diamond carved in the vacuum of space by the most advanced alien diamond cutter of all time on the best cutting day of their life which exceeded the quality of their previous best day by a thousand times. I would gladly allow a giant sequoia tree to fall on me if it would grant me your attention for the half-second before I am crushed. Do you like me? Check Yes___ or No___.
https://medium.com/mark-starlin-writes/help-for-the-unpoetic-man-a-royalty-free-valentine-poem-c9e92f3bc764
['Mark Starlin']
2019-07-24 01:32:50.943000+00:00
['Poetry', 'Love', 'Humor', 'Valentines Day']
MyEtherWallet adds support for ETH 2.0 staking
The popular Ethereum wallet MyEtherWallet has opened access to ETH 2.0 staking via the Staked service. To participate in ETH 2.0, MyEtherWallet users must lock up 32 ETH for staking. "Staked will launch a validator node for them, which will simplify the participation of users who do not have technical knowledge, so no additional actions on their part will be required," said Kosala Hemachandra, CEO of MyEtherWallet. Users will receive validator rewards in ETH but will not be able to withdraw their coins until the deployment of the second phase of ETH 2.0, which may take several years. As a reminder, earlier this week the deposit contract of the second version of the Ethereum protocol attracted 1.16 million ETH (~$689.99 million), equivalent to 1.02% of the circulating supply of the second-largest cryptocurrency.
https://medium.com/@btcxlab/myetherwallet-wallet-added-support-for-eth-2-0-staking-599ef59da491
['Black Square Design']
2020-12-09 11:58:35.181000+00:00
['Ethereum', 'Cryptocurrency', 'Eth', 'Crypto', 'Ethereum Blockchain']
Weekly Report, September 10–20, 2018
https://medium.com/telos-es/informe-semanal-10-20-de-septiembre-de-2018-d1b3a5adda4
[]
2018-11-11 17:28:00.971000+00:00
['Crypto', 'Criptomonedas', 'Fork', 'Blockchain', 'Cryptocurrency']
Tales of a NYC Sex Worker, Part 4: How to Lose Everything in 10 Days
Kate Hudson was an amateur. Photo by Alexander Krivitskiy on Unsplash. I met Matt on a Friday night in March of 2019 at Jimmy’s Corner, one of the few remaining old-school dive bars in New York City. It was my 21st birthday and I was out with friends, and my sister Dina and her girlfriend Cassie had even come in from Chicago for the occasion. Jimmy’s was our fourth bar of the night, and I was pretty well lit by then from all the free drinks. Matt wasn’t even the first guy that night to take my friends’ singing as a cue to buy me a birthday drink so he could make a pass at me, but he was the cutest. And the tallest. So when the time came to head for what we’d planned to be the last bar of the night, I asked Matt if he wanted to join us, and he was happy to tag along. Our merry band walked over to the Stinger, a much swankier place in a swanky hotel on the swanky side of 8th Avenue. The idea had been to finish only steps away from the subway trains that I would need to get home to Shitsville, Brooklyn, but it made me smile to think that maybe I wasn’t going directly home after all. We had a quieter, fancier drink at the Stinger, which allowed others to get to know Matt a bit, too. Dina poked me lightly in the ribs at one point when Matt said something she liked — I don’t remember what it was, but the message was that she approved. I wasn’t really looking for a boyfriend at the time, but I filed her opinion away for later. Then I said a silent prayer of thanks that she and Cassie had gotten a hotel room instead of trying to crash on an air mattress in my shoe-box-sized studio in Shitsville. When the party started to break up, Matt asked me if I wanted to have one more drink with him before I went home, and I knew what he meant. I reassured my sister that our plans for Saturday and Sunday were still on, then Matt and I watched as everyone gradually made their way out the door. I turned to him and asked, “What do you have to drink at your place?” with all the confidence of a drunk 21 year old, and he grinned with all the confidence of a guy who knew he was about to bring a drunk 21 year old home with him. He lived in Alphabet City, so it took us a little while to get over there by cab, and we took advantage of that time by making out in the back seat. I might have even had my hand down his pants before we got there. He had a one-bedroom apartment to himself, so we were barely in the door before I started undoing his belt. But he quickly took control and took my jeans off instead, and playfully pushed me down onto his overstuffed sofa before kneeling in front of me. He slid my panties off and went down on me for a while, and he definitely knew what he was doing. When we decided to switch gears he went for a condom without my having to ask. Then he lasted several minutes as we tried a few different positions. Each of those things moved him up a bunch of notches in my mind. After he came (and made me cum for the third time), we lay there sweaty on the couch just quietly breathing, until he finally said, “We should do that again sometime.” “I’m down,” I agreed, thinking we would just try to hook up at some point. I needed more good sex in my life. “How about dinner on Sunday after your sister and her girlfriend fly home?” I wasn’t sure I heard him right. “You mean like a date?” He laughed. “Is that so weird?” “I didn’t think you were interested in that way,” I said. 
And if you hooked up with a hundred different New York City boys on a hundred different nights, the odds would be overwhelmingly in your favor to be right if you always assumed they weren’t interested in that way. “I wasn’t looking for it, but you make me smile,” he said. I shrugged. “I wasn’t looking for it either.” He slipped a hand between my legs and started playing. “Crazier things have happened,” he whispered, and I closed my eyes and leaned into him. Why not be optimistic? I thought to myself, and felt his fingers slide into me. There were some good times over the next several months. I don’t think either of us ever mistook the other for “the one,” and it’s not like we were together every night — we didn’t live close enough for that and our schedules didn’t mesh well. But when we were together we had great chemistry in bed, there were a lot of laughs, and we had some shared interests. I’m not even sure I can point to a single moment where it became clear we weren’t going to last. It was just a gradual shift, a progression of small changes. Like the first time we went away for a weekend together that July, and Matt kept picking fights with me over little things, as if he was annoyed just to be trapped in one place with me for so long. Or when he told me he didn’t trust Dev and his friends — the guys who hung out on a corner in my neighborhood — and I should stop spending time with them. Or the night he showed up at my restaurant and nearly started a fight with the bartender because he saw me lingering at the service station and laughing with him. Or that time he stopped returning my texts, then didn’t explain or apologize when he popped back up on the grid a few days later. But the day I knew something had already gone wrong was in February of 2020, after we’d spent the night at his place. When I awoke that morning, I rolled over to see him already awake and watching me. “We should have sex with another girl,” were the first words out of his mouth. We had already had all the usual conversations about each other’s experiences and fantasies, so he knew I wasn’t into girls at all — and after one drunken threesome a while back with two men I didn’t know, I wasn’t too thrilled about threesomes in general. “I might be okay with bringing another girl to bed with us, but only if you don’t expect me to have sex with her,” I said. “You’d still get to have us both, though.” He shook his head. “It doesn’t even count as a threesome if you and the other girl don’t go down on each other. You should want to do this for me,” he added as I tried to imagine what dictionary he got that from. “If we had sex with another man, would you give him a blowjob? Should you want to do that for me?” I posed. “That’s different and you know it,” he said, raising his voice to an uncomfortable level given that we were still in bed together. “Your sister does it all the time, you should be totally used to the idea.” That made me furious. I got out of bed and stomped off to the bathroom muttering, “That’s not okay.” We dropped the subject afterward, but things were tense for a few days. He had no right to use my sister’s sexual orientation against me, and I decided to draw a hard line in the sand mentally; if he did that again, I would have it out with him. But it was only a couple of weeks later that the world quickly started changing around us. We’d been hearing about the coronavirus — COVID-19 — pretty much since the new year began, but we had no idea how bad it was going to get, or how quickly. 
And it didn’t take long to realize that New York City was likely to become ground zero for the pandemic in the United States. Then came the most surreal week and a half of my life, at least up until that point. On March 11, 2020 one of the fashion houses for which I’d done some freelance seamstress work called to cancel the order I was already working on. When I asked them how they wanted me to return the material to them, they said they didn’t need it back, which seemed weird. I woke up to an email on March 12 from the company I worked for doing merchandise sales in midtown theaters. Broadway was officially and completely shutting down, until April 12 — an entire month! I’d only been working matinees lately because I was full-time at the restaurant, but that was work I needed if I was going to pay rent and still stay afloat atop the consumer debt I’d accumulated. That same day, about an hour before I would’ve left my apartment to go work a dinner shift, the manager of the restaurant called me himself to let me know they were shutting the whole thing down proactively just as a precaution. It would be temporary, of course, this whole thing would blow over in a month or two, Memorial Day at the outside, and we’d all be back to work. But they had to lay all of us off in the meantime. At least the good news (he said) was that as layoffs, we’d be eligible for unemployment benefits after a one-week waiting period. After I hung up the phone I was pissed. Unemployment might get me $400/week, and even that would be taxable as income. How was I supposed to pay for anything now? Why would they shut down if they didn’t already know they had to? As upsetting as that news was for about 18 hours, it turned out not to matter much once my Hematologist called me on Friday the 13th. “If you don’t absolutely have to leave home, I highly recommend you don’t. Your blood disorder suppresses your immune system, and not only could that make it easier for you to catch the coronavirus, it could also result in a far worse case if you do catch it, perhaps even a higher risk of fatality. It’s simply not safe for you out there right now.” My heart was pounding at this revelation. “All right, but is it still okay to have my boyfriend come over as long as I stay home myself?” I already knew the answer before I finished the question, and the doctor’s cringe was practically audible. “I would caution strongly against it, unless he takes significant steps to isolate himself from others anytime he’s not with you.” After I hung up the phone I texted Matt that I was now completely unemployed and essentially under house arrest. He called and I repeated what the doctor had told me about staying home and how I was even supposed to avoid contact with him unless he made a serious effort. His response surprised me. “If that’s what it takes, I’ll do whatever I can. I’ll wash my hands, wear a mask, keep my distance from people, all of that. If you can’t leave your apartment I’ll come to you. I’ll be there after work tonight, and everything’s going to be fine.” I ended the call and stared at the phone. Why not be optimistic? I allowed myself to think, almost exactly a year after the first time. Life was presenting Matt with an opportunity to step up and be a man, and it sure sounded like he was going to rise to the occasion, or at least try. And I didn’t know if I could do this alone. Matt came over after work that night and stayed the weekend, and he really did help calm me down about the situation. 
We tried to figure out the best way for me to handle staying in for a month, or even two months if that’s how long it took. We couldn’t get a delivery slot on FreshDirect or any other grocery delivery service because everyone else was panicking too, and the lines at supermarkets and even bodegas were insanely long when Matt tried to go. But eventually we managed to stock my apartment with a decent amount of food that didn’t need to be refrigerated, along with a couple dozen extra rolls of toilet paper for reasons I didn’t really understand. That Sunday happened to be my 22nd birthday and the first anniversary of the night we met. Matt kind of made a big deal of it. He cooked me a simple dinner in my pathetic little kitchen, and lit a candle in a cupcake to sing me “Happy Birthday” afterward. It felt special and intimate. By the time he left for work on Monday morning I actually felt like the whole thing might bring us closer together. I spent a large part of that day reading news stories about the pandemic on my laptop and wondering just how bad it was going to get. Late Monday afternoon, Matt texted that his company was shutting down on-site operations and telling everyone to work from home until further notice, so he had to get everything together and bring it to his place. It just wouldn’t make sense for him to try to come to Brooklyn after that, so he would plan to come over Tuesday night after putting in his first full day of working from home. Even though I’d spent the majority of nights on my own over the course of the year we’d been dating, I felt especially alone trying to fall asleep on Monday night. I was halfway through Tuesday before I even remembered that it was St. Patrick’s Day, which any other year would’ve meant dressing up in green and going out on the town for what a lot of New Yorkers call “Amateur Night.” I wondered whether any of my friends from the restaurant or from Broadway were planning to go out despite what seemed from news stories like an increasingly ominous mood in the city — and I wished I could join them, but not at risk to my life. It started getting dark at 6:30pm, and I hadn’t heard from Matt yet. I forced myself to wait until 8 o’clock before I texted him, and I still didn’t get a response until about 9:15pm when he texted back, “Sorry babe, just swamped! Don’t want you to have to wait up for me. I’ll be there by sundown tomorrow.” I didn’t sleep much. Wednesday I had trouble getting out of bed for the first time in months. I felt isolated and confused, unable to understand why Matt had continued to stay away despite being there for me so convincingly over the weekend. And as the hours crawled by and there was still no word from him, I swore I wouldn’t be the first one to reach out yet again. I didn’t. Neither did he. I went to bed at midnight and fought off tears for hours. When I woke up on Thursday, there was nothing left but to assume that Matt and I were no longer together. He had ghosted me at the worst possible time, and after giving me higher expectations than ever. I didn’t eat anything, I just opened a bottle of cheap Merlot and started drinking. I wrote and deleted about a dozen different emails to him, none of them striking the right balance between betrayal and defiant independence that I was looking for. Just before 5pm my intercom buzzer went off, and I went to answer it wondering if I’d forgotten ordering food. “Yes?” “It’s Matt.” My heart stopped. I buzzed him in. It took him eleven years to climb the four flights to my apartment. 
When he finally got there he grabbed me and kissed me hard, pulling me into his body even as he pushed us both into the apartment and kicked the door closed behind him. He ripped off my pajamas like they were made of crêpe paper, and threw me on my bed so he could undress. I was speechless and my mind was going everywhere at once, but Matt was here in front of me, and he wanted me. Well he could fucking have me. And boy, well, he fucking had me. And had me. The sunlight snuck onto my bed through the mostly closed blinds, and I smiled before I even opened my eyes. Friday was a brand-new day, and Matt was here, maybe even for the weekend. I rolled over onto nothing. I finally opened my eyes, and Matt was dressed and sitting at what passed for a dining table in my shoe box of a home. “I’ve gotta go,” he said, “but first you need to know something. I didn’t mean for anything to happen.” “For what to happen?” I said quietly. “We met on Tuesday night,” he continued. “And everything just clicked. It’s like we’ve known each other our whole lives. I’m sorry Melinda, that’s just the way it happens sometimes, fate stepped in. You and I had fun but I think we both knew it wasn’t a forever thing. I have to follow my heart.” I sat there stunned as my mind pieced together what he was telling me. When my mouth finally opened again I was shaking. “You… you went out on Tuesday night when you told me you were swamped and working late… and met another woman while I was trapped here by myself? You exposed yourself to other people in the middle of a pandemic when you promised me you would limit your contact, you apparently fell in love with this other woman within a day or two of meeting her, and then you came over here and you fucked me anyway?! And now you’re breaking up with me?” “Melinda, I — ” “Have you already fucked her?” I shouted. He lowered his head and sighed. I stood up and started yelling. “You couldn’t just cheat on me and let that be the end of it, right? You couldn’t just decide you were done with me and break up with me like a man? You had to stick your dick inside someone else first? You had to have one last go at my pussy for old time’s sake before I learn that you’ve already completely fucked me over?” Matt stood up. “I’m sorry Melinda, I am — but I didn’t even think it would be that big a deal. It’s not like we’re engaged or something!” “That’s not the point!” I screamed, grabbing his shirt with both hands, slowly pulling his face closer and closer to mine. “Do you not get what’s going on out there? Do you not understand why you made me that stupid, empty promise that it took you all of four days and a single random fucking skank hookup to break, how vulnerable I am? You could be carrying the coronavirus right now! You might have given it to me last night! I might get sick in a week, and there’s no cure! You might have just killed me!!” He stammered, “She — she’s not sick, nobody was sick. I’m not sick!” “You don’t have to be sick to give it to someone else, you fucking moron!” I yelled, then I let go of his shirt and took a deep breath to try to calm myself. “Get the fuck out. Now.” He started to say something, then changed his mind and turned to go. He had just opened the door when I thought of something. “Matt, wait!” I said. He turned around in time for me to send my knee crashing upward into his crotch with every ounce of my strength. He crumpled onto the floor wailing in pain, and I let him lie there whimpering for a moment. 
Then I gave him several shoves until I had rolled him awkwardly out the door into the hallway. “Now she can have you,” I said, and slammed the door behind him. I listened with my ear to the door for several minutes, until I heard him pick himself up and walk slowly down the stairs. I expected to start crying again, but for some reason I couldn’t even summon enough emotion to do that. Instead I was numb. I looked around my shoe box of an apartment, and I saw emptiness. I had no main job. No weekend job. No side hustle. No man. No freedom. No safety. No money. No plan. No future. No life. No hope. Nothing. I opened the kitchen cabinet, and let out a little, hollow laugh. For a girl with nothing, I sure had a lot of goddamn boxes of Honey Smacks. Part 5 is now available here.
https://medium.com/@MelindaByNight/tales-of-a-nyc-sex-worker-part-4-how-to-lose-everything-in-10-days-48c258070a69
['Melinda Night', 'Brooklyn Beck', 'Call Girl']
2021-09-17 17:53:31.833000+00:00
['Brooklyn', 'Covid 19', 'Cheating', 'Broadway', 'Unemployment']
A Huge List of Useful Keyboard Shortcuts
The Why When people use the phrase time investment these days, they often mean time suck, as in, “I’d always wanted to rewatch the ‘Lord of the Rings’ movies but it’s such a time investment. Now that we’re quarantined though…” Here and now, let’s reclaim the original meaning of the phrase: that is, time spent doing something that will pay off in the future. You can apply it as big and abstract (going to college) or as small (organizing your sock drawer) as you want.¹ Specifically, we’re here to talk about a time investment that pays off with the thing that money can’t buy: more time. Let’s pretend for a second that every action you take on your computer in the course of a day were represented as a line of code in a file. There would be some big, headline activities that take up many lines, like writing code, surfing the web, writing emails, and listening to music. You would also see a ton of micro-activities, such as switching between applications or opening a browser window. In our imaginary file, there would be so much repetition of those micro-activities that we would be monsters if we didn’t at least try to DRY them out. What are the magic tools that allow us to do this, you ask? Answer: keyboard shortcuts. I know that this is a deeply un-sexy term, but you know what is sexy? Having more time to do stuff that’s not repetitive. Since, after all, this is an article on shortcuts, we’ll abbreviate it as SC going forward. If there’s one thing I want you to take away from this article, it’s this. Every second spent learning an SC will yield hours of time in the future. This is a stone-cold fact; if you don’t believe it, check this footnote.² Ultimately, the choice is yours on how diehard you want to get with SCs. You don’t have to be one of those people who find the idea of using a trackpad repulsive and curse Steve Jobs for popularizing the GUI. Even picking up a few SCs will make your life better. After all, we’re lazy programmers — not doing more work than we have to is our M.O.!
https://betterprogramming.pub/a-definitive-guide-to-all-the-shortcuts-for-new-rubyists-a365a590d16e
['Donny Landis']
2020-04-11 00:12:20.759000+00:00
['Productivity', 'Programming', 'Software Engineering', 'Startup', 'Web Development']
Biking upstairs
Photo by Mohamed Nohassi on Unsplash Biking upstairs Since everything basically shut down and we’ve mostly been inside, I’ve grown to appreciate mundane happenings. Yesterday, I spent some time with my girlfriend strolling around Prospect Park. I love this park for a multitude of reasons, the main reason being life. It’s a pleasant reminder that real people exist. Immediately after making echo-noises with my fellow humans under the mossy ‘Cleftridge’ bridge and juking away from a speeding mini, bike-riding spiderman, we climbed the short set of stairs onto the bridge that crosses the ‘Lullwater’ lake. On a Sunday while walking this bridge, you might find a newly-wedded couple getting coached by a professional photographer to your left, a group of clout-chasing teens sharing a joint behind bare branches to your right, roasting their weakest link for mixing up the names of two important rappers (Which baby is it?), and a funky jazz band playing ‘Red Clay’ at the Boathouse across the pond. It’s the perfect space to evade the Sunday Scaries, a term that only residually applies to me from being in college, but definitely applies to the 9–5ers who have to wake up at 8am the next morning to coordinate god-knows-what for rich upper-west-siders. Even now as we approach freezing temperatures, Prospect Park beats the cooped-up feeling of a stuffy apartment ten out of ten times. With nowhere to go and little to look forward to, a pandemic in the concrete jungle realigns a person’s priorities. As we approached the three steps down, a scraggly man with an old black peacoat and bootcut jeans approached the three steps up, mounting a Citi-bike. He was presumably going to climb the stairs with his circular wheels and minimal momentum. He moved slowly with a Michael Jordan level focus like he’d been there before. I fought my instinct to critique his plan because I wanted him to succeed. I imagined him smiling cheek to cheek while morphing through the obstacle in one swift motion with his head and shoulders floating above the concrete and his legs cycling under the bridge. It was a sight to be seen. Unfortunately, nobody saw it. My guy, now moving at a snail's pace, lifted his front wheel with a vigor that only someone with a strong fringe-belief system could. When the wheel tapped the third step, his force rebounded. This is the conservation of energy, I think. On his journey backward, he twisted his handlebars sideways while his ass slid off the seat and his foot hopped and planted into the dirt. I can only imagine how his gonads felt from the squeeze of the seat. The man didn’t complete his “square peg in a round hole” venture, but he sure-as-hell impressed me. He had nothing to lose, he gambled that nothing into an impossible task, and he broke even. Love that energy.
https://medium.com/@bentaha/biking-upstairs-e9c667d00af
['Ben Taha']
2021-01-04 16:44:13.011000+00:00
['Sketch', 'Brooklyn', 'New York City', 'Life', 'Parks']
Ways to grow your Business
Market research Market research is an effective tool to assist your business planning. It is about collecting information that provides an insight into your customers' thinking, buying patterns, and location. In addition, market research can also assist you to monitor market trends and keep an eye on what your competition is doing. Define your market research objectives Sources and types of information Collect, analyse and act on the results Successful businesses undertake market research on a regular basis. Define your market research objectives It's important to clearly define your objectives in order to achieve useful results from your research. Clearly defined objectives will help identify the best methods to conduct your research. You will also need to determine the time frame and budget you can allocate to undertake the research. You might consider using a professional market research company to assist you. Sources and types of information There is a variety of data sources to assist you in researching your: Customers Competitors Industry Location 'Primary research' refers to information gathered directly from original sources. 'Secondary research' is information and data that has already been collected and analysed by other sources such as: Australian Bureau of Statistics Industry and trade publications Social media and websites Marketing and consumer lists Newspapers and media IBISWorld The types of information you collect through these sources may be quantitative or qualitative. Qualitative information measures the values, attitudes and views of a particular sample. This type of information is useful if you want to understand why people buy your products, how they respond to your advertising or their perceptions of your brand. Quantitative information is based on statistics and may be used to predict market penetration, future earnings, etc. Collect, analyse and act on the results After identifying the source and type of information you need, you can start to collect it. It is important not to allow your opinions or preferences to affect your research. Having a preconceived idea of the results will bias your research and provide false information. Remain open-minded and be prepared for unanticipated results. When processing data, make sure you: Keep your market research objectives in mind Categorise data according to what is most relevant for your business; don't become side-tracked by information that is just interesting Collate your data using tables or lists to make it easier to identify certain trends and themes. You may need to collect additional information if your results are inconclusive. Analysing the data should allow you to draw some conclusions regarding your initial objectives. Update your business and marketing plans with the information collected from your market research. Furthermore, you can: get a landing page to capture your potential customers; convert potential customers into customers; discover content, schedule content, and manage all your social accounts; engage better with your stakeholders, improve their experiences, run periodic surveys, derive insights, and steer business growth; create engaging content about your business for your articles, blogs, and social accounts; and manage your finances.
https://medium.com/@skillz008/ways-to-grow-your-business-cfe1860b072
['Olajide Josh']
2020-12-23 22:25:21.646000+00:00
['Growth Mindset', 'Business Growth', 'Business Development', 'Business Growth Tips', 'Business Strategy']
New solar farm in Tooele County will deliver big on renewable energy
“Some of Utah’s big energy consumers will be getting a significant amount of power from a planned solar plant in Tooele County, helping them to reach their renewable energy goals much more quickly than planned. Thanks to approval from the Utah Public Service Commission of a renewable energy tariff for Rocky Mountain Power, the project is due to come online in 2023 and will be one of the largest solar energy generators in the utility company’s system. The six customers with commitments for the energy are: Salt Lake City, Park City, Summit County, Utah Valley University, Park City Mountain and Deer Valley ski resorts. The Elektron Solar project is owned and will be constructed by D.E. Shaw Renewable Investments in collaboration with Enyo Renewable Energy, a renewable energy developer based in Utah. Enyo and D.E. Shaw Renewable Investments will be responsible for generating more than 275 megawatt hours of energy in northern Utah via customer-driven solar capacity under construction beginning in 2021 and into 2022. A megawatt is a million watts of electricity. UVU President Astrid Tuminez said the Tooele solar plant will provide more than 90% of the campus’ electricity needs and put the university soundly on the path of being carbon neutral by 2050. UVU will receive about 23% of the project’s solar generation, said Frank Young, associate vice president of facilities planning for the university. The campus’ energy needs demand about 42,000 kilowatt hours on an annual basis. The Tooele County plant will generate about 40,200 of those hours needed for yearly operations, he added.” View the whole story here: https://www.deseret.com/utah/2020/11/30/21726903/elektron-solar-project-solar-farm-tooele-county
https://medium.com/@tonycowger/new-solar-farm-in-tooele-county-will-deliver-big-on-renewable-energy-9492d9291fa5
['Tony Cowger']
2020-12-04 01:36:52.810000+00:00
['News', 'Renewable Energy', 'Energy', 'Free', 'Solar Energy']
Is Amazon about to disrupt the fashion industry?
IMAGE: Esther Merbt — Pixabay (CC0) Amazon has just launched Made for You, which for just $25 allows you to order a completely custom-sized t-shirt via an app where you enter some information about your size, upload two pictures and define your preferences for collar, sleeve, length and fit. It's currently only available in the United States and only for t-shirts, made in the US with imported fabrics, but knowing Amazon, this service will soon be expanded both in product range and geographically. Just over three years ago, Amazon acquired Body Labs, a machine-learning company founded in 2013 and based in Manhattan, for an estimated price of between $50 million and $70 million. The company specialized in analyzing body shapes, had raised around $8 million in a previous financing round, and was developing custom avatars and calculating clothing sizes from videos and images. Now, the company's web page is gone and only shows an old form, ShapeX, where one could register and create an accurate model of one's body to be used when buying clothing online, but everything indicates that its technology is behind Amazon's announcement, and that it could be a warning sign for the fashion industry. Buying clothes through a sizing system is clearly a sub-optimal experience: even if the dimensions of all sizes were consistent across different brands (which they are not), it is clear that one size defines a certain body shape, which does not necessarily fit the user. In some cases, adjustments or modifications can be made to the garments, sometimes in the brand's own store, but the adaptation possibilities do not go much further. In the fashion world, the dilemma is between bespoke tailoring and off-the-peg clothes, with modular clothing (where you pick from a certain number of sizes and options for each part of the garment) in the middle. Taking measurements is a complex process: a tailor-made suit typically involves two or three measurement sessions, requires certain expertise, and is costly. How do things change when, for $25, you can have a custom-made t-shirt, with the simple requirement of sending a couple of pictures or making a series of simple movements in front of a camera for a few seconds? Obviously, the difference is in the manufacturing processes: with the manual pattern-making, cutting and sewing that most of the industry operates by, the only way to keep costs down is to manufacture garments with fixed sizes. Moving from this scheme of mass production, typically in countries with low labor costs, to the production of completely customized sizes on demand requires technology beyond the reach of most brands, and also has other effects, such as enabling near-shoring, even in developed countries, by reducing the use of labor applied to purely manual tasks and increasing that of technology and specialized skills. What does Amazon want? Simply, to develop a system that allows it to offer custom-made clothing at the same price as fast fashion. If it can create such a system, knowing Amazon, it will offer it as a platform to brands, which would make the company the hub around which the big names would revolve. In no time, buying a garment in size M or XL would make no sense. We would go to a store online or in person, choose an item, but receive a tailor-made version some time later. Clothing represents a very important share of our consumption, and Amazon has long wanted to enter that category. As a strategic move, the potential is enormous. 
If the first indication was the acquisition of Body Labs three years ago and the second is these t-shirts, the third, whatever it is, won’t be long in coming.
https://medium.com/enrique-dans/is-amazon-about-to-disrupt-the-fashion-industry-dcc8d1d68f18
['Enrique Dans']
2020-12-25 10:23:40.643000+00:00
['Retail', 'Fashion', 'Clothing', 'Machine Learning', 'Amazon']
Create Jira issue workflow step for Slack
Today we are excited to announce our latest feature to further reduce context switching and keep your team in sync. With our release this morning, you can now include Jira in the steps of your workflow in Slack. When Slack announced the open beta for workflow steps from apps, we knew immediately that creating issues in Jira would be a powerful addition for the 75% of workflow builders that are non-technical. This zero-code integration provided with Jira Integration+ will open up endless possibilities for teams working in Workflow Builder. How it works 1. Navigate to Tools > Workflow builder in Slack 2. Create a new workflow or choose our template "track a new bug" 3. Configure the project in Jira and map your fields 4. Publish your workflow 5. Check out the results in Jira; this works for Jira Cloud, Server, and Data Center. The possibilities are endless; what will you build? Let us know your ideas for workflows in Slack with integration to Jira. We can't wait to see what you build! Let us know what you are thinking in the comments below…
https://medium.com/@nextupai/create-jira-issue-workflow-step-for-slack-f8a01f71f2a2
[]
2020-10-07 16:39:11.332000+00:00
['Platform', 'Integration', 'Slack']
Food for Thought #3: Capitalizing on your potential (in 6 points)
Image by gordontredgold.com How often do you dream? It happens a lot to me, especially daydreaming. I genuinely enjoy getting lost in my thoughts, letting my imagination take over. Dreaming boldly lets your creativity thrive and opens a whole new world of opportunities. It allows you to draw any future you'd feel comfortable and happy with. But I believe the real power of dreams takes place when you actually achieve them. Why? Because that's when you realize what was once inconceivable can suddenly become your reality. It pushes you to be even more ambitious and achieve greater goals. Doors keep opening and nothing remains impossible… You don't need to be a genius, a visionary, or even a college graduate to be successful. You just need a dream and a framework — Foundr Each and every one of us can bring value to this world. The thing is, we tend to put people into boxes very quickly, making assumptions about their capabilities. By doing so, we prevent some of them from dreaming and from being aware of what they can actually achieve. However, history has shown multiple times that individuals with genuine passion, commitment and a profound willingness to make a change can overcome the biggest challenges. No matter what they're told, no matter who they are, no matter where they come from. Here, I'd like to offer my perspective on how to reach your full potential and achieve your dreams. 1) Direction is more important than speed I think the first step is going hard in trying to understand yourself and what you really want. Think about your own definition of success and happiness. Forget about what makes you look nice, focus on what makes you happy. Take the time to reflect on your values and stick to them. Take the opinion of others into consideration but don't let it become what drives you. Do things with passion and seek intrinsic motivation. From time to time, take a step back and ask yourself if what you're doing makes sense. If it doesn't, don't be afraid to start over and do things differently. Direction is more important than speed. Many are going fast but in a wrong direction — Nadeem Nathoo 2) Build your knowledge & skill set What are your biggest strengths? Try to identify what you (could) do well and capitalize on that. Build the knowledge and skills that will provide you with competitive advantages. Let your curiosity bring you to unsuspected places. Be intentional in keeping an open mind and stepping outside your comfort zone. Switch environments from time to time to gain perspective and benefit from the help of people who perceive things you can't. How can you be passionate about something you've never been exposed to? Try new things and don't be afraid to challenge the status quo. Never stop learning as you'll become more capable every day. If you're not curious, how are you expecting to learn? If you don't learn, how can you understand? If you don't understand, how can you have an impact? 3) Choose the environment(s) you want to evolve in Connect with like-minded individuals who spread positive vibes while challenging you in many ways. Be intentional in surrounding yourself with curious and ambitious people who will make you see the world differently. Join an environment where you feel like everyone around you is smarter, more capable and at the same time aligned with your values. Evolving in an aspirational environment will help you build healthy habits. That's how you reach high standards, that's how you grow. 
As Jim Rohn once said, "You are the average of the five people you spend the most time with". It can be for better or for worse; that's your choice. Environment is the invisible hand that shapes human behavior — James Clear 4) Leverage your communication skills Imagine having designed the most wonderful project and not being able to sell it because you failed to communicate your message efficiently and accurately. What a waste. Going hard in developing your communication skills is essential to thrive in today's world. See it as a process where people negotiate the perception they have of each other. No matter how good your ideas are, if you can't manage to connect with your audience, you won't get the expected result. First, when people talk, listen completely. Most people think about what they want to say and how they want to say it. Genuinely interest yourself in others, listen and try to understand their logic. Process the information, then make up your mind, believe in your message, and deliver it with passion. While making presentations, use the space, be confident, adapt your speech to the moment and be authentic. People feel that and consequently become more receptive. When writing emails, go straight to the point while being mindful of the tone. Prioritize clarity over complexity. Oh, and don't forget non-verbal communication. Even when you're not speaking, you're sharing a LOT of information, whether it's through facial expressions, posture, gestures, attitude, choice of clothes, lifestyle… They may forget what you said, but they will never forget how you made them feel — Carl W. Buechner 5) Just do it To me, the worst thing in life is having regrets. Many of us have a ton of great ideas but very few actually take action. Why? Well, the fear of failing, the perception of others, not being willing to leave the comfort zone. The problem is that if you don't try, if you don't dare, you'll never know. Don't avoid vulnerability, seek it out. Become comfortable being uncomfortable. Keep your goals simple and realistic. Do some planning. It doesn't have to be as fancy as most people think; it can be just a few words on a piece of paper or your phone, with a DEADLINE. Elon Musk famously said "If you give yourself 30 days to clean your home, it will take 30 days. If you give yourself 3h, it will take 3h. The same applies to your goals, ambition and plans." People remember what you did, not what you said. Just do stuff and, worst-case scenario, you learn a lot. Learning by doing makes you more capable, more prepared for what's coming next and helps you realize how much you can achieve. Don't call it a dream, call it a plan — Foundr 6) Adopt a Growth mindset A "growth mindset" is opposed to a "fixed mindset," which assumes our intelligence, capabilities and creativity are innate gifts. It's a deterministic view of the world. On the other hand, a "growth mindset" believes talents can be developed and sees failures as a means to grow. Carol Dweck, a Professor in Psychology at Stanford University, brilliantly explains in her book Mindset: The New Psychology of Success how living with a "growth mindset" can help you reach higher levels of achievement. She highlights the importance of being passionate, motivated and happy with your projects. She insists on embracing challenges rather than fearing them, persisting when faced with obstacles, seeking feedback to understand and improve, learning from others rather than envying them. You will fail a lot in your life, we all do. But choose how you perceive it. 
Failing means growing. Look at what you did wrong, re-adjust, make a plan and figure it out. Remember that the people who are the most successful are often the ones who have had the most failures.
https://medium.com/@shaan-madhavji/food-for-thought-3-capitalizing-on-your-potential-in-6-points-172be78fc498
['Shaan Madhavji']
2020-12-22 18:15:33.400000+00:00
['Growth Mindset', 'Potential', 'Tips']
Real-time Data Pipelines — Complexities & Considerations
ON DATA ENGINEERING Photo by Robin Pierre on Unsplash The shift towards real-time data flow has a major impact on the way applications are designed and on the work of data engineers. Dealing with real-time data flows brings a paradigm shift and an added layer of complexity compared to traditional integration and processing methods (i.e., batch). There are real benefits to leveraging real-time data, but it requires specialized considerations in setting up the ingestion, processing, storing, and serving of that data. It brings about specific operational needs and a change in the way data engineers work. These should be taken into account when considering embarking on a real-time journey. Use cases for leveraging Real-time Data Streaming data integration is the foundation for leveraging streaming analytics. Specific use cases such as fraud detection, contextual marketing triggers, and dynamic pricing all rely on leveraging a feed of real-time data. If you cannot source the data in real time, there is very little value to be gained in attempting to tackle these use cases. Besides enabling new use cases, real-time data ingestion brings other benefits, such as a decreased time to land the data, less need to handle dependencies, and some other operational aspects: If you don't have a real-time streaming system, you have to deal with things like, okay, so data arrives every day. I'm going to take it in here. I'm going to add it over there. Well, how do I reconcile? What if some of that data is late? I need to join two tables, but that table is not here. So, maybe I'll wait a little bit, and I'll rerun it again. — Ali Ghodsi on a16z Infrastructure for Real-time Data flows ClickStream Ingestion: Ingesting clickstream data often requires a specific infrastructure component to be present to facilitate it. Snowplow and Divolte are two open-source clickstream collectors. Meanwhile, Google Analytics 360 allows raw export of clickstream data to BigQuery, and some CDPs like Segment or Tealium allow capturing and exporting clickstream data to streams or databases. Ingestion framework: Frameworks such as Apache Flume and Apache NiFi, offering features such as data buffering and backpressure, help integrate data onto message queues/streams. "When I introduce Nifi to people, I usually say that Nifi is the perfect gateway to get the data in." — Pierre Villard, Senior Product Manager at Cloudera Message Bus / Streams: A message bus or stream is the component that serves to transfer the data across the different components of the real-time data ecosystem. Some of the typical technologies used are Kafka, Pulsar, Kinesis, Google Pub/Sub, Azure Service Bus, Azure Event Hub, and RabbitMQ, to name just a few. Processing: Different processing frameworks exist to simplify computation on data streams. Technologies such as Apache Beam, Flink, Apache Storm, and Spark Streaming can significantly help with the more complicated processing of data streams (a minimal sketch follows this list). Stream querying: It is possible to query streams directly using SQL-like languages. Azure Event Hub supports Azure Stream Analytics, Kafka offers KSQL, and Spark offers Spark Structured Streaming to query multiple types of message streams. Decision Engine: Real-time actions need real-time data and a way to process this information systematically. Decision engines help make the incoming flow of data actionable. There are two main types of decision engines: stateless (e.g., CLIPS, Easy Rules, Gandalf) and stateful (e.g., Drools). ML Framework + Processing: Machine learning models can be leveraged within a real-time architecture. They can help make better decisions by calculating scores, such as the propensity to fraud. Different types of frameworks exist with varying degrees of sophistication, such as XGBoost, TensorFlow, or Spark MLlib. Data Store: Depending on the specific integration needs, leveraging real-time data might require some fit-for-purpose data stores. A specific OLAP type of database such as Druid might be required to do slice-and-dice analytics on the incoming data, an HTAP datastore such as Kudu, Cassandra, or Ignite might be required for handling specific enrichments, Elasticsearch for needle-in-the-haystack types of queries, S3 for long-term archival purposes, an RDBMS, or even the stream itself (using Kafka directly, for instance). Query Federation: With such a diverse ecosystem of datastores, having the ability to query them using the same interface and tool becomes a growing need. Tools such as Spark and Presto provide this type of query federation. Dashboarding: Different types of dashboards are available to handle real-time use cases. While it is still possible to leverage traditional dashboarding solutions such as Tableau, solutions such as Grafana or Kibana are usually more appropriate.
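To make the "Processing" and "Stream querying" components above a bit more concrete, here is a minimal sketch of a Spark Structured Streaming job that reads a hypothetical page_views Kafka topic and computes a per-minute count of events per page. The topic name, the broker address, and the event fields are assumptions for illustration, not something prescribed by this article.

```python
# Minimal sketch: windowed counts over a Kafka topic with Spark Structured Streaming.
# Assumptions: a broker on localhost:9092 and a JSON-encoded "page_views" topic
# with "page" and "event_time" fields. Requires the spark-sql-kafka connector package.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json, window
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("page-view-counts").getOrCreate()

event_schema = StructType([
    StructField("page", StringType()),
    StructField("event_time", TimestampType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "page_views")
    .load()
)

# Parse the Kafka value bytes as JSON and keep only the event fields
events = raw.select(from_json(col("value").cast("string"), event_schema).alias("e")).select("e.*")

# One-minute tumbling windows per page, tolerating 5 minutes of late data
counts = (
    events.withWatermark("event_time", "5 minutes")
    .groupBy(window(col("event_time"), "1 minute"), col("page"))
    .count()
)

query = (
    counts.writeStream.outputMode("update")
    .format("console")  # replace with a real sink (e.g., a data store) in practice
    .option("truncate", "false")
    .start()
)
query.awaitTermination()
```

The same aggregation could be expressed in KSQL or Azure Stream Analytics; the point is simply that "query the stream" tooling keeps this kind of logic declarative.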
Ingestion Source of data There are different sources of data that can be leveraged in a real-time pipeline. Data can be sourced from external services, internal back-end applications, front-end applications, or databases, with the type of source dictating the available integration patterns. External Applications: There are different ways external applications might be integrated into a real-time pipeline. The typical ways rely on webhooks, creating a specific API consumer, or having them publish directly onto a stream/message queue in a "firehose" manner. Internal backend Applications: Internal back-end applications have quite a few ways to publish events to other applications: calling an API, connecting directly to a stream, or leveraging an integration SDK (see the sketch after this list). Front-end: Real-time event ingestion from the front end is typically handled by combining an event ingestion framework (e.g., Snowplow), a tracking pixel, and a tracking script. Besides allowing the capture of granular click data, this type of approach has the added benefit of bypassing some ad blockers. Database: To ingest data in real time from databases, it is possible to leverage the database bin logs. Database bin logs contain the records of all the changes that happened on the database. Bin logs have traditionally been used in database replication but can also be used for more generic real-time data ingestion.
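As a rough illustration of the "publish directly onto a stream" option for internal back-end applications, the sketch below pushes a JSON business event onto a hypothetical customer_interactions Kafka topic using the kafka-python client. The topic name, broker address, and event fields are assumptions for illustration only.

```python
# Sketch: a back-end service publishing a business event straight onto a Kafka topic.
# Assumes a broker on localhost:9092 and a "customer_interactions" topic (illustrative only).
import json
from datetime import datetime, timezone
from kafka import KafkaProducer  # pip install kafka-python

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda payload: json.dumps(payload).encode("utf-8"),
)

event = {
    "event_type": "customer_interaction",
    "channel": "email",
    "interaction": "WELCOME",
    "customer_id": "12345",
    "occurred_at": datetime.now(timezone.utc).isoformat(),
}

# Keying by customer_id keeps all events for one customer on the same partition,
# which preserves per-customer ordering for downstream consumers.
producer.send("customer_interactions", key=event["customer_id"].encode("utf-8"), value=event)
producer.flush()
```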
Infrastructure To ingest streaming data, including from the front end, several components are needed: A collector application, essentially an API that is there to receive data from the front end and from any back-end application that wishes to call it through a web service. A message broker to transport the data across applications in real time. A schema registry to validate the events coming in. A (typically front-end) SDK to send the information in a structured way to the event collection pipeline. A tracking pixel to track activity where Javascript might not (always) be enabled. Collector Application: There are multiple ways to set up the infrastructure for streaming ingestion. Basic collector applications can be set up in ~10 minutes (including schema validation) using low/no-code kinds of tooling. A more extensive setup can be obtained by leveraging open-source components such as Snowplow, for instance. Message Brokers: Message brokers come in different variations; some, for instance, are more oriented towards "stream" processing, more easily supporting the replay of events. This is the case for solutions like Kafka and Kinesis; they are particularly well suited to the computation of real-time aggregates and to architecture patterns such as the Kappa Architecture, a pattern that sources data directly from historical streaming data. The main drawback of this type of message broker is its poor "transactional" handling, which makes it ill-suited to high-value message integrations. For instance, if you were looking to integrate into a CRM system, it would be beneficial to know that a given order couldn't be pushed to the target system and that, after 10 retries, it had still failed to get through. More transactional types of message brokers, such as Service Bus, offer these kinds of features, for example transactional locks and dead-letter queues. Dealing with different types of messages also has implications for how the message broker should be configured. Message brokers such as Kafka can offer different "delivery guarantees," specifying whether a message will be processed at least, at most, or exactly once. Depending on the options chosen, this may lead to a higher likelihood of event duplication. For message brokers that are not strong on the transactional aspect, messages that are not getting processed correctly need to be routed to a separate stream and stored for further processing, in the hope that they will be processed correctly after the fact or "curated" ad hoc, which increases overhead.
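The dead-letter idea can be sketched roughly as follows: a consumer tries to process each message a few times and, if it keeps failing, parks the message on a separate topic for later inspection instead of blocking the stream. The topic names and the process_order function are illustrative assumptions, not part of any particular broker's API.

```python
# Sketch: retry-then-dead-letter handling around a Kafka consumer.
# "orders", "orders.dlq" and process_order() are illustrative assumptions.
import json
from kafka import KafkaConsumer, KafkaProducer  # pip install kafka-python

consumer = KafkaConsumer(
    "orders",
    bootstrap_servers="localhost:9092",
    group_id="order-sync",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)
dlq_producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda payload: json.dumps(payload).encode("utf-8"),
)

def process_order(order: dict) -> None:
    """Placeholder for the real integration call (e.g., pushing the order to a CRM)."""
    raise NotImplementedError

MAX_RETRIES = 10

for message in consumer:
    order = message.value
    for attempt in range(1, MAX_RETRIES + 1):
        try:
            process_order(order)  # real code would also back off between attempts
            break
        except Exception as exc:  # in real code, catch the specific integration errors
            if attempt == MAX_RETRIES:
                # Park the message on a dead-letter topic with some failure context
                dlq_producer.send("orders.dlq", value={"order": order, "error": str(exc)})
```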
Schema validation: With regards to schema validation, different types of tooling exist to help with the management of schemas and to enable schema validation in real time (e.g., Iglu, the Confluent Schema Registry). Kafka, for instance, has direct integration with a schema registry and provides a built-in way to do schema validation, in which case validation errors can be thrown directly to the producer, providing feedback as to the issues. Other approaches rely on a more asynchronous way of validating. SDK: An SDK is particularly useful to simplify the integration of data flowing through the real-time pipelines. This is particularly the case when dealing with front-end events. An SDK simplifies the integration of events into a common structure, automatically providing attributes to these events based on context (think of attributes such as whether or not the environment is a production one, what the userId associated with the events is, or what the IP of the client is). Many of these enhancements are already pre-built in SDKs such as Google Analytics', but quite a few data-driven companies take it upon themselves to develop their own to power their data collection initiatives. This is, for instance, the case of Slack, which created specific libraries for tackling front-end event data acquisition. Tracking Pixel: A tracking pixel provides a way to extend one's tracking off the site, where Javascript might not be supported. This is, for instance, the case in email clients, in which Javascript is very rarely supported. It is worth noting that the decision as to which components to include within the infrastructure is highly dependent on whether or not there is a specific need to track front-end clickstream data directly and whether there is an advantage in getting extra speed from leveraging an existing ecosystem. For instance, if there is only the need to source data from back-end applications, there might not be a need for an event collector, SDK, or tracking pixel. The back-end application can be responsible for validating its front-end component and integrating directly into a message topic/stream. Most message servers, such as Kafka, can handle concurrent writes from diverse applications and can therefore support receiving messages directly from multiple applications without the need for a centralizing "event collector." When looking to source clickstream data from the front end, applications such as Snowplow offer additional features already built in, such as geolocation lookup, tracking scripts and pixels, plugins for tools such as dbt to cover common types of processing on the data, or a simple hook onto Google Analytics' tracking to provide granular website visit data.
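A very small, hypothetical version of the kind of SDK helper described above might look like the sketch below: a single track() function that wraps whatever the caller sends into a common envelope with context attributes (environment, user id, client IP, timestamp) before handing it to the collector or broker. The field names and the sender callback are assumptions for illustration, not the article's API.

```python
# Sketch of an in-house tracking SDK helper: wrap every event in a common envelope.
# The envelope fields and the default sender are illustrative assumptions.
import json
import os
import uuid
from datetime import datetime, timezone
from typing import Callable, Optional


def track(
    event_name: str,
    properties: dict,
    user_id: Optional[str] = None,
    client_ip: Optional[str] = None,
    sender: Callable[[str], None] = print,  # swap in an HTTP call to the collector or a Kafka producer
) -> dict:
    """Build a structured event envelope and hand it to the configured sender."""
    envelope = {
        "event_id": str(uuid.uuid4()),
        "event_name": event_name,
        "environment": os.getenv("APP_ENV", "development"),
        "user_id": user_id,
        "client_ip": client_ip,
        "sent_at": datetime.now(timezone.utc).isoformat(),
        "properties": properties,
    }
    sender(json.dumps(envelope))
    return envelope


# Example call from application code:
track("add_to_cart", {"sku": "SKU-123", "quantity": 2}, user_id="u-42", client_ip="203.0.113.7")
```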
Most message brokers, such as Kafka, can handle concurrent writes from diverse applications. Therefore, they can support receiving messages directly from multiple applications without the need for a centralizing "event collector." When looking to source clickstream data from the front-end, applications such as Snowplow offer additional features already built in, such as geolocation lookup, tracking scripts, a pixel, plugins in applications such as DBT to offer common types of processing on the data, or a simple hook onto Google Analytics' tracking to provide granular website visit data. Event content Handle different types of events differently. It is good to differentiate between the different types of events, be they technical events, front-end events, backend "business events," or surrogate database update events. It is important to provide a structure that can easily differentiate these event kinds so that, as much as possible, their ingestion and their changes can be handled programmatically. Different types of events also need different kinds of information. Database events, to be fully traceable, require fields related to the date of creation and update. Front-end data might need specific attributes such as UserAgent, IP address, etc., to be enriched effectively (e.g., with geolocation or device information), or a client timestamp to calculate timings more accurately. They also have different storage needs; technical events most likely only need to be stored in hot storage (such as Elasticsearch) and cold(er) storage (such as S3), often only requiring summary metrics to be exposed into "warm" storage, while business events and database entity changes most often need to be stored in warm storage (like a data warehouse). Standardization and Planning To minimize the work needed to maintain the real-time dataflows, it is important to take several steps towards a schema definition and to establish data contracts between the originating applications and the consuming applications, such as the data warehouse. In this context, special attention needs to be paid to the event structure. Setting up the right event structure can avoid extra work, make the data more accessible, and improve the general data quality. The data planning: This is the first step towards defining a common understanding of the data that is going to be sent. This step makes sure that the requirements are going to be fulfilled by the data being sent. The added visibility allows the appropriate downstream table structures to be created properly, the data quality checks to be implemented, and the right inferences to be drawn from the resulting data. It is important during the data planning phase to understand the different business processes around the data; this is needed to provide the right insights as to how to process it. Walmart, for instance, has described this kind of business-process review as part of its preparation for an event-driven data architecture. Structuring Event Schemas: There are multiple trade-offs to handle when defining schemas for events: their generalization vs. specificity, how the information contained within them will ultimately be processed and accessed, the flexibility of adding new information to the messages without impacting running processes, and the adaptability of messages to future requirements. Although created for a completely different purpose (the use of structured data for SEO), the schemas provided by schema.org often provide a good starting point for the structure of an event and its attributes; a sketch of such an event is shown below.
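As an illustration only (field names loosely inspired by schema.org's Order type; they are assumptions, not a prescribed contract), an order event could look something like this:

# A hypothetical "order purchased" event, loosely inspired by schema.org's Order type.
order_purchased = {
    "event_name": "order_purchased",
    "event_id": "5f2c9a1e-0000-0000-0000-000000000000",  # unique id for deduplication
    "client_timestamp": 1605000000000,                    # epoch millis on the producer side
    "environment": "production",
    "actor": {"user_id": "42"},            # who performed the action
    "object": {                            # what the action applied to
        "order_id": "A-1001",
        "order_total": 59.90,
        "currency": "EUR",
        "items": [                         # nested structure, see the note below
            {"sku": "SKU-1", "quantity": 2, "unit_price": 19.95},
            {"sku": "SKU-2", "quantity": 1, "unit_price": 20.00},
        ],
    },
    "extra_data": {"coupon_code": "WELCOME10"},  # loosely-typed overflow bag
}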
Depending on how the data will be processed, there should be some consideration as to whether to allow deeply nested structures within the schema, or whether it would be better to flatten them or to promote some of their key attributes to the root of the event. SQL, for instance, is not particularly friendly when dealing with deeply nested structures. Providing an extra_data property within event schemas lets the schema itself keep a strict structure for the properties that need to be generalized across multiple sources or events, or that need strong enforcement, while less critical properties can be placed in that looser bag. Event Generalization: As the amount of data and its complexity grow, it is important to make sure that data is sent generically to mitigate the downstream impact of changes. Generalization makes it easier to apply enrichments or processing downstream without too many adaptations, for example to calculate A/B test experimentation results out of the ingested data. It also makes it easier to analyze directly from raw data; imagine the comparison between having to deal with one event called "customer_interaction", containing all channel interactions (channel in: email, phone, SMS, in-store…) and types of interaction (e.g., WELCOME, PURCHASE…), versus having to go through tens of events like SMS_WELCOME or STORE_ADVICE. Having that many particular events makes it much harder to analyze the data and to ensure that every channel/interaction type is included in the analysis. Depending on the context, the implementation of this type of event generalization typically happens either through the setup of specific contracts (specialized schemas/APIs) or through the incorporation of the specialized target entities within a specific logging framework / SDK. The importance of a specific logging framework is not to be underestimated: it provides the starting point towards enabling and enforcing specific event structures. Naming conventions: Naming conventions help people get a quicker and better understanding of the data and assist the downstream processing of events and ad-hoc analysis. They apply both to the event fields and to the content of these fields. Take, for instance, the field utm_campaign. The field name itself indicates that it relates to the campaign that brought the visitor to the website. Its content, a string, is likely as well to be governed by naming conventions. A utm_campaign parameter such as ENNL_FB_Retargeting_Cat_BuyOnline can, for instance, indicate the language of the campaign (EN), the country of targeting (NL), the platform of targeting (FB), the mode of targeting (re-targeting, in contrast, for instance, with demographic or interest targeting), the mode of delivery (Catalogue ads on FB), and the type of the campaign (BuyOnline). In the above example, these naming conventions make the information easier to consume than a mere identifier would be, and they do not require some metadata table to be loaded in order to be interpretable. Event Grammar & Vocabulary: Building an event grammar is the next step to make event data more understandable and to increase the data's degree of generalization. The event grammar describes the different interactions and the different entities. The Activity Streams standard provides a structure and vocabulary for capturing different events' "Actions".
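To make the naming-convention example above concrete, here is a rough sketch of how such a utm_campaign string could be decoded; the position-to-meaning mapping is taken from the example itself and is an assumption, not a general standard:

def parse_utm_campaign(utm_campaign: str) -> dict:
    """Decode a campaign name like 'ENNL_FB_Retargeting_Cat_BuyOnline'."""
    locale, platform, targeting, delivery, campaign_type = utm_campaign.split("_")
    return {
        "language": locale[:2],         # EN
        "country": locale[2:],          # NL
        "platform": platform,           # FB
        "targeting_mode": targeting,    # Retargeting
        "delivery_mode": delivery,      # Cat (catalogue ads)
        "campaign_type": campaign_type, # BuyOnline
    }

# parse_utm_campaign("ENNL_FB_Retargeting_Cat_BuyOnline")
# -> {'language': 'EN', 'country': 'NL', 'platform': 'FB', ...}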
Technical Events Technical events often need to be ingested and surfaced so that the engineering team has access to the proper log data to debug their code. Sometimes, given the type of integration and the nature of the application, it might be necessary to leverage historical logs and look back at what happened a month or a year ago for a given person or process. This can be the case when dealing with a subscription type of product: questions such as "was consent given to store a payment token?", "did it end up getting saved properly?", or "what was its expiration date?" may arise when dealing with a payment failure now. Some developers advocate logging everything that happens within an application, including traces and unique sequence numbers, capturing every hop's meta information, such as how long a specific request took. The way technical events tend to be accessed, stored, or processed does not usually require the same strictness of structure that business or database change events would require, instead favoring the ability to search through them for specific occurrences. These requirements should be reflected in a more flexible schema and in less strict field-level validations than for business events. Front-end Data and tie into Google Analytics When looking to capture front-end events, a question that often comes up is how to ensure that we can capture and gain access to all the raw events pushed to Google Analytics. There are different approaches to achieving such a goal, each with a different tradeoff between flexibility and integration cost. Depending on the approach taken, leveraging streaming data ingestion from the front-end may duplicate work or may end up with some events not being propagated to the data lake. Google Analytics 360: Provides a way to export the raw event data from Google Analytics in near real-time. Google Analytics 360 has a direct integration with BigQuery, making the data directly and easily accessible. This approach has the same restrictions regarding what data can transit through the integration (e.g., no PII) as Google Analytics, and it requires the upfront cost of the Google Analytics 360 suite (~$150k/year). Google Analytics Custom Task: Integration patterns such as creating custom tasks for Google Analytics, to add another receiver for the data, exist and allow hooking into Google Analytics, but they provide less flexibility in terms of sending additional attributes to the backend and they put the tracking script under the same consent restrictions as Google Analytics. Google Analytics tracking scripts are furthermore often blocked by several adblockers, providing poor event coverage.
Tag Manager-ish: A middle-ground approach, such as using a Tag Manager, a custom tracking script, and leveraging the same Data Layer as Google Analytics, provides added flexibility in terms of the selection of attributes and of how to deal with GDPR consent restrictions, at the cost of added configuration in the Tag Manager. A similar approach can also be taken server-side when leveraging a Customer Data Platform such as Segment, which allows routing the data to specific destinations such as Google Analytics and a database. Specific Integration: Creating a separate integration provides the most flexibility of all and decreases the likelihood of having the tracking data blocked by adblockers, but comes at the cost of an increased integration effort. Business Backend Events For "business" backend events, i.e., specific "factual" events, it is important to leverage some type of standardization. When integrating specific types of data flows, leveraging a common unified or canonical model helps secure a durable integration dataflow for each type of event. For instance, an e-commerce website might be interested in specific events such as Order Purchased or Order/Item Shipped. These events can be standardized, to a certain extent, across different solutions into canonical models. This standardization also applies to the minimum set of attributes provided for each event, such as the specific shop id, for instance. Other types of information are usually necessary when pushing these events and needing to correlate them across time, particularly in a distributed application context. It is important to make sure not to rely solely on a created_at timestamp generated by the applications or by the event processing pipeline, but rather to leverage sessions/actions and sequence numbers. Relying on a created_at timestamp can lead to wrong assumptions about ordering in a distributed context. The impact of relying directly on the created_at timestamp is highly dependent on 1) the data velocity of the application, 2) the overall ingestion speed, and 3) the technical infrastructure setup, for instance whether session affinity has been configured. Database change events Database change events should be provided in an event-programming, CDC-like way. There are a few reasons for leveraging an event programming approach to CDC, rather than leveraging bin-logs directly (typical CDC): decoupling the implementation of the operational data store from the dataflows, and providing additional information that we might not want or need stored within the operational datastore. In this approach, the underlying database entity changes end up being reflected in the data platform through event operations such as "creation", "update", and "deletion". Therefore, there should be a separation between the event name, the target database entity, and the type of change being applied; a sketch of such a change event is shown below.
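As a purely illustrative sketch (the entity and field names are assumptions rather than a prescribed contract), such a change event could look like this:

# A hypothetical "customer" change event emitted by the owning application,
# keeping the target entity separate from the operation applied to it.
customer_changed = {
    "event_name": "entity_change",
    "entity": "customer",              # which database entity was touched
    "operation": "update",             # creation / update / deletion
    "sequence_number": 8812,           # ordering within the entity's stream
    "payload": {                       # snapshot or changed fields, see below
        "customer_id": "C-123",
        "email": "jane@example.com",
        "created_at": "2020-11-02T09:14:55Z",   # taken from the database record
        "updated_at": "2020-12-01T17:03:12Z",
    },
}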
The way updates are performed on the databases should be reflected consistently, i.e., the fields propagated as events should either contain only the changed fields or a holistic view of the entity (the last snapshot), or provide a clear indication of how they should be processed. The logging framework / SDK should force the inclusion of certain data fields, such as the "created" and "updated" timestamps originating from the database entity. Having a consistent set of operations, together with these different timestamps and means of integration, allows the changes originating in the different domains to be propagated and integrated back onto the data platform. Control structure There needs to be a certain level of control and checks (Data Ops) to leverage events and ensure a decent level of data quality. End-to-End testing When leveraging real-time ingestion, it is crucial to put the right safeguards in place to avoid regressions in the data feeding into the data platform. End-to-end tests encompassing the data production journey, from the originating application to the datastore, should be performed to safeguard the quality of the data ending up in the data platform. Validations, Data Quality checks, and Data Monitoring Handling data pushed through events requires some validation to be in place. Both schemas and attributes should be validated. Schemas can be validated through native message broker capabilities (e.g., Kafka) or through specific applications (e.g., Snowplow); validating attributes is a more complex affair. We need to distinguish static attribute validation, which can be included in a schema definition such as AVRO, JSON Schema, or Protobuf, from more dynamic types of attributes, which require another form of validation altogether. To a large extent, in dynamic validation, most of the onus for validating the data should be on the originating application. Nevertheless, a process should be set up on the receiving end (i.e., the data platform) to ensure that the data conforms to expectations. To ensure that the data platform can perform its own share of the validation process, it is important to have specific rather than generic definitions of the different events being sent, for instance to ensure that a field truly corresponds to its content (e.g., no order_id ending up in a cart_id field). Data validation and monitoring do not stop at schema and attribute validations. Automated checks should be performed to identify, for example, whether certain attributes haven't been sent for a period of X days, or whether referential integrity holds between the different events. Regressions happen in code, and a change in an application might not have been properly communicated or handled; setting up a proper control structure helps minimize the impact. When dealing with event-type sources, validations and checks should happen in multiple layers: at the originating source, to ensure that the data conforms to expected values and/or the business logic; at ingestion time, to ensure that the schema is as expected and that the data platform is not receiving unexpected events; and post-ingestion, checking for things such as referential integrity, as these assumptions end up being relaxed during the transfer of data, or running lifecycle checks to ensure that all the events were received for a given series of actions. A sketch of such a post-ingestion check follows.
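A minimal sketch of a post-ingestion freshness check; the table and column names are hypothetical, and the warehouse connection is assumed to be exposed through a psycopg2-style DB-API cursor:

from datetime import datetime, timedelta

def check_attribute_freshness(cursor, max_age_days: int = 3) -> bool:
    """Alert when no 'order_purchased' event carrying a coupon_code
    has arrived within the allowed window."""
    cutoff = datetime.utcnow() - timedelta(days=max_age_days)
    cursor.execute(
        """
        SELECT count(*)
        FROM events.order_purchased
        WHERE coupon_code IS NOT NULL
          AND ingested_at >= %s
        """,
        (cutoff,),
    )
    (recent_rows,) = cursor.fetchone()
    if recent_rows == 0:
        # In practice this would page someone or open an incident.
        print(f"No coupon_code seen since {cutoff:%Y-%m-%d}; possible regression upstream.")
        return False
    return True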
Processing Real-time processing Stream / Message Enrichment Streams typically need to be enriched to provide additional data meant to be used in real time. Enrichment steps can do lookups against additional services or databases, perform first-stage ETL transformations, or add machine learning scores onto the stream (see, for instance, the list of enrichments available for Snowplow). Enrichment of messages typically happens through a producer/consumer or publisher/subscriber type of pattern. These enrichment applications can be coded in any language and often do not require a specific framework. Although specialized frameworks and tooling exist, such as Spark Streaming, Flink, or Storm, for most use cases a normal service application would perform adequately without the overhead, complexity, or specific expertise of a streaming computation framework. Stateful Enrichment and Cleanup Stateful enrichment and cleanup of the data might be needed before it can be used downstream. Stateful enrichment: Event-driven applications might need to consume data enriched with historical data (i.e., state). Think of a potential trigger providing you a discount to purchase a product if you have visited the website at least three times in the past 24 hours without ordering. The downstream application that will ultimately decide whether to offer you the discount will need to know that you are currently visiting the website (real-time event), your history of visits (X visits in 24 hours), and your order history (whether you have ordered in the past 24 hours) to decide whether to offer a discount or not. Stateful cleanup: This can be the case when attempting to use customer data coming from multiple sources in CRM systems that want to leverage a 360 view of the customer, for instance to power contextual marketing triggers: "The success of a digital business is all about being relevant to the customer at every interaction. Relevance is contextual. Therefore we start with a fundamental requirement of being real-time and be able to respond to events in the customer's timeline rather than the marketing campaign timeline. That's why the Customer 360 golden record must be made into a Customer Movie (timeline events by each unique customer) and that becomes the core of event-driven data architecture" — Lourdes Arulselvan, Head of Data Architecture at Grandvision, Former Product Manager Decisioning at Pegasystems In that specific case, some initial merging and unification of customer data might be required before feeding into downstream applications. A more thorough processing could happen afterward, offline/in batch, propagating the different applications' changes. Stateful deduplication: Some message brokers offer an at-least-once delivery option, creating the need to deduplicate events. Depending on the specific option chosen, some solutions such as Azure Service Bus offer a native message deduplication option, while others might require external stateful deduplication. Aggregations There are two main use cases for performing aggregations on data streams: providing up-to-date real-time analytics data, for example through dashboards, and speeding up the downstream (typically batch) computations for very large datasets. These operations can typically be handled through applications such as Spark (see Spark Structured Streaming) or Presto, which can perform time-window aggregations; a sketch of such an aggregation follows.
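A minimal sketch of such a windowed aggregation, assuming PySpark Structured Streaming, a Kafka topic named "events", and an event_time field in the payload; adjust to your own schema:

from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json, window
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("realtime-aggregates").getOrCreate()

schema = StructType([
    StructField("event_name", StringType()),
    StructField("event_time", TimestampType()),
])

# Read the raw Kafka stream and parse the JSON payload.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "events")
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Count events per type over 5-minute windows, tolerating 10 minutes of lateness.
counts = (
    events.withWatermark("event_time", "10 minutes")
    .groupBy(window(col("event_time"), "5 minutes"), col("event_name"))
    .count()
)

query = counts.writeStream.outputMode("update").format("console").start()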
Rule Engines, Complex Event Processing, and Triggers Rule engines, complex event processing (CEP), and real-time triggers help convert the collected data into operational intelligence. There are different types of rule engines; CLIPS, Pyke, Pyknow, and Drools are just some of the open-source rule engines available. Rule engines come in different flavors and support different standards and languages. Some are stateless, some are stateful; they can support rule languages such as OPS5, Yasp, CLIPS, or JESS, their own language constructs, and standards such as RuleML or DMN. Business rules management systems that leverage the DMN standard can benefit from a wide set of editors, which allow modeling, visualizing, and exporting the decision logic in the DMN format without requiring code to be written for its implementation, thus allowing for stronger collaboration with analysts and the business in the implementation of complex event-triggered logic. The drawback of the DMN format is that it relies on stateless computation.
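As a rough illustration of the kind of event-triggered logic described above, written in plain Python rather than any of the engines named; the thresholds and field names are assumptions tied to the discount example from the enrichment section:

def evaluate_discount_rule(event: dict, state: dict) -> bool:
    """Stateless rule evaluated against a real-time event plus pre-computed
    state (visit and order history) supplied by an enrichment step."""
    is_visit = event.get("event_name") == "page_view"
    visits_24h = state.get("visits_last_24h", 0)
    ordered_24h = state.get("has_ordered_last_24h", False)
    return is_visit and visits_24h >= 3 and not ordered_24h

# evaluate_discount_rule(
#     {"event_name": "page_view", "user_id": "42"},
#     {"visits_last_24h": 4, "has_ordered_last_24h": False},
# )  -> True: downstream, this would emit an "offer_discount" trigger.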
https://medium.com/analytics-and-data/real-time-data-pipelines-complexities-considerations-eecad520b70b
['Julien Kervizic']
2020-12-22 19:43:30.724000+00:00
['Data Engineering', 'Big Data', 'Data', 'Database', 'Streaming']
Part two: Secure remote working or secure remote tech?
People decision trees As we discussed in part one, there is no one size fits all when it comes to a “secure” remote worker. You can load up as much secure tech as you want, but if your workforce doesn’t understand how to work securely in their new distributed world, you may as well not bother. The bad news is; there’s no quick fix. As with any decision, there are extenuating factors to consider and not just which tech works with which device. Some elements are personal circumstances. Remote preparedness As you’ve discovered, many of your workforce will never have worked from home, but have you thought about whether they have the right conditions to do so? Laptop sales grew 2.8% in the second quarter of 2020, highlighting the scramble for suitable work from home equipment. I think we can safely say we knew they needed the right equipment if they didn’t already have a portable corporate device. I know of a few companies who began to prepare for lockdown by surveying remote working suitability, quite early in the year. I’ve not seen one of these surveys so unfortunately, I’m unable to comment on their thoroughness into circumstance. Circumstances, to name a few Let’s delve into a few extenuating factors for “secure” remote preparedness. Living conditions: Do they live in a suitable working environment? Is it a shared property, with shared WiFi and communal areas? Do they have access to a practical, comfortable and private location to do their work? Can they work from home safely (and I mean physical safety)? Are they able to manage childcare whilst schools are closed? Finances: Do they have a partner who’s suddenly out of work, meaning cash is a bit tight? Do they have the home resources to effectively and efficiently work? Can they manage the increase in utility costs by redirecting commuting costs into household bills? Are they able to sustain themselves without a subsidised canteen or the team biscuits? Do they have suitable internet connections, that aren’t limited by fair usage, without additional cost? Cohabitation: Do they live with anyone who would pose a risk to them or the business? If you require background checks, how do you manage relationships which may gain access to company data? Are they able to carry out their work away from cohabitees? Education/knowledge: Are they able to carry out the tasks independently, away from a support network? Do they have the cognitive skills to adapt? Do they have the physical tools and the knowledge of their use to work effectively? Do they have refreshed data protection and secure working knowledge, in light of the changes made to working conditions? Mental wellbeing: Are they able to cope with the isolation of remote working? Are they going through tough or challenging times that can have an impact on their mental wellbeing? Are they scared/anxious/nervous about new ways of working, their job security or the global situation? The list is endless! Do any of these ring any bells from when you’ve previously encountered an accidental or malicious insider? Data and system access controls We wouldn’t be doing our job correctly if we weren’t all over data controls and who has access to what, and for what purpose. As part of the remote working transition, did you make any reviews to these controls? If you didn’t, I hope the cogs are now turning. 
Let me give you a few scenarios: Worker: Female, aged 19, Call Centre Operative Living Conditions: 6m x 5m room in a shared house with shared WiFi and bathroom Finances: a 20% drop in household income, increased living expenses Cohabitation: Shares room with a furloughed partner who has a higher than average amount of high-interest debt Education/knowledge: Often used as an example of how to perform tasks and communicate with customers. Carried out data protection and information security refresher 10 months ago Mental wellbeing: Due to the stress of increasing bills and a loss of income, her partner is becoming increasingly agitated. They live in a small room and are unable to distance from each other while in lockdown. As a Call Centre Operative, she has access to customer data, such as name, address, email, direct debit information. She accesses the systems through shared WiFi, using the corporate VPN. Your worker doesn't always lock her screen when visiting the bathroom or grabbing a drink; after all, she's working from her room, the only other person in there is her partner, and they overhear her work calls anyway. She reuses the same password for her work and personal accounts. What could possibly go wrong? Worker: Male, aged 26, Marketing Executive Living Conditions: 2 bedroom flat, private facilities and connections Finances: Maintained salary and household income Cohabitation: Lives with a roommate of a similar income bracket Education/knowledge: Talented in his field, known as a little workshy but makes up for it with creative ideas. Was due to take information security refresher 2 months ago Mental wellbeing: Is enjoying getting up late, but misses social connections and the team snacks As a Marketing Executive, he often receives creative assets through file transfer systems, from Creative Agencies. The company doesn't have a preferred file transfer system. He and his roommate often work to music and play office Olympics in their living room (home office). He has access to company social media channels, via a shared login, as well as other web assets. With private WiFi, and no systems requiring the use of a VPN, he rarely connects; in fact, the VPN sometimes causes issues when using some of the sites he needs for work. What could possibly go wrong? Worker: Female, aged 40, Head of Department Living Conditions: 3 bedroom suburban house, private facilities and connections Finances: Maintained current salary and household income Cohabitation: Lives with a long-term partner and their 5-year-old child Education/knowledge: Well regarded by team, peers and stakeholders. Carried out data protection and information security refresher 3 months ago Mental wellbeing: Is adapting well to remote working, it's not the first time she has worked from home; however, it is the first time with a house full As a Head of Department, she has access to confidential staff data, such as name, address, email, timesheets. She can also access internal finance systems, and approve purchases and funds transfers up to the value of £50,000. Systems are accessed through private WiFi, using the corporate VPN. During her evening bath, she often lets her 5-year-old watch YouTube on her corporate machine, while her partner goes for a run. Desktop notifications are enabled for email and instant messaging. What could possibly go wrong?
Worker: Male, aged 33, First Line Support Living Conditions: Living at home with parents Finances: A 66% drop in household income Cohabitation: An only child, living at home with parents who've both been made redundant Education/knowledge: Due to the job, feels he has an excellent understanding of secure working and IT systems. He carried out data protection and information security e-learning upon joining the company two years ago. Mental wellbeing: With both parents out of work, there is mounting pressure to support the household financially. With limited social interactions, other than through digital means, he begins to feel overwhelmed and isolated. As First Line Support, he has access to staff data, such as name and email, and to some admin accounts through a shared login. Very familiar with using the right tools, such as the VPN to access support queues. He spends most of his time in his bedroom, away from the family and distractions, other than a gaming PC. At the moment, the family don't have enough food to sustain them as they usually would; they are mainly surviving on cereal and water until his parents' Universal Credit kicks in. What could possibly go wrong? Risk assessing secure remote preparedness It's near on impossible to risk-assess preparedness if you don't have an understanding of your workforce. Look back to the early days of school closures, when teachers remained at work for the children of key workers or vulnerable children: those at risk were highlighted, and even that didn't secure their safety. According to Sky, lockdown saw a 53% increase in child abuse cases. We know that fraud also went through the roof, along with domestic violence. Statistically, one of your team is at risk of poverty, violence or worse. Do you know how far they'd go to survive? How can anyone on the brink of existence behave securely? So where do you start? Firstly, this isn't a team project; this is a business-wide project. This is where collaboration is vital. HR/P&C must be involved, DP must be involved, Security must be involved, IT must be involved, Risk must be involved; everyone has a responsibility for information security, data protection, employee wellbeing and more! A platform for assessment Here's an example of what a "People Security" decision tree could encompass but, again, the possibilities are endless and the responsibility belongs to everyone! Whilst the current situation did seem to be sprung upon us, Bill Gates did call it back in 2010. From the outside looking in, we did seem to have lost touch with our dusty, filed-away Business Continuity Plans, and I do wonder if these plans considered the interpersonal, economic and social impact on our teams. The moral of the story I guess it's 'Seek to understand before you are understood'. To create a secure remote worker, you must understand the worker. What motivates them, what hinders them, their knowledge and understanding, and their ability to roll with the enforcement of change.
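To make the "People Security" decision tree idea above slightly more concrete, here is a rough, hypothetical sketch of how such factors might be scored; the factor names and weights are my own assumptions rather than an established framework:

# Hypothetical factors drawn from the scenarios above; weights are illustrative only.
RISK_WEIGHTS = {
    "shared_wifi": 2,
    "shared_logins": 3,
    "financial_pressure": 3,
    "training_overdue": 2,
    "others_can_see_screen_or_calls": 2,
    "wellbeing_concerns": 3,
}

def people_security_score(worker: dict) -> int:
    """Sum the weights of every factor flagged for a worker."""
    return sum(weight for factor, weight in RISK_WEIGHTS.items() if worker.get(factor))

call_centre_operative = {
    "shared_wifi": True,
    "financial_pressure": True,
    "others_can_see_screen_or_calls": True,
    "wellbeing_concerns": True,
}

# A higher score simply means "talk to this person sooner", not "suspect them".
print(people_security_score(call_centre_operative))  # -> 10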
https://medium.com/@infosecjem/part-two-secure-remote-working-or-secure-remote-tech-8155d2bdbee3
['Jemma Davis']
2021-01-18 13:57:46.476000+00:00
['Information Security', 'Cybersecurity', 'Cyber Security Awareness']
Ignite’s Dual Token “Segregation” Strategy
Ignite’s Dual Token “Segregation” Strategy Why we are doing it, how it works and what it means for you… Before we begin, we need to clear one thing up: IGNITE FULLY INTENDS, AND HAS ALWAYS INTENDED, TO HAVE AN EXCHANGE-LISTED, TRADEABLE TOKEN. Good, that’s that out of the way… Back in the dim, distant and murky past of Ignite’s development (about 6 months ago!), we took the decision to make it so that the IGNT token would not be tradeable on exchanges — surely suicide when trying to raise capital via a token generation event, right?!?!?! Not quite… The decision we took was one of pragmatism, regulatory compliance, commercial and operational advantage and, if we do say so ourselves, pretty darn clever all in all! During Ignite’s crowdsale, willing participants will be able to purchase IGNT tokens, these being necessary to access the Ignite RATINGS platform and avail oneself of all the various benefits that entails, or may entail in the future (more on that later). As described in greater detail in the Ignite White Paper the IGNT token acts as a “proof of membership” and “proof of stake” within the Ignite ecosystem — it identifies you as a member of the Ignite HIVE and it rewards you for participating in the Ignite RATINGS ratings process. This token is purposefully designed to be non-tradeable which, among other things, allows us to push the Ignite platform out to the world without emerging regulatory pressures which might otherwise affect the token itself also having a knock-on effect on the project timeline. We live in uncertain regulatory times and this approach permits us to plan for the unknown that is impending global regulation by separating out, as much as possible, those aspects of the token that might require Ignite to introduce additional compliance measures, such as identity verification. All of this is of clear benefit to our members and contributors. In creating Ignite, we are obviously hoping that every single IGNT token purchased will be forever deployed on the platform, but to believe that this will be the case would be incredibly naive. We recognise that Ignite’s users require the ability to liquidate their holdings and, additionally, that Ignite requires a way to acquire new users and, with a capped supply, realistically this should be via the secondary market. Accordingly, we will be creating a second token — tentatively named “IGNITEX” (we’ll use that for now…) — which will be listed on exchanges, will be tradeable, but will not permit the holder to interact with the Ignite RATINGS platform (other than to repurchase IGNT), and will not confer any additional benefit on the holder. IGNT and IGNITEX will be pegged to each other and readily interchangeable, one for the other. IGNITEX will be made available to coincide with our first exchange listing and is not dependant on a full platform roll-out. Token Supplies, Market Caps and Pricing Strategies… This guy gets it! :) The creation and deployment of a segregated token strategy allows us to address another concern which, somewhat surprisingly, seems to be prevalent among potential contributors and which we did not anticipate being an issue in the slightest — that being the perception that the IGNT token is “too expensive”. This stems from the fact that the Ignite project has been pegged to ETH since its value was circa $240 and, as such, has seen the “price” per IGNT climb over the last six months to $6–7 per IGNT (0.00667 ETH) at the current rate as at the time of writing, give or take. 
In response to these rises, and to preserve the tokenomics of the project, Ignite has repeatedly slashed its maximum initial circulating supply: from 60,000,000 to 40,000,000; from 40,000,000 to 25,000,000; from 25,000,000 to 20,000,000; from 20,000,000 to 15,000,000; to where we are today…10,000,000 IGNT. However, the perception that the per unit price of IGNT is “too high” persists, despite there being limited supply, no minimum purchase amount and 18 beautifully-crafted decimal places for people to play with…funnily enough, even some exchanges prefer more fractionally-priced tokens as, for purely psychological reasons, they tend to attract greater volumes of trading activity. Who knew?! IGNITEX will allow us to address these perceptions and concerns by introducing a factoring element into the exchange rate between IGNT and IGNITEX which, although not part of the underlying reason for creating IGNITEX at all, is a welcome spin-off from the segregation strategy. We have not locked down this rate yet, but all things being considered, and having taken on board the views of our community, we believe that a factor of 100x should be sufficient. So… 1 IGNT = 100 IGNITEX 1 ETH = 149.925 IGNT = 14,992.5 IGNITEX 1 IGNT = $6.667 (at an ETH/USD exchange rate of $1,000) 1 IGNITEX = $0.06667 The Wrap-Up… Being an IGNT holder entitles you to participate in the ratings process and benefit from the rewards generated; it is also intended to provide access to additional benefits over time, such as preferential “early access” pricing for pre-ICO projects, introductory and/or preferential rates for partner exchanges, access to exclusive tools and features (including Ignite’s own smart-routed trading suite), the Ignite debit card, exclusive HIVE-member loyalty promotions etc. Being an IGNITEX holder entitles you to: 1) sell IGNITEX on an exchange; or 2) use it to purchase IGNT, thus gaining full access to the Ignite ecosystem, and the benefits that come with it. Discussions with exchanges for the listing of IGNITEX remain ongoing and subject to duties of confidentiality. And that, as they say, is that. Should anyone have any queries regarding our token segregation strategy, please feel free to join our Telegram Group (if not already a member), and ask your question there. Alternatively, please email info@igniteratings.com. Shameless plug to end with: Ignite is currently offering a 10% discount on purchases of IGNT made before 18:00 UTC on 29 January 2018. The token sale contribution address can be found on the Ignite website.
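As a quick, illustrative sketch of the 100x factoring and pricing maths described above (the factor is not locked down and the ETH rate is simply the example used in this post, so treat the numbers as assumptions):

FACTOR = 100            # proposed IGNT -> IGNITEX factor (not final)
ETH_USD = 1000.0        # example rate used above
IGNT_PER_ETH = 149.925  # i.e. 1 IGNT = 0.00667 ETH

def ignt_to_ignitex(ignt: float) -> float:
    return ignt * FACTOR

def ignitex_usd_price() -> float:
    return (ETH_USD / IGNT_PER_ETH) / FACTOR

print(ignt_to_ignitex(1))                  # -> 100 IGNITEX per IGNT
print(round(ignitex_usd_price(), 4))       # -> ~0.0667 USD per IGNITEX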
https://medium.com/igniteratings/ignites-token-segregation-explained-dac4b298a829
['Damon Barnard']
2018-10-01 12:10:00.613000+00:00
['Ethereum', 'ICO', 'Cryptocurrency', 'Bitcoin', 'Blockchain']
The Night of the Severed Chords
Light footfalls proceeded in rhythmic unison along the corridor. The carpeted floor displayed multi-colored, abstract, symmetrical patterns. The pair of feet traveling upon it made their way to a shut door. Pausing only a moment, the feet shuffled in as the door was flung open in silence. It closed behind them. The feet belonged to a slender Japanese woman. The inner room was dark, but filled with an exciting presence of outstanding sound. It was a gleeful atmosphere. The place felt alive with it. A play was being put on. Cigar smoke hung on the air, and the entire mini-theatre was wholly unoccupied but for the exception of a single individual. The attentive viewer lay seated, his head and torso but a seamless silhouette against a contrasting background. On the stage, three marionettes were engaged in acting out a scene. The skinny form of the young lady darted from the back of the room and over to the man who was watching the tale unfold. The smoke insulted her sinuses as she found the pastime rather distasteful, but she did not permit her disgust to manifest itself either on her face or in her mood. Hatsuyo always followed the customary protocols for respect while in service, and she had a personal admiration for her master. But this matter was of the utmost significance. In none of her years as a private servant had she encountered anything quite like it. She was now a humble messenger, but she always offered her best self to him in this simple task assigned to her. “Sumimasen kyō. Excuse me, sir,” Hatsuyo whispered to him in the most soothing tone imaginable, “This note was slipped under the door of your office. I found it while cleaning.” Bowing, she placed a small square of paper in his palm and stepped back. She seemed nervous. Her swaying posture, her anguished facial expression — the man had never seen her like this before. The cigar was extinguished. He read the message which had come to him on such peculiar terms. The notes of silken string instruments and the soft dialogue of the rival puppet characters supplied an appealing yet distracting ambience. The scribblings took merely a few seconds to digest and swallow. His eyes looked up and landed on the woman who had presented this to him. He had no doubts as to her loyalty. Though her voice had soothed him, the message she carried sent a convulsion down his frame, his neck muscles tightening, goose flesh sprouting and spreading over the exposed skin of his arms. The note had been written in perfect English and read as follows: Gen. Guftason, Since I see myself as a person of fair play and wish to maintain my code of ethics, I must warn you that your life is in jeopardy. You are the target of an assassination to which I confess to be the coordinator. Your very breaths are numbered. I advise you not to stick your neck out for anyone. Otherwise, your neck’s chords and vertebrae shall be sliced off of your body. Remember, the General is always a most desirable target. No signature or farewell statement. The man didn’t expect any. He got the feeling the author of this death threat was a rather pompous, egotistical person. He envisioned this fellow to be a warped character who experienced some sick elation from his sadistic game. The General could see this sort of personality flow out of the context of the foreboding note. The writer had used the term “fair play,” for instance. The whole letter exuded an unmasked air of condescension and high self-esteem. 
Amid his sensory analysis and imaginary mental drawing, the message itself began to sink in. The alarming, abject situation put a weight on the man, but the retired General’s physical stance barely altered. His shoulders were motionless, no tenser than usual. His countenance, however, did experience a detectable change. Hatsuyo watched as the smile broke, and the entertained eye widened in distress, brow cocked and shifted upward. Presently, there came a noise like the twang of strings snapping followed by a fleeting blur of scarlet. Something fell on the stage. The blur had just shot past the General. It landed in the wall beyond. A dull thud echoed in the theatre, and one of the woman puppeteers shrieked. General Guftason’s head went spinning, first to the wall then to the stage where the unidentified flying object had come from. He discovered a curved metal blade embedded in the wall. Yanking it from its self-made notch, he looked the weapon over. It had been disguised as an unfolded hand fan of the traditional Oriental make. He examined it closely. This assassination coordinator — and whoever his inside man is — seem to have good taste, the middle-aged General thought to himself, no matter how insane they may be. The bladed fan was a rather ornate piece of workmanship. The paper was decorated with hand-painted designs. There, on a solid royal red background, was a detailed depiction of a battle sequence. It was of an archaic war fought with antiquated weapons. This was a war which above all else had functioned on muscle and metal. The key figures in the micro-painting were the leaders of two opposing armies locked in a duel, each engaging his opponent with a Guntō, a type of curved saber. The highlight of the picture was the decapitation of the one warmonger on the right by the one on the left. Guftason understood the symbolism immediately. For all extensive and militaristic purposes, these two great soldiers represented two generals clashing against each other. Evidently, he was the one with the head rolling and the bloody fountain spouting from the neck where the head had been severed. A crude, mellow-dramatic method of bringing one’s point across. It was meant to instill a sense of anxiety and dread. And it has struck its target, Guftason thought again. Next, he made his way to the stage. Everyone watched on now in silence. What had the big commotion been for? Looking down, he saw the puppeteer who had screamed holding one of the glorified dolls in her arms. The body of the marionette had been sewn together into a single form with no loose attachments. Now, in mid-performance, the puppet had violently had its head sliced off, along with a number of the wires used to manipulate the thing’s movements. The bladed fan had swooped out from somewhere behind the platform. None of the puppet masters knew how it could have happened. No one saw anything. Quite convenient for an assassin. Guftason stared at the broken puppet, its head still laying on the wooden floor. Ironically, this was no ordinary character. It had represented the historical personage of Sai, a revered military leader of ancient Japan hailed as one of the Five Kings of Wa, and bearer of the title “General Who Maintains Peace.” This play that had been written within the past few years had included him as a character. Of course, it did not go down quite like this in the script.
https://medium.com/lit-up/the-night-of-the-severed-chords-3018cd4956c3
['John Tuttle']
2019-06-07 13:09:01.801000+00:00
['Romance', 'Literature', 'Fiction', 'Short Story', 'Thriller']
Burning Fat Vs Burning Calories
To lose weight and get in shape you must have a good diet and exercise regularly to burn fat. The first thing you must understand about exercise is that just because you are burning calories does not mean you are burning fat. Your main focus when you exercise should be losing body fat, and you can’t lose body fat just from burning calories. When we exercise, our bodies will start burning calories, but the calories that are burned are the calories from carbohydrates in our system. In order to burn calories from your stored fat, your body requires the presence of oxygen. There is a certain amount of oxygen that your body needs in order to start burning fat and the only way for you to measure the amount needed for your own body is to keep up with your target heart rate during exercise. Please understand that if you continue to only burn calories from carbohydrates, you will lose mostly “water weight” which leads to a decrease in your metabolism. Also, think of the calories that are burned from carbohydrates as your energy calories. If you lose too much energy calories then your muscles will not receive enough energy to increase your metabolism which indirectly burn fat. Therefore you must increase your calorie intake when you are on an exercise program to replace your burned energy calories. One Simple Way To Maintain A Healthy Digestion! Burning Fat Calories during exercise During aerobic exercise, your body goes through several stages before it reaches the point where you are burning fat. You will hear people say that you are only burning sugar (carbohydrates) not fat during the first 10 minutes of exercise. This is true to a certain extent. I say this because you will continue to burn sugar past the 10 minute mark if you are not working out hard enough for your body to want more oxygen; or you are working out too hard and you can’t supply your body with enough oxygen for fat burning. When you exercise you must move at a steady pace (not too fast, not too slow) so your body will utilize your stored fat (not carbohydrates or sugar) as its energy source. Also remember that just because you reached the fat burning stage does not mean you will stay there. Staying at the fat burning stage once again depends on if you are moving at a pace that is right for your body. Make sure that you are within your target heart rate range. One Simple Way To Maintain A Healthy Digestion! Burning Fat Calories at rest The only way for you to continue to burn fat calories hours after you have finished working out is through the anaerobic exercise of weight training. Weight training is the key to burning fat at rest. Weight training is an anaerobic activity that will cause you to burn more calories than aerobic exercise. The calories that you are burning during weight training exercises are mostly calories from carbohydrates (meaning you must eat even more calories per day for energy); but the calories you burn at rest are mostly calories from fat. The reason you are burning fat at rest is because weight training increases your metabolism which uses your stored fat as energy. To make your body the ultimate fat burning machine you must do aerobic (cardio) and anaerobic (weight training) exercises. One Simple Way To Maintain A Healthy Digestion!
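As a rough companion to the target heart rate advice above: a common rule of thumb (my assumption, not something this article prescribes) estimates maximum heart rate as 220 minus age and places the steady aerobic zone at roughly 60-80% of that figure:

def target_heart_rate_range(age: int, low_pct: float = 0.60, high_pct: float = 0.80):
    """Rough aerobic zone using the common 220-minus-age estimate."""
    max_hr = 220 - age
    return round(max_hr * low_pct), round(max_hr * high_pct)

print(target_heart_rate_range(30))  # -> (114, 152) beats per minute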
https://medium.com/@pawanactor20/burning-fat-vs-burning-calories-685f00800a14
['Pawana Kumar Verma']
2021-12-10 19:54:00.674000+00:00
['Wight Loss', 'Weights And Measures', 'Weightloss Recipe', 'Weightloss Foods', 'Weight Loss Tips']
Is CBD Isolate Powder A Life Changer?
CBD Isolate powder continues to gain popularity, from both a consumer and a producer point of view. Today's more sophisticated scientific research has highlighted the multiple qualities and health benefits provided by this valuable cannabinoid. It is quite natural that, with the rise in demand, CBD producers create innovative and unique ways of consuming the compound. If you have gone CBD shopping, you may have come across different varieties of products, such as CBD oil, CBD edibles such as cakes, juices, or salads, CBD tinctures, or even CBD soap. What is even more interesting is that not every CBD product is the same; while one type of CBD can have a potent and lasting effect on some types of conditions, other kinds of CBD may have a completely different influence. What is CBD Isolate Powder? CBD can either be CBD Isolate, full-spectrum, or broad-spectrum. CBD Isolate is definitely the purest version of the compound, and it usually comes in the form of crystals that are around 99% pure CBD. In a nutshell, no extra plant materials such as chlorophyll, oils, or waxes can be found in CBD Isolate. The process behind the production of CBD Isolate is pretty complex and lengthy. It is derived through the process of extraction, and there are multiple extraction methods available for the cannabis plant, depending on the end product the producer is looking for. In the case of CBD Isolate, the extraction begins as a standard procedure, which results in CBD mixed with plant material. To further purify the CBD, the substance is filtered, and any additional residue is cleaned out of the extract. The CBD oil that is left then goes through winterization to further clean the product of any remaining wax or other material. After the extraction process, producers have 99% pure CBD. CBD Isolate powder is among the most potent versions of CBD, which has an impact on the suggested doses for consumers. Other forms of CBD may require higher intake doses to get a result, but PHB Frost 500 mg CBD should be consumed in lower quantities. Why Consume CBD Isolate? There are controversial opinions regarding the usage of CBD Isolate powder over whole-plant products. One of the chief reasons consumers choose CBD Isolate is that no THC will enter your body. THC is the substance found in cannabis that leads to the "high" effect. In the United States, for instance, only products with THC levels of less than 0.3% are considered legal. In that sense, CBD Isolate powder such as PHB Frost 500 mg CBD is a preferred product, especially when it comes to helping children and the elderly. Health benefits of CBD Isolate Powder The health benefits of PHB Frost 500 mg CBD are the same as those of CBD, but with a higher concentration of the compound. CBD Isolate powder can be successfully used for the following: Reducing vomiting and nausea Lowering inflammation because of its anti-inflammatory properties Improving mood and overall wellness Conclusion CBD Isolate powder like PHB Frost 500 mg CBD can be consumed in a range of different ways, including eating, dabbing, making a home-made tincture, or rubbing it into your skin. This compound allows consumers to be as creative as they desire regarding the consumption method. There is, in fact, no evidence of overdosing on CBD. This means that you won't have to worry about your intake. Where should I buy CBD oil from? The CBD industry is unregulated. This has given rise to many companies that provide fake products.
We want you to be healthy and safe while using CBD oils. We use CO2 extraction to guarantee purity. Our products are not just natural but also 100% safe. Not sure where to find CBD? Try CBDOILS.com for high-quality CBD oils!
https://medium.com/@m-sarahcarlos/is-cbd-isolate-powder-a-life-changer-37f2fc93401d
['M Sarahcarlos']
2020-06-15 06:03:54.996000+00:00
['Cbd Isolate Powder', 'Pure Cbd Crystals', 'Phb Frost Cbd', 'Cbd', 'Cbd Isolate']
FLYING TRAIN or TUBELESS HYPERLOOP— The Future of Transportation?
Since early times, we have sought ways to make travelling faster and more convenient. We invented the wheel, carts and wagons, steam power, and the internal combustion engine. Then, innovations continued with electric cars, bikes, driverless vehicles, hyperloop, etc. Ideas that seemed to belong only to the realm of science fiction are being made a reality. In the next 30 years, though, we are likely to see more change in transportation technology than we've seen in the last 100 years. Here is a preview of what's coming in the world of transportation. Ringway Transportation is a futuristic mass transportation system designed and patented by Naveen Chaudhary in 2012. The principle of the invention developed from observing the balance points of a beam resting on two supports: when a small beam of uniform weight distribution slides off the edge of a table or wall, it can be pushed out to almost half of its length before tilting occurs, because tilting only begins once the beam's centre of mass passes the edge. The present mode of transportation is named the 'Ringway' transportation system, and it is defined as a system in which an aeroslider vehicle glides and moves in a linear direction, with the use of a direct or indirect source of thrust between the vehicle and the ring/frame supported by a continuously aligned plurality of pillars. The invention addresses the problems of congestion, carbon, cost, safety, efficiency and speed. There is a need for a revolution in the transportation system to deal with these interrelated yet surmountable problems. The Ringway transportation system maintains the characteristics of traditional rail transportation and air transportation but reconfigures them in a highly innovative design. In this design, steel wheels are replaced by a chain drive wheel, and the continuous steel rail is replaced by a mover or magnetic levitation components fixed on the Ringway pillar support and on the vehicle. The functionality of the pillar support and the vehicle is combined; this is accomplished by reversing the orientation of the wheel and the rail. The vehicle is driven by a motor connected to the chain drive wheel assembly placed at the bottom end of the vehicle, or by using magnetic levitation. A mover fixed in the frame of the Ringway pillar support sticks in between the chain links and tends to move the vehicle in the direction of rotation of the motor. Placing the roller permanently on the fixed structure of the Ringway pillar support, and the deflection plate on the strengthened vehicle, helps to turn the vehicle at certain curves or elevations. The Ringway transportation system uses the principle of the cantilever beam, a projection anchored at one end. Since the vehicle is moving within and is held by the support frames, the effect is to create a cantilever beam. The beam is fixed in its vertical orientation, but the motion is in its horizontal orientation. This futuristic transportation system is intended to operate at supersonic speed by utilising the magnetic levitation system and the introduction of the Magnetoplasmaionic (MPI) Engine, a free energy system introduced to use the earth's atmospheric ions and plasma and convert them into a direct or indirect form of power supply, which provides thrust to the vehicle.
Ringway Transportation System estimates that 80% of total system costs relate to components that are both standardised and mass-produced by the jet, aeroplane, railway or maglev component manufacturing industries. Only the footings and the electrical installation are site-specific. Since the system is essentially 80% air space, there is simply less material involved. The technology is non-esoteric, as all systems can be built upon existing know-how and on the ability to pre-fabricate and mass-produce the components. The utilisation of air space and the elimination of continuous rail result in reduced land use, and there is minimal disruption of existing infrastructure during construction and operation. These cost savings can be as great as 70% when compared with existing rail, aeroplane or hyperloop alternatives. Ringway transportation system applications include recreation and people-moving, urban transit, high-speed land transport, supersonic air transport and military transport. Climate change has now become an international priority as the world urbanises at a rate unparalleled in human history, and this transportation system is a zero-carbon, free-energy means of transportation.
https://medium.com/@ringwaytech/flying-train-or-tubeless-hyperloop-the-future-of-transportation-7213f985a023
['Ringway Transportation']
2020-12-23 17:06:40.014000+00:00
['Future', 'Hyperloop', 'Future Technology', 'Inventions', 'Transportation']
Ecopsychology Can Save You And Your World
Ecopsychology Can Save You And Your World Einstein may have been a sexist jerk, sometimes, but he was mostly right about your place on the planet Stockphoto image Ecopsychology says that what you breathe, eat, live in and work with all comes from your environment. It also says that you have a body that feels, thinks, emotes, and has needs dependent upon your environment. To me, any other psychotherapy is close to gibberish. How your Electra or Oedipal complex originated doesn't affect you as much as your next breath. What your Jungian archetypal tracking charts show you doesn't matter as much as access to clean water. What birth order, or primal screaming, or toddler-recall hypnotherapy reveals about that bad day when you were six is not as crucial to your life as a view of nature that shows you the world is supportive, beautiful, healing, inspiring, and resilient. We need biodiversity to survive. We cannot live without oceans, or insects, or food sources, or hope. We can't live online, at least not entirely, until we are smarter, faster, cyborg robots. Even then, the raw materials have to come from whatever planet is closest and convenient. Ecopsychology is about hope. At its core, ecopsychology is optimistic. It tells us that you belong to the world. You have a place in the human tribe. The human tribe is ONE tribe, one race. Further, that race itself belongs to an even larger family, our shared DNA throughout the planet. The survival of our race depends not upon conquest, superiority, or domination. In fact, it depends entirely upon sharing, caring, and increasing our circle of compassion, ever outward: to include not just our fellow human beings, but the very micro to macro organisms that make all human beings — and all life — possible. Being of one race in such tumultuous racist and sexist times may sound unrealistic, or naive. There are people who will constantly whine about “Us versus Them.” Some of those people are supremacists, some are sexist, and some are just bitter and impoverished. They may feel threatened by the possibility of losing “our way of life.” They may tell you it is because we “can’t even take care of our own.” They hear the rabble at the city gates and lose track of the fact that people are people. We can’t expect refugees to just evaporate, because the trickle, and the continuing fallout of climate change and possibly nukes, is just beginning. Those people are also we people. Those of us who feel fear about the “out-group” are, as Einstein so eloquently put it, suffering “under a delusion.” Here is Al’s entire quote on your belonging to nature via your physical, mental, emotional, spiritual, ecopsychologically wired person-hood: “A human being is part of a whole, called by us ‘universe,’ limited in time and space. He/she* experiences the self, our thoughts and feelings, as something separated from the rest — a kind of optical delusion of our consciousness.” In other words, it is up to all of us who do recognize our racial and gender equality, our one-ness, to get the memo circulated to all of those suffering under the sad thought that “some of us don’t belong.” We all belong. Another reason ecopsychology is of infinite value is that it does not focus on the doom and gloom of our total trashing of our planet. It looks instead at your personal, perpetual power. You can choose the empowerment of reusable bags.
You can choose the empowerment of buying less plastic crap. You can choose the empowerment of planting a tree. You can choose any number of influential habits that you exercise hundreds of times in a day. You have some say with each and every choice you make. You don’t need more stuff and complication in your life so much as you need to reconnect to the pre-industrial relationship our species evolved within. You need to examine your values, and when people do, they realize that cooperation is always better than competition. We invented money, and nations, and borders, and the one percent. How do your values say that is working out for you? Why invest in ignorant, outdated systems that don’t benefit your life? A system that only works for a few, while fouling the biosphere elsewhere, does not bode well for an entire species. We are a social species, and as such, we need one another. Sociobiology says so. Common sense says so. Your being in the world — ecopsychology — says so. In an era of climate change, corrupt capitalism, droughts, famine, floods, fires and refugees, taking more than your share doesn’t make you a prince, or a genius, or a rich celebrity role model looked up to by the fawning serfs. It just makes you an inconsiderate jackass. And bragging about owning an oil well or coal mine, with particular influence in Washington DC due to lobbying, doesn’t merely make you an inconsiderate jackass. It makes you a tone-deaf and foolish inconsiderate jackass. More of the optimism of ecopsychology is found in the woods. Or the mountains, or the prairies, or the oceans — white with foam. Beauty and diversity, as nature teaches them to our race, are remarkable, rejuvenating, remedial, refreshing, rewarding, reconnecting, and reinvigorating. And that is just a few of the “R” words. Actually, the more you know about nature’s powers, the better human being you are, because you realize both the significance and insignificance of your existence. You can look up at the sacred stars at night, you can see the forest for the trees, and you can hear the piper calling you to join him. (We’re so sorry, Uncle Albert; we know your 21st-century, updated version would likely include as many gender pronouns as are considered polite to the enlarged circle of compassion.)
https://medium.com/thrive-global/ecopsychology-can-save-you-and-your-world-4162af0c40ae
['Christyl Rivers']
2019-04-01 16:11:04.550000+00:00
['Equality', 'Belonging', 'Wisdom', 'Climate Change', 'Social Justice']
Apache Spark vs. Sqoop: Engineering a better data pipeline
As a data engineer building data pipelines in a modern data platform, one of the most common tasks is to extract data from an OLTP database or data warehouse so it can be further transformed for analytical use-cases or for building reports that answer business questions. Apache Sqoop quickly became the de facto tool of choice for ingesting data from these relational databases to HDFS (Hadoop Distributed File System) over the last decade, when Hadoop was the primary compute environment. Once data has been persisted into HDFS, Hive or Spark can be used to transform the data for the target use-case. As adoption of Hadoop, Hive and MapReduce slows and Spark usage continues to grow, taking advantage of Spark for consuming data from relational databases becomes more important. Before we dive into the pros and cons of using Spark over Sqoop, let's review the basics of each technology: Apache Sqoop is a MapReduce-based utility that uses the JDBC protocol to connect to a database and query and transfer data to Mappers spawned by YARN in a Hadoop cluster. When the Sqoop utility is invoked, it fetches the table metadata from the RDBMS. If the table you are trying to import has a primary key, a Sqoop job will attempt to spin up four mappers (this can be controlled by an input argument) and parallelize the ingestion process by splitting the range of the primary key across the mappers. If the table does not have a primary key, users must specify a column on which Sqoop can split the ingestion tasks. Without a column on which Sqoop can parallelize the ingest process, only a single mapper task will be spawned to ingest the data. Basic usage: sqoop import --connect <jdbc-url> --username <username> --password <password> --table <table-name> --target-dir <destination-hdfs-location> For example, to import my CustomerProfile table in a MySQL database to HDFS, the command would look like this: sqoop import --connect jdbc:mysql://db1.zaloni.com/customer --username ngoel --password xxxxx --table CustomerProfile --target-dir /customer/customer_profile If the table metadata specifies a primary key, or to change the split-by column, simply add the input argument --split-by: sqoop import --connect jdbc:mysql://db1.zaloni.com/customer --username ngoel --password xxxxx --table CustomerProfile --split-by customer_id --target-dir /customer/customer_profile For further performance tuning, add the input argument -m or --num-mappers <n>; the default value is 4. To fetch only a subset of the data, use the --where <condition> argument to specify a where-clause expression, for example: sqoop import --connect jdbc:mysql://db1.zaloni.com/customer --username ngoel --password xxxxx --table CustomerProfile --target-dir /customer/customer_profile --where "state = 'WA'" For data engineers who want to query or use this ingested data with Hive, Sqoop provides additional options to import into an existing Hive table or to create a Hive table before importing the data: sqoop import --connect jdbc:mysql://db1.zaloni.com/customer --username ngoel --password xxxxx --table CustomerProfile --target-dir /customer/customer_profile --hive-import --create-hive-table --hive-table myHiveDB.CustomerProfile Apache Spark is a general-purpose distributed data processing and analytics engine. Spark can be used in standalone mode or with external resource managers such as YARN, Kubernetes or Mesos. Spark works on the concept of RDDs (resilient distributed datasets), which represent data as a distributed collection.
The Spark engine can apply operations to query and transform the dataset in parallel over multiple Spark executors. Dataframes are an extension to RDDs which imposes a schema on the distributed collection of data. Dataframes can be defined to consume from multiple data sources including files, relational databases, NoSQL databases, streams, etc. Let's look at a basic example of using Spark dataframes to extract data from a JDBC source. Creating the dataframe and performance options: similar to Sqoop, Spark also allows you to define a split or partition column so data can be extracted in parallel by different tasks spawned by Spark executors (a minimal sketch of these read options appears a little further down). PartitionColumn is the equivalent of the --split-by option in Sqoop. LowerBound and UpperBound define the min and max range of the primary key, which is then used in conjunction with numPartitions to let Spark parallelize the data extraction by dividing the range into multiple tasks. NumPartitions also defines the maximum number of "concurrent" JDBC connections made to the database; the actual number of concurrent JDBC connections might be lower, based on the number of Spark executors available for the job. Filtering data: instead of specifying the dbtable parameter, you can use a query parameter to specify the subset of the data to be extracted into the dataframe. val df = spark.read.format("jdbc") .option("url", "jdbc:mysql://db1.zaloni.com/customer") .option("driver", "com.mysql.jdbc.Driver") .option("query", "select * from customer.CustomerProfile where state = 'WA'") .option("user", "ngoel") .option("password", "xxxxxx") .load() Persisting data to a filesystem or database: once the dataframe is created, you can apply further filtering and transformations on the dataframe, or persist the data to a filesystem, to Hive, or to another database, for example df.write.saveAsTable("customer.customerprofile"), or write back out over JDBC to a relational target. Now that we have seen some basic usage of how to extract data using Sqoop and Spark, I want to highlight some of the key advantages and disadvantages of using Spark in such use cases. When using Sqoop to build a data pipeline, users have to persist a dataset into a filesystem like HDFS, regardless of whether they intend to consume it at a future time or not. With Spark, persisting data is completely optional. Users can write Spark jobs to perform the necessary filtering/transformations or build analytical models on the dataframes created on the JDBC source, and only persist the transformed data to their target system if needed. Based on the use case, users can also choose from an extensible list of target systems to persist the transformed data, including other relational databases, NoSQL databases, streams, and file systems (with out-of-the-box support for common file formats such as AVRO, Parquet, JSON, etc.). Data engineers may also want to work with the data in an interactive fashion using Jupyter Notebooks or simply the Spark shell.
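A minimal Scala sketch of the dataframe creation and performance options described above (the JDBC URL, table and credentials reuse the illustrative values from this post; the bound values and partition count are assumptions, not taken from the original, which showed this code as an embedded snippet):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("jdbc-extract-sketch")
  .getOrCreate()

// Read CustomerProfile in parallel by splitting on customer_id,
// the rough equivalent of Sqoop's --split-by / --num-mappers options.
val customerDf = spark.read.format("jdbc")
  .option("url", "jdbc:mysql://db1.zaloni.com/customer")
  .option("driver", "com.mysql.jdbc.Driver")
  .option("dbtable", "customer.CustomerProfile")
  .option("user", "ngoel")
  .option("password", "xxxxxx")
  .option("partitionColumn", "customer_id") // column the range is split on
  .option("lowerBound", "1")                // assumed min of customer_id
  .option("upperBound", "1000000")          // assumed max of customer_id
  .option("numPartitions", "4")             // upper bound on concurrent JDBC connections
  .load()

// Persist only if needed, e.g. to a Hive table.
customerDf.write.mode("overwrite").saveAsTable("customer.customerprofile")

In practice the lower and upper bounds would usually come from a quick min/max query against the source table rather than being hard-coded as they are in this sketch.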
The ability to work interactively is a key advantage over Sqoop, which only submits a MapReduce job and does not let users interactively work with the data extracts and decide whether they want to persist the extracted data to the filesystem. Apache Spark can be run in standalone mode or optionally using a resource manager such as YARN, Mesos or Kubernetes. This presents an opportunity for data engineers to start a lightweight, transient Spark cluster in an environment of their choice and shut it down after the task is done, rather than competing for resources in a typical long-running Hadoop cluster. Many data pipeline use-cases also require you to join disparate data sources. For example, what if my Customer Profile table is in a relational database but the Customer Transactions table is in S3 or Hive? Using Spark, you can run federated data queries by defining dataframes for both data sources and joining them in memory, instead of first persisting the CustomerProfile table in Hive or S3 (a short sketch of this pattern follows at the end of this article). Next, I will highlight some of the challenges we faced when transitioning to unified data processing using Spark. In the Zaloni Data Platform, Apache Spark now sits at the core of our compute engine. Data engineers can visually design a data transformation, which generates Spark code and submits the job to a Spark cluster. One of the new features, Data Marketplace, enables data engineers and data scientists to search the data catalog for data that they want to use for analytics and provision that data to a managed and governed sandbox environment. ZDP allows extracting data from file systems such as HDFS, S3, ADLS or Azure Blob, and from relational databases, to provision the data out to target sandbox environments. Apache Spark drives the end-to-end data pipeline, from reading and filtering to transforming data before writing to the target sandbox. Some of the challenges we faced include: Data type mapping — Apache Spark provides an abstract implementation of JDBCDialect, which provides basic conversion of SQL data types to Catalyst data types. There are out-of-the-box concrete implementations for many popular databases, but there might be a need to extend or create a new implementation based on the use case and database. Performance tuning — as described in the examples above, configuring numPartitions and choosing the right partitionColumn are key to achieving parallelism and performance. When persisting data to a filesystem or relational database, it is also important to use coalesce or repartition to avoid writing many small files to the filesystem, or to reduce the number of JDBC connections used to write to the target database. In conclusion, this post describes the basic usage of Apache Sqoop and Apache Spark for extracting data from relational databases, along with the key advantages and challenges of using Apache Spark for this use case. In the next post, we will go over how to take advantage of transient compute in a cloud environment.
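A hedged Scala sketch of the federated-join and small-files points above; it assumes the SparkSession and JDBC settings from the earlier sketch, and the transactions table name, S3 path, output path and customer_id join key are illustrative assumptions, not details from the post:

// Customer profiles read over JDBC, as in the earlier sketch.
val profiles = spark.read.format("jdbc")
  .option("url", "jdbc:mysql://db1.zaloni.com/customer")
  .option("driver", "com.mysql.jdbc.Driver")
  .option("dbtable", "customer.CustomerProfile")
  .option("user", "ngoel")
  .option("password", "xxxxxx")
  .load()

// Customer transactions already in the data lake: a Hive table, or Parquet on S3.
val transactions = spark.table("sales.customer_transactions")
// val transactions = spark.read.parquet("s3a://my-bucket/customer_transactions/")

// Federated join in memory; nothing is persisted unless we ask for it.
val enriched = profiles.join(transactions, Seq("customer_id"))

// Control the number of output files (and, for JDBC targets, connections) before writing.
enriched.coalesce(8).write.mode("overwrite").parquet("/warehouse/customer_enriched")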
https://medium.com/dataops-playbook/apache-spark-vs-sqoop-engineering-a-better-data-pipeline-ef2bcb32b745
['Nikhil Goel']
2020-03-03 12:35:14.309000+00:00
['Sqoop', 'Jdbc', 'Spark', 'Spark Dataframe', 'Data Engineering']
Kduka User Guide
Welcome to Kduka. You can use this as your guide or visit kduka/help for the full documentation. Signing up and Creating a Store To get started on Kduka, visit the sign-up page at http://www.kduka.co.ke/ and click on the Let's start button. Required fields: Store Name - This is the name of your store. Your visitors will see this if you don't upload your logo. Your Store Website - This is the address that you would like to have for your store. It shouldn't have any spaces or special characters; they will be stripped automatically. For example, if you fill out johndoe, your clients will visit your store through johndoe.kduka.co.ke. Email - This is the email you will use to log in to your store. Stores have separate logins to ensure independence and proper management of each store, so it can't be the same for two stores. Click on the Next button to continue filling the form. Phone Number - This is the phone number that will be displayed on your store. It must be in the format 2547XXXXXXX, e.g. 254723232323. Password - The password you will use to log in to your store. It must be secure, since your store manages your earnings. Repeat Password - Repeat the password you typed. After filling in the above details, click the create store button to create your store. You will be required to visit your email address for the account activation link. The link directs you to a login page. Enter your credentials to access your dashboard. Managing your Store Setting up delivery options Before you can activate your store, you need to have at least one delivery option set. You will have a choice of three delivery methods: collection point - Specify a collection point where clients can pick up their goods. manual delivery - Create a range of locations where you can send the package and specify the costs. automated delivery - We have provided a way to use a third-party (Sendy) delivery service, which will pick the item from you and deliver it to the client. For now, we want to get the store up and running, so we will specify a specific collection point where the client will pick up the goods. Click on "Save Delivery Settings". Activating your store Once you have set the delivery options, you can proceed to activate your store. To do this, click on the activate store button. Once the page refreshes, you can visit your store's site by clicking on the address you specified. You can also type the address of your store into the browser to access it. Voila! You just set up your store on Kduka! Check the next section to see how you can add categories, products and customize your store. Adding a category Ideally, the first thing you'll create is a category that will describe a group of products.
On your dashboard sidebar, go to Categories > Add Category. Name - This is the name of the category; it can't be empty. Description - Add a description for that category. Make Category Active - Have this checked so that it can be displayed on your store; otherwise it won't be. Click on Save Category to save it on the store. Featured Categories Featured categories are categories that you want to appear on the homepage, showing your latest product from that particular category. To set one, simply click on the green "Make Featured" option displayed on the category. Adding a product With at least one category, you can go ahead and add products. One category can have multiple products. Navigate to your dashboard's sidebar and select Products > Add Product. Required fields: name - The name of what you're selling, e.g. Orange T-Shirt. price - The price of the product. category - Select a category from the categories you have created. number of products in stock - This is the number of units you have in stock for that product. description - Describe your product and add any extra information you would like your client to see. show product in store - If you want your item to appear on the store, have this checked. If you don't want to sell it yet, leave it unchecked. Optional fields: Show the price before discount - Have this selected if you are offering a discount and the new price is different from the initial one. Price before discount - This is the initial price of the product, before offering a discount on it. Dimensions - These are important when you want to use our third-party delivery services. length - This is the length of the product in centimeters. width - This is the width of the product in centimeters. height - This is the height of the product in centimeters. weight - This is the weight of the product in kilograms. product images - These are images of the products that make them visual for the customers. You have an option of uploading up to 5 images for your product. Once you have all the details filled in, click on the create product button. You can go ahead and add more products. The next step is picking a theme. Themes Themes are used to change the appearance of a store. We are constantly working to ensure that you get a good variety of themes to give your store that desired unique look. On the sidebar, navigate to Themes. You will find 4 options you can use to customize your store: Elite store, Modern, Electronic store and CStore. Choose Layout To pick a theme, click on the one you want. You will get an overview image of its appearance. To exit the image, click on close or the X at the top. With a theme selected (its radio button checked), click on the save layout details button at the bottom.
Choose Theme Color At the top, next to the choose theme button, you will see the choose theme color button. Click on it. You can set your own theme colors to be used on your website. You will be presented with a color picker where you can select or input the color code of the color you want. This results in the corresponding effect on your store. If you don't want to change your fonts, you can go ahead and save the layout details; otherwise, click on choose store font. Choose Store Font This allows you to select the text font you want for your store. Once you have picked your store font, click the Save Layout Details button. Visit your store to see whether the outcome is satisfactory; if not, you can go back and make the necessary changes. Store settings Pages settings We give you the option of having two pages on your website: the home page and the about page. The products and contact us pages are there by default. On your dashboard sidebar, click on store settings, then 'Pages'. You can check the 'Enable home page' checkbox to enable people to view your homepage, or uncheck it to remove the homepage option. If you enable the homepage, it's important to upload your own banner image for your website. You can do this by clicking the button under Banner Image, selecting your banner and clicking on the "Save Pages" button. Featured categories and your latest products will be shown on the homepage. You can also check the "Enable about page" checkbox to enable the about page. You can give information about your website and business here. Clicking on the "Save Pages" button saves your changes. NB: If you don't enable the homepage, the first page a customer will view when they visit your website will be the about page. If both the home page and the about page are disabled, the first page will be the All Products page. Store settings These are the general settings of your store, including the name of the store, store URL, slogan, logo and a choice of whether you want to display it, as well as the email, phone number and business location to display. Once done, click on the save store settings button. Social media settings This is where you add links to your social media accounts (Facebook, Twitter, Instagram, LinkedIn, Pinterest, YouTube and Vimeo). This way you can share products directly. Once you add the applicable links, click on the save social media settings button. Delivery settings Before you activated your store, you were required to set at least one delivery option. Here is a more detailed elaboration on how to go about it. Manual Delivery This option is used when you have set locations and their prices. You will initiate this delivery manually after an order has been made. On the sidebar, navigate to Store Settings > Delivery Settings. You can enable this option by checking the "Enable manual delivery" checkbox. Type any instructions you want to give to the client — these are any instructions or information you want to provide to the clients. Add a delivery location - Specify the area of delivery you want to add. Price - Type in the price for that area. Click on Add Option. Automatic Delivery This option makes use of a third party to immediately deliver goods to your clients. Head over to Sendyit.com and sign up.
They will give you a username and key, which you will use next. Type a pickup point for goods — ideally you will set a location where your goods are located, e.g. a business location/warehouse. Sendy Username - This is the username provided by Sendy. Sendy Key - This is the API key provided by Sendy. Click on the "Verify and save sendy credentials" button. If your credentials are correct, you will receive a success message as shown above. Congratulations, you have set up automatic delivery. NB: You need to have sufficient funds in your Sendy account in order for clients to automatically request that method. Activation Settings After creating your store, you had an option to activate it. If by any chance you have a reason to deactivate it, click on store settings > activation settings > Deactivate store button. Note: Your store will no longer be visible or accessible to your customers. They will view an error page instead. Password Settings This allows you to change your account password and email. Email - This is autofilled with the email you used to create the account. To change it, just edit it. Current Password - This is the password you are using currently. Password - This is the new password that you want to start using. Password Confirmation - This is a repeat of the new password you are setting. Coupons A coupon is a voucher that grants a customer a discount on a particular product or products. Create a coupon On the sidebar of your dashboard, click on coupon, then create a coupon. You will have two options: percentage discount or fixed amount discount. Choose the one that suits you and fill out the details. Code - This is the unique identifier of the coupon that you give to customers. Number of times - This is the number of times the holder can use that coupon. Percentage - This is the percentage of the total amount that the holder is granted off. Amount - This is the exact amount granted off. Expiry date - This is the date the coupon expires or becomes invalid. Active - This is a checkbox to activate your coupon. Once you have filled in all the required details, click on the Create coupon button. In addition, if you want to create many coupons at once, click on the create multiple coupons option. Manage coupons To view all your coupons, click on Coupons > Manage coupons. You can also add a new coupon from this page by clicking on the Add New Coupon button. Funds This is the best part yet: finances. The funds page shows you the amount of money you have and your transaction history. Actual balance: This is the amount credited to you before verification. Available balance: This is the amount of money you can access. Transfer of funds: This allows you to transfer funds to a mobile number. Transaction History: This shows you all the transactions that have taken place on your Kduka store. If you want to view the entire history, click on the View Entire History button.
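To make the two coupon types described above concrete, here is a small worked example (the KSh figures are illustrative assumptions, not values from the guide). A percentage coupon reduces the price to price × (1 - percentage/100), while a fixed-amount coupon reduces it to price - amount. On an order of KSh 2,000, a 10% coupon takes off 2,000 × 10/100 = KSh 200, so the customer pays KSh 1,800; a fixed-amount coupon of KSh 200 happens to give the same result on that order, but on a KSh 500 order it would still take off KSh 200, whereas the 10% coupon would only take off KSh 50.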
Dashboard (Overview) Once you have added your categories and products and have transactions, this is what your dashboard will look like.
https://medium.com/kduka/kduka-user-guide-8b8b9c6ceb16
['Mercy Kinoti']
2018-08-20 12:39:11.789000+00:00
['Kduka', 'Documentation', 'Ecommerce']
Winter Sports: What to Expect
With an increasing number of cases due to the possible second wave of COVID-19, ice skating rinks and ski and snowboard slopes already have plans in place to ensure both the safety of their patrons and staff and that fun can still be had this winter. “There is going to be social distancing,” said Randi Gass, Program Specialist at Victor Constant Ski Area in West Point. “We’re trying to eliminate touch points.” Face coverings will be required upon arrival to Victor Constant at all times, especially when inside all buildings and locker rooms. Kids taking snowboarding lessons at Victor Constant Ski Area last winter. — Photo courtesy of Christine Retcho. A notable change that goes into effect this winter will be their rental system. Victor Constant is recommending that patrons arrive in their full gear, but if they choose to rent, they recommend purchasing a seasonal rental so that the equipment will be ready for them upon arrival. At Kiwanis Ice Arena in Saugerties, equipment rental, mainly skates, is prohibited for the time being. According to their COVID-19 operating procedures listed on their website, “ALL players MUST enter the Kiwanis Ice Arena completely dressed (minus: helmets, skates and gloves).” Kiwanis is also encouraging patrons to wear face coverings at all times while inside and are only requiring them when six-foot social distancing cannot be maintained. The facility will limit capacity to 50 percent of its maximum capacity. Kiwanis Ice Arena in Saugerties, N.Y. — Photo courtesy of Kiwanis Ice Arena. In regards to sanitization, Kiwanis said that “frequently touched surfaces will be sanitized using an approved cleaning product registered in New York State and identified by the EPA as effective against COVID-19.” This will be done before each day starts and before each scheduled group arrives. Bear Mountain Ice Rink is also limiting capacity to 50 percent, and they are an outdoor venue. Located in Tompkins Cove in Rockland County, they are requiring that face coverings are worn at all times, even while social distancing. According to their website, they are also prohibiting indoor seating and locker use. “Please bring a bag for your belongings,” they said. Nicole Ranaldi takes a spill on the ice at Bear Mountain Ice Rink in 2019. — Photo courtesy of Nicole Ranaldi. Even larger mountains have plans to ensure their patrons have a fun, yet safe, winter season. Hunter Mountain in Hunter put a reservation system in place for the season. “To allow for physical distancing, we are managing access to our mountains through a reservation system that prioritizes pass holders and by limiting lift ticket sales,” their official website said. “We’ve designed an approach that can remain in place all winter so that you know what to expect and you’re not caught off guard.” Hunter Mountain is also requiring face coverings along with pre-arrival health screenings, implementing cashless transactions, and limiting class sizes in regards to lessons. They are providing hand sanitizing stations throughout the resort. They have a similar protocol to Victor Constant when it comes to chair lifts. People can ride the lifts solo or with the people they arrive with, only. On larger lifts, they will separate two singles on opposite ends. Even the winter hunting season could be affected. The season can range from late-September through late-January. With cases continuing to increase, a second shutdown is looming. “If a second shutdown happens, it will affect hunting,” said Montgomery native Michaela Gironda. 
“Hunting has become a huge part of people’s lives in regards to being a source of food. If a second shutdown happens, it will cause more people to hunt and there will be a lesser supply of animals.” Jaden Breyette of Washingtonville out on a hunt. Photo courtesy of Michaela Gironda. Gironda also said that the pandemic has affected how she hunts: she can no longer hunt with larger groups of people like she used to, and the regulations on hunting may have changed. “I’m not sure on that, though,” said Gironda.
https://medium.com/thegroundhog/winter-sports-expectations-during-a-pandemic-19653f9aa44e
['Ryan Loeffler']
2020-11-30 17:19:12.072000+00:00
['Sports', 'Outdoors', 'Covid 19', 'Featured', 'Recreation']
Like a Heart
Like a Heart Gentle, Consistent, Kind-love. Photo by Sharon McCutcheon on Unsplash The gentleness with which The heart beats in its cage Almost like soft knocks Of a young lover boy Sneaking off to a tryst At twelve midnight … Reminds me that the most Important occurrence in life; Those with the most impact Happens with the littlest noise And seldom raises any dust! That life is tied to the continuity Of that often overlooked act; Of pumping by this heart Whether in rain or in shine Apparently makes clear That it’s small and often Albeit almost insignificant acts That are consistently repeated Over time like the never-ending Tick tock of a clock’s second hand That actually makes a big difference! That the seat of kindness Love and more, lives within This muscular lump of heart No larger than a clenched fist Is an attestation of the fact; That it’s not by prevalence or by size Most times it’s the duty and reach That inspires greatness and awe; Like a monarch in his domain Is held in the highest regards And adored by both friend and foe! (c) Lady Foxx 29th November, 2020.
https://medium.com/passive-asset/like-a-heart-4f0c4d227432
['Lady Foxx']
2020-11-29 05:46:32.931000+00:00
['Heart', 'Love', 'Poetry On Medium', 'Personal Development', 'Poetry']
Tips for using the Internet wisely
Photo by Ludovic Toinel on Unsplash Today, who doesn't know the Internet? It is undeniable that nearly everyone, from small children to adults to the elderly, already knows it. Everyone needs the Internet. We have to keep up with the times, especially in a digital era like today. Now almost everything is online: studying, shopping, and even school exams are now online or internet-based. Not to mention the constant activity on social media, from Twitter and Facebook to Instagram and many more. Alongside its many benefits, the internet also has a bad impact on users. Many sites are dangerous, spreading hoax news, pornography, hate speech, fraud, and more. To avoid these negative impacts, here are some tips on how to use the internet wisely: Change your mindset and use the Internet to learn positive things Photo by Viktor Forgacs on Unsplash The first thing to apply is to change your own mindset about the internet. Many people think of the internet as a great technology that can be used for anything, without realising that its growth also brings many negative impacts. Try to change your mindset and emphasize to yourself that the internet is there to add insight, so that you don't access it for things that are not useful.
https://medium.com/@rifantkj/tips-for-using-the-internet-wisely-b27838dfe71b
["Achmad Rif'An"]
2020-12-24 04:30:39.252000+00:00
['Gadgets', 'Health', 'News', 'Internet', 'Family']
Midnight Musings
Midnight Musings When old feelings surface. Photo by Romain Lours on Unsplash Sometimes I miss you, all that you are, that we were; and the tears come easy. Sometimes I wonder what it is we would have been, what could have been. I wonder if you share my idle reminiscing, if you sit in silence in memory of our time together, if you wonder what could have become of our fairy-tale romance. If you cry in secret and clad your broken heart with smiles and laughter. I know I gave my all, even pieces I shouldn’t have given; I was drowning in forever symphonies and teenage pleasures. I loved you with all of me, and I loved you beyond reason; frankly a piece of me has surely never stopped loving you, a piece that still longs for you even as I key in these words. In a way, you broke me and stole a part of my soul; I am never complete without you, not really. Some part of me still hopes, but reason and time have curbed its excesses and charted a course for my heart; away from you. There wasn’t a thing I couldn’t do for you, no length I couldn’t reach to hold you down. Every minute with another was a dress rehearsal; I saw you in every smile, tasted you on every lip, felt your warmth in every caress. I was addicted to loving you and afraid to admit that forever was just a dream; with you at least. I wept for you several violent acid flows from my very heart. You reached to the well of emotions within my being that burst springs of bitter-sweet sentiments, flooding my soul, and staining sheets upon sheets of paper. I was strung up on the possibilities of us, so much so that the shattering of my heart and dignity before my very eyes was a sight I could bear to witness.
https://medium.com/literally-literary/midnight-musings-cf77c121bac5
["Ngang God'Swill N."]
2020-07-27 05:06:06.183000+00:00
['Feelings', 'Love', 'Pain', 'Poetry', 'Moving On']
BTCCREDIT SMART CONTRACT AUDIT REPORT
by QuillAudits, July 2019 Introduction: This audit report highlights the overall security of the BtcCredit smart contract. With this report, we have tried to ensure the reliability of their smart contract through a complete assessment of their system's architecture and the smart contract codebase. Auditing Approach and Methodologies Applied: The Quillhash team performed thorough testing of the project, starting with an analysis of the code design patterns, in which we reviewed the smart contract architecture to ensure it is well structured and makes safe use of third-party smart contracts and libraries. Our team then performed a formal line-by-line inspection of the smart contract in order to find any potential issues such as race conditions, transaction-ordering dependence, timestamp dependence, and denial-of-service attacks. In the unit-testing phase, we wrote and ran custom unit tests for each function in the contract to verify that each function works as expected. In automated testing, we tested the smart contract with our in-house developed tools to identify vulnerabilities and security flaws. The code was tested collaboratively by multiple team members, and this included: testing the functionality of the smart contract to determine that proper logic has been followed throughout; analyzing the complexity of the code by thorough, manual, line-by-line review; deploying the code on a testnet using multiple clients to run live tests; analyzing failure preparations to check how the smart contract performs in case of bugs and vulnerabilities; checking whether all the libraries used in the code are on the latest version; and analyzing the security of the on-chain data. Audit Details Project Name: BTCCREDIT website/Etherscan Code : Code Languages: Solidity (smart contract), JavaScript (unit testing) Summary of the BTCCREDIT Smart Contract: QuillAudits conducted a security audit of a smart contract of BTCcredit. The BTCcredit contract is used to create the ERC20 utility token, the BTCCredit token. The smart contract contains the basic functionality of an ERC20 token, with a total supply of 300 million, and a mint function to mint more BTCCredit tokens. Audit Goals The focus of the audit was to verify that the smart contract system is secure, resilient and working according to its specifications. The audit activities can be grouped into the following three categories: Security: Identifying security-related issues within each contract and within the system of contracts. Sound Architecture: Evaluation of the architecture of this system through the lens of established smart contract best practices and general software best practices. Code Correctness and Quality: A full review of the contract source code. The primary areas of focus include: correctness, readability, sections of code with high complexity, and the quantity and quality of test coverage. Security Level References: Every issue in this report was assigned a severity level from the following: High severity issues will bring problems and should be fixed. Medium severity issues could potentially bring problems and should eventually be fixed. Low severity issues are minor details and warnings that can remain unfixed but would be better fixed at some point in the future.
High severity issues:- 1. distributeTokens(address _address, uint _amount); The distributeTokens function of the smart contract only checks whether the distributed tokens are less than or equal to the total supply, but does not subtract the minted tokens, which have already been sent to a particular address. Hence the owner will be able to distribute more tokens than the total supply, as shown in the pictures below. Total supply is: 300001000. 0x410 is holding: 300002000. You can use one variable that is automatically updated when the mint function is called and records the number of tokens minted; while distributing tokens, check whether the distributed tokens are less than the total supply minus the minted tokens. Status: Issue fixed by developer. Medium severity issues:- 1. Use the Transfer event while burning tokens, as Etherscan reads Transfer events. Since burn does not currently emit a Transfer event, Etherscan cannot read the balance of an account that has burned all its tokens and still shows the previous balance of that address. This applies to both the burn() and burnFrom() functions of the token contract. Status: Issue fixed by developer. Low severity issues:- 1. The Solidity version must be fixed (always use the latest version). It should not be pragma solidity ^0.4.25; it should be pragma solidity 0.4.25; Status: Issue fixed by developer. The version should be fixed so that the development phase and deployment phase use the same Solidity version. 2. The access modifier of the SafeMath.sol contract functions can be changed to internal, as there is no need to make them public functions. Use the access modifier internal in place of public. Status: Issue fixed by developer. Issues found while checking the initial audit fixes Medium severity issues:- As you can see in the above picture, after minting 300 million tokens the total supply was 600 million and, when 1 million tokens are burnt from account 1, a Transfer event was emitted from the contract address to address(0), but it should be from the caller's address (msg.sender) to address(0); that is why the percentage held by a token holder appears higher than 100% of the total supply.
Status : Not Fixed Unit Testing Test Suite Contract: BTCcredit Token Contracts ✓ Should correctly initialize constructor values of BTCCToken Token Contract (250ms) ✓ Should check the Total Supply of BTCCToken Tokens (80ms) ✓ Should check the Name of a token of BTCC Token contract (79ms) ✓ Should check the symbol of a token of BTCCToken contract (64ms) ✓ Should check the decimal of a token of BTCCToken contract (74ms) ✓ Should check the balance of an Owner (66ms) ✓ Should check the owner of a contract (57ms) ✓ Should check the New owner of a contract (57ms) ✓ Should check the balance of a contract (64ms) ✓ Should check the distributed tokens of a contract (68ms) ✓ Should check the owner account is not frozen (66ms) ✓ Should check the minted tokens (52ms) ✓ Should Not be able to distribute tokens to accounts[1] by Non Owner account (154ms) ✓ Should Not be able to distribute tokens more then total supply (170ms) ✓ Should be able to distribute tokens to accounts[1] (257ms) ✓ Should check the distributed tokens of a contract after sent to account[1] (62ms) ✓ Should Not be able to Freeze Accounts[1] by non owner account (263ms) ✓ Should Freeze Accounts[1] (210ms) ✓ Should Not be able to transfer Tokens when account is freezed (97ms) ✓ Should Not be able to UnFreeze Accounts[1] by non owner account (94ms) ✓ Should UnFreeze Accounts[1] (201ms) ✓ Should be able to transfer Tokens after Unfreeze (340ms) ✓ Should Not be able to burn tokens when account doesn’t have tokens (96ms) ✓ Should be able to burn tokens (217ms) ✓ Should be able to emit transfer event while burn tokens (217ms) ✓ Should check the Total Supply of BTCCToken Tokens after token burnt ✓ Should Not be able to Mint Tokens by Non owner account (96ms) ✓ Should be able to Mint Tokens by owner only and send to accounts[3] (210ms) ✓ Should check the minted tokens after minted (44ms) ✓ Should Not be able to distribute tokens to accounts[6] less then total supply and minted tokens (127ms) ✓ Should be able to distribute tokens to accounts[6] less then total supply and minted tokens (192ms) ✓ Should check the Total Supply of BTCCToken Tokens after tokens minted (59ms) ✓ Should be able to transfer ownership of token Contract (160ms) ✓ Should check the New owner of a contract ✓ Should be able to Accept ownership of token Contract (124ms) ✓ Should check the New owner of a contract after ownership accepted ✓ should Approve address to spend specific token (155ms) ✓ Should be able to transfer Tokens approved by account[3] to account[5] (299ms) ✓ should check the allowance later ✓ Should be able to burn Tokens approved by account[3] to account[5] (162ms) ✓ Should be able to emit transfer event while burnFrom tokens (217ms) ✓ Should check the Total Supply of BTCCToken Tokens after tokens burn ✓ Should correctly initialize constructor values of sample Token Contract (110ms) ✓ Should check the balance of a sample contract Owner ✓ Should be able to transfer sample tokens to BTCCToken contract (219ms) ✓ Should be able to transfer sample tokens to owner of BTCCToken, sent to contract of BTCCToken (225ms) ✓ Should Not be able to send ethers to contract Contract: BTCCToken Token Contract for final audit testing at maximum values ✓ Should correctly initialize constructor values of BTCCToken Token Contract (99ms) ✓ Should check the Total Supply of BTCCToken Tokens (39ms) ✓ Should check the Name of a token of BTCC Token contract ✓ Should check the symbol of a token of BTCCToken contract ✓ Should check the decimal of a token of BTCCToken contract ✓ Should check 
the balance of a Owner ✓ Should check the owner of a contract ✓ Should check the New owner of a contract ✓ Should check the balance of a contract (57ms) ✓ Should check the distributed tokens of a contract ✓ Should check the owner account is not frozen ✓ Should check the minted tokens (38ms) ✓ Should be able to Mint Tokens by owner only more then total supply (128ms) ✓ Should check the minted tokens after minted ✓ Should check the Total Supply of BTCCToken Tokens after tokens minted ✓ Should be able to distribute tokens to accounts[1] (192ms) ✓ Should be able to self destruct (192ms) Final Result of Test: ✓ 64 Passing (8s) PASSED ❌ 0 Failed Manual Transactions Network : Rinkeby Contract Creation 0xe1293ac890cefe3ec24376ba736e3627c50357361b2b76e383827d06732b2fc6 Distribute tokens 0x9718970006fb25c2cd8060fcb995b150e407b9625c236e4b8f59b6b14c4ff366 Mint tokens 0xb31697d901e7933ba5636f1361a52a6df2242b34446ee36da61a3deb0b84d46c Fall back function (Should failed) 0x6e135a696036d6cbb0a8e5e6c6cafc066b4f59e62d5a9827ccb45ef44f2cd9d9 Distribute tokens more than token supply should be failed (1000 distributed first, 1000 minted, distributed 300mn + 1000 tokens) 0x7e51c9593df05d1d209cd3c4200d61b34924725f1fcd5a737da7d066dd3ea135 Freeze Account 0xee54d3bad0047483ead846b16548e2a9f819eb0dee80d380769a3bdaa889e63f Burn tokens(Failed when try to burn tokens that account have, because total supply is less then account holders token) 0xe73f81fe90892e1450c9294359e0de387bdb49df986aa432bd2654a22c0be396 Burn tokens (No transfer event when tokens burn) 0x03917d8c4562db959fb985713b6a0576d8eff63193843acbac686b62f368d05c Mint tokens 0xec6b306d78c8d68e11d5c95a10dffa044d8ccaa675ddb3f05b543add63e71faa Transfer function 0xaf3a913d6e59038897921c13b9d465cecb3d028ffb6d77b1ffb96ad87405ca9d Transfer Ownership 0xf8cc6713f3397f8e66b236ac57d7608d3ab4a1ddba0622915153c74f8beac2ef Accept ownership by non New owner account (Failed) 0x3f66afc8c0e2493a165e0560a238205cfc0a8fce3ff4525d577ee2632a57b5c6 Accept ownership 0xd65d67d585fe3485f969783529c8f0f6bef0ebfc2e42eef2c01c9df89328d457 Approve 0x497b1f7cc1a6e037dc222d58575df42f8974ad76121b0818ee2d1d4c0d7f0718 transferFrom fail by frozen account 0x7ae353f779ad671891f8095241d9488e521870ea7b4cf9eae228fa056f398dac BurnFrom 0xe8f058f6d95aa96049612b88d4354f5e88188a10cf90546341422a328981a379 Fallback function 0x8758fbcf020eba66173fd160a90d4831ff8b4f6e4ce4eba3155be092adb848fd Unfreezed account 0xfa1ff4f82112aab86419a91408ab51a2972772ba2b2fc7e392b57bce1cdc1868 Minted 300 min tokens to contract address 0xfcf0cf0b7173067d607f3f39c22dab64fe5d5301aa6c4bd4356d005314795726 Self destruct by non owner account 0xc63b634bb2a3041dd9ab9c6404b7ba2d4e4e928cfffe262d8bfe9c165c903f8a Self destruct by owner account 0x0aa1a1a76372ec4549ab96c8d8d393be04308c99dbb0252d3b9c03fc9adb4479 Tool Result : Above functions can be declared as an external functions Some of the variables are not mixed case variables Smart contract Description Coverage report Token A is a sample token contract to check transferAnyERC20Token() function of BTCCtoken contract function. Implementation Recommendations : _totalSupply variable can be declared as private variable, as totalSupply() function already using _totalSupply variable. Use SafeMath while subtracting totalSupply with balance[address(0)]. transferFrom() function should check allowance first. Use transfer event while initialising tokens in constructor, transfer event from address(0) to contract address and also update balance of contract. 
While distributing tokens, emit a Transfer event from the contract address to the beneficiary to keep all the balance events on Etherscan consistent. Comments: The use case of the smart contract is very well designed and implemented. Overall, the code is clearly written and demonstrates effective use of abstraction, separation of concerns, and modularity. The BTCCREDIT development team demonstrated high technical capability, both in the design of the architecture and in the implementation. All the severity issues have been fixed by the BTCCREDIT team, and we request that the BTCcredit team go through the recommendations as well to maintain the standardization of the code. Note: The smart contract code contains a self-destruct function which can be called by the owner only. Thanks for reading. Also do check out our earlier blog posts. QuillAudits is a secure smart contract audit platform designed by QuillHash Technologies. It is a fully automated platform to verify smart contracts and check for security vulnerabilities through its superior manual review and automated tools. We conduct both smart contract audits and penetration tests to find potential security vulnerabilities which might harm the platform's integrity. To be up to date with our work, join our community: Telegram | Twitter | Facebook | LinkedIn
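One way to restate the high-severity finding from earlier as a single invariant (our formulation, not wording from the auditors): at any point, distributedTokens + mintedTokens should not exceed totalSupply, so a further distribution of d tokens should only be allowed when d ≤ totalSupply - mintedTokens - distributedTokens. In the example from the report, an address ended up holding 300,002,000 tokens against a stated total supply of 300,001,000, which is exactly the kind of state such a check is meant to rule out.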
https://medium.com/quillhash/btccredit-smart-contract-audit-report-c9464deebeb0
['Abhishek Sharma']
2019-07-22 07:33:25.806000+00:00
['Blockchain', 'Smart Contract Auditing', 'Audit', 'Smart Contracts', 'Ethereum']
Ethereum Zero to Hero: Introduction
2017 has arguably been the year of cryptocurrency, with Bitcoin getting most of the spotlight; at the core of Bitcoin, we have the blockchain. Blockchain technology applications go way beyond digital currency, and one of the best examples is Ethereum, which is a decentralized platform that runs smart contracts. This allows developers to build enormously powerful decentralized applications. At this point there is still a lot of active development and innovation happening around both blockchain and Ethereum. The downside of all this constant innovation and development is that tutorials, documentation, and resources go out of date quickly, and this has made it difficult for developers like me (or you) to get a solid footing when getting started. This guide is not meant to be the end-all or be-all, but rather a quick introduction that can get you started quickly and is hopefully agnostic enough that it won't go out of date too fast. With that said, let's get started by reviewing some core concepts: Smart Contracts Contracts live on the blockchain in an Ethereum-specific binary format (EVM bytecode). A smart contract is a piece of software that resides on the Ethereum blockchain. Like traditional contracts, smart contracts not only define the rules and penalties around an agreement but also enforce those obligations. Ethereum Virtual Machine At the heart of it is the Ethereum Virtual Machine ("EVM"), which can execute code of arbitrary algorithmic complexity. In computer science terms, Ethereum is "Turing complete." This is the core and primary innovation behind the Ethereum project. Each participant of the Ethereum network runs an instance of the virtual machine, and its purpose is to execute smart contracts in a completely isolated environment, meaning no access to the network, filesystem or other processes. Gas Gas is a concept unique to the Ethereum platform and is a way to limit the resources available to a given smart contract. For every instruction executed in the EVM, there is a fixed gas cost associated with it. Solidity Solidity is a contract-oriented, high-level language for implementing smart contracts. The syntax resembles JavaScript and is influenced by languages like C++ and Python, and it compiles directly to EVM assembly. Blockchain "The blockchain is an incorruptible digital ledger of economic transactions that can be programmed to record not just financial transactions but virtually everything of value." — Don & Alex Tapscott, authors of Blockchain Revolution (2016) The best way to think about the blockchain is as a decentralized, immutable database or ledger that can permanently store any type of data. The potential business applications for this technology are still being discovered and experimented with, but there are tons of examples online; to mention a few, we have: Crowdfunding Governance File Storage Protection of Intellectual Property Identity Management Property Registration In the next entry in this series we will set up a local development environment for creating our first smart contract. This article was originally posted on my own site.
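As a quick worked illustration of how gas translates into a transaction fee (not from the original article; the gas price is an assumed example value, while 21,000 gas is the fixed cost of a plain ETH transfer): fee = gas used × gas price = 21,000 × 20 gwei = 420,000 gwei = 0.00042 ETH. A contract call that executes more EVM instructions consumes more gas and therefore costs proportionally more at the same gas price.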
https://medium.com/hackernoon/ethereum-zero-to-hero-introduction-4a2930d636a
['Allan Macgregor']
2017-12-25 14:29:28.974000+00:00
['Blockchain', 'Cryptocurrency', 'Programming', 'Ethereum']
You Are Not ‘Working From Home’
You Are Not ‘Working From Home’ You’re at home, during a crisis, trying to work Photo: Johner Images/Getty Images Advice about working from home has been plentiful lately, including from us here at Forge: Establish a routine, organize your desk, take breaks, wear pants. But let’s make one thing clear: At this moment, you are not just “working from home.” You are “at your home, during a crisis, trying to work.” That’s how the bosses at the Canadian federal agency Parks Canada put it recently, expanding a tweet that went viral in March with a set of principles for working remotely under Covid-19. Those six guidelines were shared by an employee at an adjacent department of the government in a now-viral tweet. The principles highlighted something millions of people around the world are feeling: Everything is hard right now. It’s hard for the legion of working parents balancing jobs with childcare, homeschooling, and the ongoing demands of running a household. It’s hard for single people, alone for weeks or months on end. It’s particularly hard for women, who disproportionately shoulder the burden at home and at work. The Canadian government’s guidelines are an acknowledgment of several fundamental and self-explanatory truths about life right now: None of us is working at 100% or being our best self. Each of us is dealing with a different set of challenges, and our ability to “cope” varies widely. Now is not the time to insist upon business as usual. But while many companies and managers are trying to tend to their workers’ well-being, basic respect for humans over productivity — laid out so forcefully in the Canadian memo — is far from standard, as many of the replies to the tweet indicated. And at a time when layoffs and furloughs are everywhere we look, many who are lucky enough to have jobs are feeling anxious about keeping them, and doing all they can to prove their worth, sometimes overworking and burning out in the process. It’s a coping mechanism that capitalism and the cult of efficiency taught us long before the pandemic struck. The pressure to “keep it together” at work, at all costs, is nothing new. And of course, it’s reassuring to pretend that if we only hustle hard enough, everything will work out for us. That attitude has come to define modern workplace culture: Produce more, achieve more, innovate harder, and you won’t get left behind. It’s not treading water; it’s passion for a purpose. Despite our tremendous ability to adapt and grow as a species, we’re not immune to the conditions of the world around us. We never have been. If there’s any comfort to be had from a global catastrophe, it’s that it has shown the cracks in the facade of our status quo. We can try and achieve “peak performance,” but the fundamental processes and rhythms of our bodies, minds, and relationships keep getting in the way. We are only human. “Normal” was never normal, and once this is over, we shouldn’t forget that.
https://forge.medium.com/you-are-not-working-from-home-429ff71c7f2b
['Kelli María Korducki']
2020-05-15 11:50:31.572000+00:00
['Work Life Balance', 'Coronavirus', 'Succeed', 'Work', 'Working From Home']
Parts of the Brain that Predict the Future
Many parts of the brain work together to predict the future based on events that have already occurred. We mainly anticipate events and make predictions with two significant parts of the brain. A basic example of your brain predicting the future is when a popular song comes on, your brain knows the tune, and you bob your head along with it. The brain works as a whole to infer the future and interpret the past. Many scientists believe that the brain is actually split between the parts that collect information from the past and the parts that predict the future. The two sections of the brain that work together to predict the future are the basal ganglia and the cerebellum. A rhythm-based system in your head is what makes these predictions. When your brain makes a prediction, it is basically spotting a coincidence of timing. The brain uses past events to create an image of the future based on the timing of the brain's ideas. Many people who make predictions about the future are also very open-minded and able to change their opinion based on newly presented information. A good prediction comes from interpreting all the events and really scanning the situation. Making predictions is not a talent; it is a skill that many people have learned to use throughout the years. Anyone can learn how to predict the events that will happen in the future based on the timing of past events.
https://medium.com/@217ere15/parts-of-the-brain-that-predict-the-future-18c52ca7ff1a
['Emily Etzig']
2019-11-15 14:35:03.227000+00:00
['Brain', 'Predictions', 'Science']
How To Beat the Monday Blues
11 strategies to help you start to enjoy your Mondays! It's typically called the Monday Blues: the beginning of a new week, time to get up early once more and join the rat race. People, expectations, and obligations abound. Ugh! It's enough to make you want to hide under your blankets until Friday. For me, though, Monday mornings are among the best moments of the week. Mondays are my favorite days because they allow me to start new goals. Monday is the perfect day to start over. If the previous week was not as gratifying and goals were not met, Monday is the perfect day to pick yourself up and begin again. Even waking up on a Monday morning is a gift. You've been offered a second chance at life. You've got another twenty-four hours to get things done. You have a second chance to make things happen. So don't be concerned about Mondays. Let us look forward to Monday as a fresh start. Here are 11 ideas to help you beat (or avoid) the Monday blues: 1. Determine the issue The first step is to figure out what's wrong. If you get the Monday Blues regularly, you should not laugh it off or accept it as a part of life. It's a clear indication that you're unhappy at work and need to either repair it or find another job. Make a list of the things that are causing you to struggle at work. Perhaps it's a hostile coworker or a Monday morning conference with your manager, or maybe it's a lack of challenge — or maybe it's all of the above. In either instance, identifying what is hurting you can help you be proactive in seeking solutions. It's a means of empowering you to take care of the issue and try to fix it. If you only have a slight case of the Monday Blues, there are a few things you can do to effectively brighten yourself and others on a rather gloomy Monday. 2. On Friday, get ready for Monday Mondays can be particularly stressful because of the work that has stacked up over the prior week, and it can be difficult for many to jump right back in. I recommend leaving as few terrible jobs as possible for Monday by clearing them on Friday afternoon, to help battle Monday morning nervousness. By taking care of the things you don't want to deal with after one workweek, you're improving the beginning of the next. If you have any menial tasks on Monday morning, complete them as soon as possible, so you don't waste the balance of the day postponing or feeling like there's a black cloud hanging over your head. Make that awkward phone call, fix that unresolved issue, or clean up the mess that's been waiting for you. Once it's over, you'll feel a lot better. It is imperative to make sure your calendar is up to date and synced and that you have a solid picture of, and a handle on, your forthcoming work week — especially Monday. What do you need to get ready for and organize? Get it done by Friday, if feasible, otherwise by Sunday. 3. Create a list of the things that make you happy When we look at the week ahead, we frequently think about all the challenging things we have to accomplish and the arduous duties that lie ahead of us. Switch that around. On Sunday evening, make a list of three things you're looking forward to at work this week. It may put you in a more upbeat frame of mind. If you can't think of positive things you're looking forward to, you need to make some adjustments. 4. Turn off all electronics for the weekend If possible, avoid reading work e-mail or voicemail over the weekend, especially if you won't be able to respond until Monday.
It's tempting to know what's waiting for you, but setting firm boundaries between work and leisure time might help you stay on track. Leave your office worries at the office door on Friday and concentrate on enjoying your time off. Going back to work on Monday can be particularly aggravating if you've allowed it to infiltrate your free time to the point where you don't feel like you've had a weekend at all. 5. Get plenty of rest and get up early I recommend going to bed a little earlier on Sunday night and getting enough sleep so that you wake up feeling refreshed. It's unlikely that you'll feel like going anywhere when the alarm goes off on Monday morning if you've only had a couple of hours of sleep. While it may seem counterproductive, getting up 15 to 30 minutes earlier on Monday morning can make returning to work much more manageable. Having a bit more "me time" rather than feeling pressed for time can make that shift a little easier. Taking the time to have a good breakfast, work out, or walk the dog can help you feel more centered for the rest of the day and remind you that you aren't a machine that only sleeps and works. 6. Dress to impress Dress up, perk up, and turn up ready to be cheerful and encourage others to be positive. Be the brightness and vitality that brighten people's days. Make yourself magnetic by displaying and sharing your spirit, charisma, and vibe. One piece of advice: wear your favorite new clothes on Monday. It can help you gain confidence at work and may even earn you a few compliments from coworkers. When you look great, you feel fabulous. On Monday mornings, feeling good about yourself is half the battle, because instead of getting depressed by work, you want to approach it with determination. 7. Maintain an attitude of gratitude Begin the week with an attitude of thankfulness. Take time to acknowledge and appreciate the aspects of your job that you enjoy. It begins even before you start working. Listen to your favorite music to get pumped up on your way to work. Consider the type of exercise playlist you'd make, and include that same lively, high-energy music in your morning routine or commute. Do your utmost not to whine when you get to work, and keep your Monday morning crankiness to yourself. In a similar spirit, don't listen to other people's Monday complaints. It's impossible to fix your mindset by creating or participating in a complaint culture. You must decide to transform negative apprehension and trepidation into a positive, productive, and happy welcome-to-Monday attitude. Start on Friday by organizing your desk and making sure your work to-do list is ready for next week. Relax, assess, and reward yourself on Sunday, but also schedule and prepare for Monday. If you can be a source of happiness in the office, you'll not only make your day more enjoyable, but you'll also improve the work environment for the people around you. 8. Make another person happy. Promise yourself to do something good for someone else as soon as you get to work on Monday. Doing pleasant things for others can improve one's spirits, and in this situation, it may even shift the entire vibe in your office. Paying it forward can have a positive impact on everyone. Positive psychology studies show that one of the best ways to feel better is to make someone else happy. You may compliment a colleague, do something nice for a client, offer assistance to a stranger on the street, or find another way to brighten someone else's day. 9.
Keep Monday's plan as light as possible Given that Mondays are generally hectic days in the office, keeping your Monday plan as straightforward as possible is a wise tactic. When planning ahead, try to schedule meetings for Tuesdays and Wednesdays. It will make it easier for you to transition from the weekend to Monday. Rather than addressing the most complex and time-consuming activities first thing Monday, devote some time to other, more routine duties. It might get you going and give you the energy you need for the more arduous chores. But please be careful: if you have too much free time, you'll find yourself "feeling blue." 10. Have a good time at work On Monday, take it upon yourself to do activities you enjoy at work. Perhaps bring pastries for your coworkers or take a little break to meet up with a coworker. Sharing weekend experiences with coworkers can be entertaining as well as a fantastic way to improve your interoffice network. Arrange a weekly Monday morning coffee or lunch with a friend. Create a Monday event you'll look forward to as a way to break up the day with some welcome optimism. At the very least, it allows you to take a deep breath, talk with a friend, and refocus for the remainder of the day. 11. Make a post-work schedule. Your day should be about more than just getting through Monday. It should be about looking forward to something. By making Monday a memorable day when you get to hang out with friends, cook your favorite supper, or eat a bucket of popcorn while watching a TV show you taped, the day doesn't have to be all about waking up to go to work. And you? Do you have any tips and tricks to beat the Monday blues?
https://medium.com/@helena_calderon/how-to-beat-the-monday-blue-707ef7d74b31
['Helena Calderon']
2021-12-14 14:36:20.237000+00:00
['Motivational', 'Work Life Balance', 'Tips And Tricks', 'Monday Motivation', 'Mondays']
2018 Politics — Can America Start Listening Again?
In 2018, politics in America is totally polarized, and views are entirely contained in both conservative and liberal echo chambers without the allowance for opposing dialogue or debate. From the right, if you disagree with their opinions then you are immediately labeled a ‘snowflake’ or a ‘libtard,’ and from the left, you are immediately branded a ‘fascist’ or ‘racist’ without fail. I have slowly removed myself from the political debate because it is no longer a dialogue of opposing views trying to find common ground to make everything better as a whole, but it is a cesspool of namecalling and hatemongering in the name of that group’s views. I have asked myself on several occasions; what would it take to get people from both sides to turn off the noise of hate and anger and turn on understanding and empathy at the other side’s opinions? The answer has to be real communication and conversation that will never occur in the 140 character soundbite world of Twitter or the 15-second soundbite of the news media. But, a real discussion between both sides will only happen when they meet to talk face to face and agree to do so without the angst and anger that so often pervades today’s politics. Can America come together and really start communicating? Or, as a Nation, have we gone too far down the path of quick to anger and ready to block social media echo chambers where only our opinion matters and that is the only opinion we want to hear? Before I answer those two questions, I want to look at the many things that have changed during my lifetime as they pertain to the world we live and the way we communicate. In 1970, there were only three national broadcast networks for news, CBS, NBC, and ABC. Many local and national newspapers shared the print news twice daily, morning and afternoon. Our access to information was limited to those sources, and they seemed to be reasonably reliable sources with Walter Cronkite being the most-trusted anchormen in the country during that time. The nation believed he was giving us the facts and only facts as he knew them without slanting them towards one side of the political spectrum or the other. But, the late sixties and early seventies were also a time of great political upheaval with the protests against the Vietnam War breaking out on college campuses and the scandal of Watergate threatening to tear the fabric of the nation as the President’s integrity was called into question. Did these events divide the country along political boundary lines? Yes, for a time, but without the 24-hour social media and instant information of the Internet; echo chambers were not constructed, and the two sides were forced to either have face-to-face debate, a snail-mail letter exchange, or telephone conversations to communicate their views. With the skill of critical listening still being practiced by most Americans at that time, the other side’s logical argument could be understood, and both sides could come to an understanding whether they agreed or not; their stance was respectfully received. In 2018, do people have the critical listening ability that previous generations practiced so well? No, listening to the other person is a lost art because we want only our opinion heard and we do not care about the other person’s opinion or their point of view because, in our own tunnel-visioned world, only our stance matters. 
In today’s echo chamber of not only politics but life, just MY opinion matters and if you don’t agree with MY opinion, I will shout you down with hate-filled rhetoric or flame you on social media with thirty-seven memes that shout my opinion in an attempt to ultimately drown out any opposition. If that does not work, we resort to blocking that individual we do not agree with instead of actually taking the time to critically listen and understand their point of view. To answer the questions I posed, I think the nation as a whole has gone too far down the path of instant gratification and instant echoing of our own views that we are mentally unable to critically listen and have an honest, logical, and rhetoric-free discussion. To break the cycle for myself, the next time I find myself in a political debate I will make an effort to listen more critically and attempt to understand the other person’s point of view. And, if I disagree with them, I would hope that person would provide me the same courtesy because only by listening and attempting to understand will we break the echo-chamber cycle that is 2018 politics and interpersonal relationships.
https://stancromlishauthor.medium.com/2018-politics-can-america-start-listening-again-b0260616c209
['Stan Cromlish']
2018-08-04 21:22:06.654000+00:00
['Politics', 'Social Listening', 'Psychology', 'Mental Health', 'Debate']
Maintainability — Logs & Their Use
Maintenance 102 Maintainability — Logs & Their Use An application that logs properly will make maintenance easier. In the first chapter, we saw how to handle exceptions and how to log them to: not leak sensitive data to the user; have user-friendly messages displayed for your users; have all the details for easy troubleshooting. In this second article, we will see what needs to be logged besides exceptions, how to log it, and why we are logging it. The examples in the article are in PHP, using the PSR-3: Logger Interface, but the logging concepts we will discuss are common to all languages. What to Log & How to Log To decide what needs to be logged, we must first decide why we are logging the information. Most often we need to log information to make diagnostics easier. For example, if an API returns an error we will log the request & response. We also log for debugging purposes. For example, we might log all API requests & responses in order to debug the API endpoint. We can also log for statistics; this can be used for various reasons. Logging for Diagnostics The most important logs for diagnostics are the exception logs; those we have already discussed previously. But we can also log other "errors", for not all errors are exceptions. One of the best examples of this is API calls; there are other cases as well: a missing file, an empty database query, and so on. The cases depend upon your application. Let's see an example with an API call: <?php $response = $this->httpClient->get($endpoint, $query); if ($response->getStatusCode() != 200) { $this->logger->warning( "Failed to fetch test data from test api", [ "endpoint" => $endpoint, "query" => $query, "status_code" => $response->getStatusCode(), "body" => $response->getBody() ] ); throw new MyApiException("Failed to fetch data from test api"); } In this example, we logged everything we need to make the same API call locally to check what is going on. We also used a unique log message so that if we find the log message in our logs we can find the code that logged it. So the point here is to: Log what you need to reproduce the error locally and can't get by any other means. Log what you believe will allow a quick diagnosis (the body). Log a unique message that has meaning. Do not log every step of the code! You need to keep in mind not to log sensitive data such as passwords. Logging for Debugging Logging for debugging is very similar to logging for diagnostics. If you have APIs that are very sensitive and you can't easily make API calls locally (payments, for example), or you do not trust the API, it can be a good idea to log everything. Let's add debugging data to our previous example: <?php $this->logger->debug( "Fetching test data from test api", [ "endpoint" => $endpoint, "query" => $query, ] ); $response = $this->httpClient->get($endpoint, $query); if ($response->getStatusCode() != 200) { $this->logger->warning( "Failed to fetch test data from test api", [ "endpoint" => $endpoint, "query" => $query, "status_code" => $response->getStatusCode(), "body" => $response->getBody() ] ); throw new MyApiException("Failed to fetch data from test api"); } $this->logger->debug( "Fetched test data from test api", [ "endpoint" => $endpoint, "query" => $query, "body" => $response->getBody() ] ); Here we have added 2 additional logs: The first log is before making the call to the API.
We log all the information we have at that point to be able to execute the same call locally if possible, or to compare it with working calls. The second log has the response; if we are working with payments, this might be the only way to see a complete response. (Some payment methods have additional information in production that they don't have in the sandbox.) So the point here is to: Log what we do. Log the result. As in the previous case, you need to keep in mind not to log sensitive data such as passwords.
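The article lists logging for statistics as a third use case but does not show it, so here is a minimal PSR-3 sketch of what such a log could look like. The event name, the $order object, and its getters are hypothetical examples, not part of the original code.

<?php
// Minimal sketch: logging a business event so it can later be counted and aggregated.
// $order and its getters are hypothetical; adapt them to your own domain objects.
$this->logger->info(
    "Order placed",
    [
        "order_id"     => $order->getId(),
        "total_amount" => $order->getTotal(),
        "currency"     => $order->getCurrency(),
        "payment_type" => $order->getPaymentType(),
    ]
);

Because the context is structured, a log aggregator can later count orders per day or per payment type without parsing free-form text. As with the other examples, keep sensitive data out of the context.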
https://medium.com/swlh/maintainability-logs-their-use-6ac3bb235e31
['Oliver De Cramer']
2020-09-24 16:29:14.339000+00:00
['Coding', 'PHP', 'Monolog', 'Programming', 'Web Development']
Why do we make Qurbaani?
Nabi Ebrahim (‘alaihis salaam) was a very great and special Nabi — he was known as ‘Khaleelullah’ which means ‘the special friend of Allah Ta‘ala’. Nabi Ebrahim (‘alaihis salaam) always wanted to have children, but for years and years and years, he did not have any children. Finally, when Nabi Ebrahim (‘alaihis salaam) had waited for a long, long time, and he was now 86 years old, Allah Ta‘ala blessed him with a beautiful baby boy — Nabi Ismaa‘eel (‘alaihis salaam). A few years later, when Nabi Ismaa‘eel (‘alaihis salaam) was no longer a small baby, but was now a young boy, running around, playing and helping his father, Nabi Ebrahim (‘alaihis salaam) had a dream. In his dream, he saw that he took a knife, placed it on the throat of his beloved son, Nabi Ismaa‘eel (‘alaihis salaam) and slaughtered him! Nabi Ebrahim (‘alaihis salaam) knew that the dream of a Prophet is not just a dream — it is a message and a command from Allah Ta‘ala. So, through the dream, Nabi Ebrahim (‘alaihis salaam) knew that Allah Ta‘ala wanted him to slaughter his son. Nabi Ebrahim (‘alaihis salaam) was always ready and happy to do whatever Allah Ta‘ala told him, so he was ready to slaughter his son, even though he loved him so much. But, he wanted to see if his son was also ready to make Allah Ta‘ala happy and do what Allah Ta‘ala told them to do. Nabi Ebrahim (‘alaihis salaam) went to his son, Nabi Ismaa‘eel (‘alaihis salaam) and told him about the dream. When he heard the dream, Nabi Ismaa‘eel (‘alaihis salaam) immediately said, “O my father! If this is what Allah Ta‘ala wants you to do, then you must do it!” Nabi Ebrahim (‘alaihis salaam) and Nabi Ismaa‘eel (‘alaihis salaam) started walking together to the place where Nabi Ebrahim (‘alaihis salaam) would slaughter him. But as they were going, Shaitaan came to them! Evil Shaitaan did not want them to make Allah Ta‘ala happy, so he tried to stop Nabi Ebrahim (‘alaihis salaam) from going to slaughter Nabi Ismaa‘eel (‘alaihis salaam). Shaitaan tried three times, but every time, Nabi Ebrahim (‘alaihis salaam) took seven stones and threw them at shaitaan, making him run away. The places where he threw these seven stones are called the jamaraat, and when people go for hajj, they also throw seven stones at these places, to remember Nabi Ebrahim (‘alaihis salaam). Finally, when they reached the place of slaughter, Nabi Ismaa‘eel (‘alaihis salaam) lay down on the ground. Then, Nabi Ebrahim (‘alaihis salaam) picked up the knife and put it on his throat. He loved his son so much, but he had to listen to Allah Ta‘ala and do what Allah Ta‘ala said. Nabi Ebrahim (‘alaihis salaam) started to press the knife onto the throat of Nabi Ismaa‘eel (‘alaihis salaam), but Allah Ta‘ala stopped the knife from cutting him and he was safe! Then Allah Ta‘ala told Nabi Ebrahim (‘alaihis salaam) that He did not really want Nabi Ebrahim (‘alaihis salaam) to slaughter his son. Rather, Allah Ta‘ala wanted to test Nabi Ebrahim (‘alaihis salaam) and see if he loved Allah Ta‘ala more or he loved his son more. If he loved Allah Ta‘ala more, then he would listen to Allah Ta‘ala and slaughter his son. Nabi Ebrahim (‘alaihis salaam) passed the test and showed Allah Ta‘ala that he loved Allah Ta‘ala more than anything else in the world. Allah Ta‘ala was very happy with him and sent a sheep from Jannah to Nabi Ebrahim (‘alaihis salaam). 
Nabi Ebrahim (‘alaihis salaam) slaughtered the sheep, and that is why every year, Muslims make qurbaani and slaughter animals — so we can remember Nabi Ebrahim (‘alaihis salaam) and how he always listened to Allah Ta‘ala. Lessons 1. We must always listen to Allah Ta‘ala and do what Allah Ta‘ala tells us to do. 2. We must love Allah Ta‘ala more than we love anything else. 3. We must not try and copy Nabi Ebrahim (‘alaihis salaam) and slaughter someone, because we are not the Nabi of Allah Ta‘ala. Knives are dangerous, so we must not play with knives.
https://medium.com/naseeha-channel/why-do-we-make-qurbaani-36bb42f8b1a6
['Tālib Al Ilm', 'An Epistemophile']
2020-07-21 17:25:05.584000+00:00
['Children', 'Tips', 'Muslim', 'Advice', 'Islam']
Good to know when you are learning about backend!
Good to know when you are learning about backend! Here you go! These are some (simplified) facts that are good to know when trying to understand backend development: #API, #POSTMAN and #HTTP status codes. Photo by NASA on Unsplash An API is the layer between the user and the data. You could look at it like a bank worker, trusted and trained :) it performs logic and gives back information. It's also a service that listens, acts and responds. API routes: An incoming HTTP request is routed to a particular action method on a Web API controller. A common approach when creating routes is REST, "Representational State Transfer". Read more about RESTful routes from the link here. It could look like this: get /users (list users) post /users (create a user) get /users/123 (get a specific user) put /users/123 (edit that user) delete /users/123 (of course, this deletes the user) Postman is simply a testing tool to use when building backend and frontend. Postman lets you test what to expect and plan how to use the data. HTTP status codes HTTP response status codes indicate whether a specific HTTP request has been successfully completed. The 2xx range indicates a successful response. The 3xx range is used for redirections. The 4xx range is used for client errors. The 5xx range is used for server errors. Let's look closer at status code 400. The 4xx class of status codes is intended for situations in which the error seems to have been caused by the client and the server receiving the request can't understand it. If you get the 400 Bad Request message, perhaps you've mistyped a URL. For example, when you try to upload a file that's too big to some sites, you might get a 400 error instead of an error letting you know about the maximum file size. Website designers can customize how a 400 error looks, so you might see different-looking 400 pages on different websites. Like this, for example: Dogs explaining the mysteries of HTTP status codes… Thank you and have a lovely day!
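To tie the routes and status codes together, here is a minimal sketch of a REST-style endpoint in plain PHP (no framework). The /users route and the in-memory data are made up for illustration; you could point Postman at GET /users/123 to see the 200 and 404 cases.

<?php
// Minimal sketch: a REST-style endpoint returning JSON and HTTP status codes.
// The route and the in-memory "database" below are made up for illustration.
$users = [123 => ["id" => 123, "name" => "Ada"]];

$method = $_SERVER["REQUEST_METHOD"];
$path   = parse_url($_SERVER["REQUEST_URI"], PHP_URL_PATH);

header("Content-Type: application/json");

if ($method === "GET" && preg_match("#^/users/(\d+)$#", $path, $matches)) {
    $id = (int) $matches[1];
    if (isset($users[$id])) {
        http_response_code(200); // success
        echo json_encode($users[$id]);
    } else {
        http_response_code(404); // client error: user not found
        echo json_encode(["error" => "User not found"]);
    }
} else {
    http_response_code(400); // client error: unsupported route or method
    echo json_encode(["error" => "Unsupported route or method"]);
}

Assuming the file is saved as index.php, you can run it with PHP's built-in server (php -S localhost:8000 index.php) and test the routes from Postman.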
https://medium.com/@emmieloheden/good-to-know-when-you-are-learning-about-backend-4a5ac205fb11
['Emmie Loheden']
2020-05-05 16:41:07.806000+00:00
['Postman', 'Http Request', 'Simple', 'Backend Development', 'API']
8 Skill yang Harus Dimiliki Mahasiswa Sebelum Masuk Dunia Kerja
https://medium.com/g%C3%B6%C3%B6p-kampus/8-skill-yang-harus-dimiliki-mahasiswa-sebelum-masuk-dunia-kerja-c4e055331f16
['Gööp Kampus']
2020-11-27 07:15:10.389000+00:00
['Mahasiswa', 'Indonesia', 'Lowongan Kerja', 'Skills', 'Wisuda']
Securing GitLab + Docker CI Pipelines
October 21, 2020 Intro Continuous integration (CI) jobs often require interaction with Docker, either for building Docker images and/or deploying Docker containers. One of the most popular DevOps tools for CI is GitLab, as it offers a complete suite of tools for the DevOps lifecycle. While GitLab is an excellent tool suite, it offers weak security when running CI jobs that require interaction with Docker. Now, you may ask, why do I need security for my CI jobs? Because the security weaknesses I describe allow CI jobs to perform root level operations on the machine where the job executes, thus compromising the stability of the CI infrastructure and possibly beyond. This article explains these security issues and shows how the new open-source Sysbox container runtime, developed by Nestybox, can be used to harden the security of these CI jobs while at the same time empowering users to create powerful CI configurations with Docker. TL;DR The article is a bit long as the first half gives a detailed explanation of the security related problems for GitLab jobs that require interaction with Docker. If you understand these problems already, you may want to jump directly to section Solution: Using Docker + Sysbox. Contents Security Problems with GitLab + Docker It is common for CI jobs to require interaction with Docker, often to build container images and/or to deploy containers. These jobs are typically composed of steps executing Docker commands such as docker build , docker push , or docker run . In GitLab, CI jobs are executed by the “GitLab Runner”, an agent that installs on a host machine and executes jobs as directed by the GitLab server (which normally runs in a separate host). The GitLab runner supports multiple “executors”, each of which represents a different environment for running the jobs. For CI jobs that interact with Docker, GitLab recommends one of the following executor types: The Shell Executor The Docker Executor Both of these however suffer from weak security for jobs that interact with Docker, meaning that such jobs can easily gain root level access to the machine where the job is executing, as explained below. Security issues with the Shell Executor When using the shell executor, the CI job is composed of shell commands executed in the same context as the GitLab runner. The diagram below shows the context in which the job executes: A sample .gitlab-ci.yaml looks like this: build_image: script: - docker build -t my-docker-image . - docker run my-docker-image /script/to/run/tests The shell executor is powerful due to the flexibility of the shell, but it’s unsecure for Docker jobs: it requires the GitLab runner be added to the docker group, which in essence grants root level access to the job on the runner machine. For example, the CI job could easily take over the runner machine by executing a command such as docker run --privileged -v /:/mnt alpine <some-cmd> . In such a container, the job will have unfettered root level access to the entire filesystem of the runner machine via the /mnt directory in the container. The shell executor also suffers from a couple of other functional problems: 1) The job executes within the GitLab runner’s host environment, which may or may not be clean (e.g., depending on the state left by prior jobs). 2) Any job dependencies must be pre-installed into the runner machine a priori. For these reasons, developers often prefer the GitLab Docker Executor, but unfortunately it’s also unsecure (see next section). 
Security issues with the Docker Executor When using the Docker executor, the CI job runs within one or more Docker containers. This solves the functional problems of the shell executor described in the prior section because you get a clean environment prepackaged with your job’s dependencies. However, if the CI job needs to interact with Docker itself (e.g. to build Docker images and/or deploy containers), things get tricky. In order for such a job to run, the job needs access to a Docker engine or to use a tool like Kaniko that enables building Docker images without a Docker engine. Kaniko works well if the CI job only needs to build Docker container images from a Dockerfile. But it does not help if the job needs to perform deeper interactions with Docker (e.g., to run containers, to run Docker Compose, to deploy a Kubernetes-in-Docker cluster, etc.) For CI jobs that need to interact with Docker, the job needs access to a Docker engine. GitLab recommends two ways to do this: 1) Binding the host’s Docker socket into the job container 2) Using a Docker-in-Docker (DinD) “service” container Unfortunately, both of these are unsecure setups that easily allow the job to take control of the runner machine, as described below. Binding the host Docker Socket into the Job Container This setup is shown below. A sample .gitlab-ci.yaml looks like this: image: docker:19.03.12 build: stage: build script: - docker build -t my-docker-image . - docker run my-docker-image /script/to/run/tests As shown in the diagram above, the Docker container running the job has access to the host machine’s Docker daemon via a bind-mount of /var/run/docker.sock . To do this, just must configure the Gitlab runner as follows (pay attention to the volumes clause): [[runners]] url = "https://gitlab.com/" token = REGISTRATION_TOKEN executor = "docker" [runners.docker] tls_verify = false image = "docker:19.03.12" privileged = false disable_cache = false volumes = ["/var/run/docker.sock:/var/run/docker.sock", "/cache"] This is the so-called “Docker-out-of-Docker” (DooD) approach: the CI job and Docker CLI run inside a container, but the commands are executed by a Docker engine at host level. From a security perspective, this setup is not good: the container running the CI job has access to the Docker engine on the runner machine, in essence granting root level access to the CI job on that machine. For example, the CI job can easily gain control of the host machine by creating a privileged Docker container with a command such as docker run --privileged -v /:/mnt alpine <some-cmd> . Or the job can remove all containers on the runner machine with a simple docker rm -f $(docker ps -a -q) command. In addition, the DooD approach also suffers from context problems: the Docker commands are issued from within the job container, but are executed by a Docker engine at host level (i.e., in a different context). This can lead to collisions among jobs (e.g., two jobs running concurrently may collide by creating containers with the same name). Also, mounting files or directories to the created containers can be tricky since the contexts of the job and Docker engine are different. An alternative to the DooD approach is to use the “Docker-in-Docker” (DinD) approach, described next. Using a Docker-in-Docker Service Container This setup is shown below. A sample .gitlab-ci.yaml looks like this: image: docker:19.03.12 services: - docker:19.03.12-dind build: stage: build script: - docker build -t my-docker-image . 
- docker run my-docker-image /script/to/run/tests As shown, GitLab deploys the job container alongside a “service” container. The latter contains within it a Docker engine, using the “Docker-in-Docker” (DinD) approach. This gives the CI job a dedicated Docker engine, thus preventing the CI job from accessing the host’s Docker engine. In doing so, it prevents the collision problems described in the prior section (though the problems related to mounting files or directories remain). To do this, you must configure the GitLab runner as follows (pay attention to the privileged and volumes clauses): [[runners]] url = "https://gitlab.com/" token = REGISTRATION_TOKEN executor = "docker" [runners.docker] tls_verify = true image = "docker:19.03.12" privileged = true disable_cache = false volumes = ["/certs/client", "/cache"] The volumes clause must include the /certs/client mount in order to enable the job container and service container to share Docker TLS credentials. But notice the privileged clause: it's telling GitLab to use privileged Docker containers for the job container and the service container. This is needed because the service container runs the Docker engine inside, and normally this requires unsecure privileged containers (though there is now a solution to this as you'll see a bit later). Privileged containers weaken security significantly. For example the CI job can easily control the host machine’s kernel by executing a docker run --privileged alpine <cmd> where <cmd> will have full read/write access to the machine's /proc/ filesystem and thus able to perform all sorts of low-level kernel operations (including turning off the runner machine for example). Solution: Using Docker + Sysbox There is now a way to secure GitLab CI jobs that require interaction with Docker: using the new Sysbox container runtime. Sysbox is an open-source container runtime that sits below Docker (it’s a new “runc”) and enables deployment of containers that are capable of running systemd, Docker, and even Kubernetes inside, easily and securely. That is, a simple docker run --runtime=sysbox-runc <container_image> creates a container capable of running Docker inside natively, without any special images or custom container entry-points, and most importantly, with strong container isolation. By using GitLab with Docker + Sysbox, the CI pipeline security issues described in the prior sections can be resolved, as described next. There are a couple of approaches to do this, as described below. A Secure DinD Service Container The first approach is to use the Docker-in-Docker (DinD) service container described previously, but create the DinD container with Docker + Sysbox, as shown below. As shown, the runner machine has the GitLab runner agent, Docker, and Sysbox installed. The goal is for the GitLab runner to execute jobs inside containers deployed with Docker + Sysbox. This way, CI jobs that require interaction with Docker can use the Docker-in-Docker service container, knowing that it will be properly isolated from the host (because Sysbox enables Docker-in-Docker securely). In order for this to happen, one has to configure the GitLab runner to select Sysbox as the container “runtime” and disable the use of “privileged” containers. Here is the runner’s config file (at /etc/gitlab-runner/config.toml ). 
Pay attention to the privileged and runtime clauses: [[runners]] url = "https://gitlab.com/" token = REGISTRATION_TOKEN executor = "docker" [runners.docker] tls_verify = true image = "docker:19.03.12" privileged = false disable_cache = false volumes = ["/certs/client", "/cache"] runtime = "sysbox-runc" Unfortunately there is a small wrinkle (for now at least): the GitLab runner currently has a bug in which the “runtime” configuration is honored for the job containers but not honored for the “service” containers, which is a problem since the DinD service container is precisely the one we must run with Sysbox. As a work-around, you can configure the Docker engine on the runner machine to select Sysbox as the “default runtime”. You do this by configuring the /etc/docker/daemon.json file as follows (pay attention to the default-runtime clause): { "default-runtime": "sysbox-runc", "runtimes": { "sysbox-runc": { "path": "/usr/local/sbin/sysbox-runc" } } } After this, restart Docker (e.g., sudo systemctl restart docker ). From now on, all Docker containers launched on the host will use Sysbox by default (rather than the default OCI runc) and thus will be capable of running all jobs (including those using Docker-in-Docker) with proper isolation. With this configuration in place, the following CI job runs seamlessly and securely: image: docker:19.03.12 services: - docker:19.03.12-dind build: stage: build script: - docker build -t my-docker-image . - docker run my-docker-image /script/to/run/tests With this setup you can be sure that your CI jobs are well isolated from the underlying host. Gone are the privileged containers that previously compromised host security for such jobs. GitLab Runner & Docker in a Container The setup for this is shown below. In this approach, I deployed the GitLab runner plus a Docker engine inside a single container deployed with Docker + Sysbox. It follows that the CI jobs run inside that container too, in total isolation from the underlying host. I called these “system containers”, as they resemble a full system rather a single micro-service. In other words, the system container is acting like a GitLab runner “virtual host” (much like virtual machine, but using fast & efficient containers instead of hardware virtualization). Compared to the approach in the previous section, this approach has some benefits: It allows the GitLab CI jobs to use the shell executor or Docker executor (either the DooD or DinD approaches) without compromising host security, because the system container provides a strong isolation boundary. You can run many GitLab runners on the same host machine, in full isolation from one another. This way, you can easily deploy multiple customized GitLab runners on the same machine as you see fit, giving you more flexibility and improving machine utilization. You can easily deploy this system container on bare-metal machines, VMs in the cloud, or any other machine where Linux, Docker, and Sysbox are running. It’s a self-contained and complete GitLab runner + Docker environment. But there is a drawback: For CI jobs that interact with Docker, the isolation boundary is at the system container boundary rather than at the job level. That is, such a CI job could easily gain control of the system container and thus compromise the GitLab runner environment, but not the underlying host. Creating this setup is easy. First, you need a system container image that includes the GitLab runner and a Docker engine. 
There is a sample image in the Nestybox Dockerhub Repo; the Dockerfile is here. The Dockerfile is very simple: it takes GitLab’s dockerized runner image, adds a Docker engine to it, and modifies the entrypoint to start Docker. That’s all … easy peasy. You deploy this on the host machine using Docker + Sysbox: $ docker run --runtime=sysbox-runc -d --name gitlab-runner --restart always -v /srv/gitlab-runner/config:/etc/gitlab-runner nestybox/gitlab-runner-docker Then you register the runner with your GitLab server: $ docker run --rm -it -v /srv/gitlab-runner/config:/etc/gitlab-runner gitlab/gitlab-runner register You then configure the GitLab runner as usual. For example, you can enable the Docker executor with the DooD approach by editing the /srv/gitlab-runner/config/config.toml file as follows: [[runners]] name = "syscont-runner-docker" url = "https://gitlab.com/" token = REGISTRATION_TOKEN executor = "docker" [runners.docker] tls_verify = false image = "docker:19.03.12" privileged = false disable_cache = false volumes = ["/var/run/docker.sock:/var/run/docker.sock", "/cache"] Then restart the gitlab-runner container: $ docker restart gitlab-runner At this point you have the GitLab runner system container ready. You can then request GitLab to deploy jobs to this runner, knowing that the jobs will run inside the system container, in full isolation from the underlying host. In the example above, I used the DooD approach inside the system container, but I could have chosen the DinD approach too. The choice is up to you based on the pros/cons of DooD vs DinD as described above. If you use the DinD approach, notice that the DinD containers will be privileged, but these privileged containers live inside the system container, so they are well isolated from the underlying host. Inner Docker Image Caching One of the drawbacks of placing the Docker daemon inside a container is that containers are non-persistent by default, so any images downloaded by the containerized Docker daemon will be lost when the container is destroyed. In other words, the containerized Docker daemon’s cache is ephemeral. If you wish to retain the containerized Docker’s image cache, you can do so by bind-mounting a host volume into the /var/lib/docker directory of the container that has the Docker daemon inside. For example, when using the approach in section A Secure DinD Service Container you do this by modifying the GitLab runner’s config ( /etc/gitlab-runner/config.toml ) as follows (notice the addition of /var/lib/docker to the volumes clause): [[runners]] url = "https://gitlab.com/" token = REGISTRATION_TOKEN executor = "docker" [runners.docker] tls_verify = true image = "docker:19.03.12" privileged = false disable_cache = false volumes = ["/certs/client", "/cache", "/var/lib/docker"] runtime = "sysbox-runc" This way, when the GitLab runner deploys the job and service containers, it bind-mounts a host volume (created automatically by Docker) into the container’s /var/lib/docker directory. As a result, container images downloaded by the Docker daemon inside the service container will remain cached at host level across CI jobs. 
As another example, if you are using the approach in section GitLab Runner & Docker in a Container, then you do this by launching the system container with the following command (notice the volume mount on /var/lib/docker ): $ docker run --runtime=sysbox-runc -d --name gitlab-runner --restart always -v /srv/gitlab-runner/config:/etc/gitlab-runner -v inner-docker-cache:/var/lib/docker nestybox/gitlab-runner-docker A couple of important notes: By making the containerized Docker’s image cache persistent, you are not just persisting images downloaded by the containerized Docker daemon; you are persisting the entire state of that Docker daemon (i.e., stopped containers, volumes, networks, etc.) Keep this in mind to make sure you CI jobs persist the state that you wish to persist, and explicitly cleanup any state you wish to not persist across CI jobs. A given host volume bind-mounted into a system container’s /var/lib/docker must only be mounted on a single system container at any given time. This is a restriction imposed by the inner Docker daemon, which does not allow its image cache to be shared concurrently among multiple daemon instances. Sysbox will check for violations of this rule and report an appropriate error during system container creation. Conclusion If you have GitLab jobs that require interaction with Docker, be aware that these jobs have root-level access to the host on which they run, thus resulting in weak isolation and compromising the stability of your CI infrastructure and possibly beyond. You can significantly improve job isolation by using Docker in conjunction with the Sysbox container runtime. This article showed a couple of different ways of doing this. I hope you find this information useful. If you see anything that can be improved or if you have any comments, please let me know!
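As a small illustration of the cleanup advice above, a job can prune the inner Docker state it does not want to persist once its steps finish. This is only a sketch built on the article's earlier .gitlab-ci.yaml example; adjust the prune flags to whatever your pipeline actually needs to keep.

image: docker:19.03.12
services:
  - docker:19.03.12-dind
build:
  stage: build
  script:
    - docker build -t my-docker-image .
    - docker run my-docker-image /script/to/run/tests
  after_script:
    # Remove stopped containers, dangling images and unused networks so they
    # are not persisted in the inner Docker's /var/lib/docker volume.
    - docker system prune -f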
https://medium.com/nerd-for-tech/securing-gitlab-docker-ci-pipelines-56513f257806
['Cesar Talledo']
2020-11-09 07:41:52.793000+00:00
['Continuous Integration', 'Testing', 'Gitlab', 'Docker', 'DevOps']
There’s a difference between short-term pleasure and long-term satisfaction
For most of my adult life, I was seeking the quick fix — instant travel, moving on from one place to another in a heartbeat, parties, adventures, no commitments. Although, all I was really looking for was connection and stability. I didn’t realize then that I was on the wrong path to get it. For example, I was disappointed by others if they didn’t keep in touch when I was the one constantly hopping in and out of their lives. Don’t get me wrong, I’m not regretting anything; I love every bit of how my life has unfolded and stand behind every decision and lesson learned. What I only understand now, though, is that building a home and stability is just as risky as packing your bag, selling your stuff, and traveling the world. Not just in the physical sense, that’s the easier part, but in your mindset. Now, this is where it gets tricky because your mind creates a virtual security barrier to protect you from anything unknown that could potentially be dangerous to you. But you know what’s more dangerous? Your habit of staying in your comfort zone. So if you’ve gotten used to being on the road, constantly traveling, choosing a place to settle suddenly seems like the hardest decision in the world because you’ve found your comfort in being able to move on whenever you feel like it, no commitments. I was always looking for the perfect reason and conditions to “stay”, but to be honest, these conditions don’t exist. You need to create them willingly. It takes effort and dedication. You may find a place you like enough, with good people, and things that matter to you — and that’s about all you can wish for. From there on, it lies in your hands to build the life you’re longing to live. You can travel the globe and still not be satisfied with the options you’re presented with. I’ve been there. So believe me when I say, absolutely nothing is ever perfect. But here’s the good news: you can make it pretty damn awesome if you put in a little effort. The place, your work, your friendship circle, your partnership. You just need to stop overthinking and start moving. What are you waiting for? Nobody is gonna come and pick you up or show you the way. Not even the dreamy paradise beach will bring satisfaction if you don’t make decisions that will aid you in the long run. What has made a big difference for me was making sure to have my values in place and choosing to surround myself with people that I truly like rather than people I’m hoping to be liked by. There’s nothing to win from that and it will never make you happy in the long run. Find a partner that’s willing to discuss with you, support you, respect you and lets you be who you are and vice-versa. And those are also the kind of friends you should look for. Seeking approval from someone you admire for the wrong reasons doesn’t contribute to your happiness in any way. It only takes your peace away. And above all, get to know and take care of yourself. You think you already know everything about yourself? Dig deeper and you’ll be surprised what you uncover. Find a self-care routine you like and stick to it. It may be hard to push through in the short run, but I promise you, the long-term results will be undeniable. I tell myself every day: stop making excuses Vanessa, let’s just go. Some days work better than others, but hey, a little progress every day goes a long way. Don’t wait for the perfect moment; the time to get started is always now. And if you don’t know how to get started, get in touch and we’ll figure it out together. 
It’s easier than you think; you only need the courage to get started. Vanessa Find me here: hello@decodewithsound.com www.decodewithsound.com www.instagram.com/vanessa.decodewithsound
https://medium.com/@vanessa.naumann/theres-a-difference-between-short-term-pleasure-and-long-term-happiness-6c4bdf56a82
['Vanessa Naumann']
2021-05-04 17:33:51.110000+00:00
['Courage', 'Self Growth', 'Self Care', 'Mindset', 'Values']
Compartmentalize & Context Switch Like A Pro
Like it or not, we are constantly multi tasking at work. You might be in a meeting and a Chat message pops up communicating a piece of news, distracting you from your meeting and getting you thinking about the piece of news you just received. Or you wrap up a difficult meeting but the bitter taste from the meeting lingers a bit longer. Life is a series of mini-transitions like these. If these transitions are not managed well, they have the real potential to derail you and cause more damage to your brand and reputation than you can anticipate. I ran into this and it took me a painful time to discover the unfortunate side effect I had caused, because I was unable to recognize and make the transition to the next meeting. Let me share my story. Photo by MARK ADRIANE on Unsplash I had finished a particularly contentious meeting. It was a heated discussion, and I didn’t like where the direction was going. That meeting ended, and I walked to the next conference room, still simmering and stewing about what happened. My next meeting was a 1:1 with someone who was based in a different location, in a different country, and was visiting our office for work. Let’s call her Holly. Holly had made the time to reach out to me and talk through a few agenda items. Unbeknownst to me, I later realized, I was not warm, I was not smiling and just engaged with her in a matter of fact way. I offered lukewarm support for her problems vs leaning in and helping. The meeting ended as scheduled, and I went on with the rest of my day, I didn’t think about the just concluded 1:1. A few months later, one of my co-workers approached me and asked why I didn’t like Holly. I was shocked! I tried to think about my interactions with Holly — I had not met her that often and in the few times I had seen her work, I was actually impressed with how thoughtful she was and it showed in her work. I felt I had complimented her positively in emails as well, so I was at a loss for where this misconception was coming from.I started to dig in and understand — soon it became clear that my lack of warmth in that 1:1 was the root cause. If this co-worker had not approached me and asked me, I would have never known and wouldn’t have had an opportunity to correct the erroneous notion. I reached out and sorted things out positively with Holly, but it left a lasting impression. These mini events that we experience throughout the day are important to manage, and we need to reset and start every new meeting or interaction with a positive mindset. Easier said than done, right? Here is what I have tried to do, to hit reset and start every meeting afresh. Have meetings snap to the Pomodoro technique: A Pomodoro (Italian for tomato) is a time management and focus technique that requires you to focus on one task uninterrupted for 25 minutes and then take a 5 minute break. If you are hosting or running the meeting, keep the agenda to 25 minutes so you can use the 5 minutes as a break to reset your mind before the next meeting or activity. Say nothing or stay neutral until you have had time to process It is tempting to fire off messages — digitally or verbally to someone as a way of blowing steam after a bad meeting. Or someone might ask you something innocuous like “what did you think of that meeting?”. Resist the temptation to speak your mind. I use stock responses like “Interesting meeting, definitely needs follow up and more work to be done”. 
Getting 5 minutes to transition between meetings might be a luxury — in these days of video meetings, switching meetings happens a lot more quickly than the previous walk to a conference room. So have a quick reset routine that is easy to implement in 30 seconds or less. Here are my go-tos as examples. Breathing exercises — a quick breathing exercise can quickly reset and pep up your mood. Draw something. I draw an airplane (I am good at it) with my pencil as a go to thing. It is quick, I draw it well and helps me calm down. Watching a funny meme on my phone and in some cases, saying a quick prayer. It breaks the “overthinking loop” and gets me moving in a positive direction. Fix yourself a drink or snack — it switches your focus. Clean up or organize your desk. A quick clean up has you more organized and gives you a great way to feel more productive. List of quotes — I have “pick me up” or go to quotes that help me snap out of my current thinking and look ahead. All we need is a reminder. One of my favorite quotes is “There are only 2 rules in life.” Rule 1: Don’t sweat the small stuff. Rule 2: It’s all small stuff. Smile to yourself before you start your next meeting. Before you walk in through the conference room door, or click join in your video conference meeting, just smile. It helps. For video meeting pointers, please read this. I hope you find these techniques useful. Stay positive and build your brand at work by managing these micro-events in your life.
https://medium.com/@karthikln/3-techniques-to-handle-negative-events-at-work-8373196311f8
['Karthik Lakshminarayanan']
2021-02-14 21:18:05.934000+00:00
['Positive Thinking', 'Workplace']
‘Wonder Woman 1984’ grabs $38.5 million overseas
This image released by Warner Bros. Entertainment shows Gal Gadot in a scene from "Wonder Woman 1984." The superhero sequel earned an estimated $38.5 million in ticket sales from international theaters, Warner Bros. said Sunday, Dec. 20, 2020. (Clay Enos/Warner Bros. via AP) The superhero sequel "Wonder Woman 1984" has earned an estimated $38.5 million in ticket sales from international theaters, Warner Bros. said Sunday. The film starring Gal Gadot started its rollout abroad last week, opening in 32 markets including China and playing on upwards of 30,000 screens. The studio said admissions totaled over 6 million and that the largely positive reviews bode well for its future. Most of the earnings came from Chinese theaters, where it earned an estimated $18.8 million. It wasn't enough to take first place in the country, however — that honor went to a local release. "Wonder Woman 1984" won't open in U.S. theaters until Christmas Day, when it will also debut on HBO Max. The pandemic has forced studios like Warner Bros. to embrace unconventional release plans to get films out to audiences. Originally, "Wonder Woman 1984" was supposed to open in theaters worldwide this summer, but its release date kept getting pushed back. The first film made over $821 million worldwide in 2017 and, absent the pandemic, the hope was that the sequel, which cost around $200 million to make, would even surpass that total. Warner Bros. has tried a few different strategies with its films this year, including releasing "Tenet" in theaters and sending "The Witches" straight to HBO Max. Finally last month the studio decided to embrace a hybrid release for "Wonder Woman 1984." "Tenet" opened internationally first as well, taking off with $53 million from 41 markets in late August.
https://medium.com/@bmohan-sanjeewa-o/wonder-woman-1984-grabs-38-5-million-overseas-1204cbd2057d
['Bmohan Sanjeewa O']
2020-12-22 04:39:12.598000+00:00
['Entertainment', 'Artist', 'Beauty', 'Fans', 'Startup']
Positive Juice: Something Right About Being Wrong
"Error". You might have already encountered this word, specifically when your strict college professor was checking the statistical data of your thesis results. A memory you just want to forget, right? Don't worry, this is not a scientific paper. But you know, in research, errors are normal. As a matter of fact, having no errors is relatively unusual, suspicious, and sometimes — even funny. Being absolutely accurate about something can be quite difficult to achieve, and in some cases, IMPOSSIBLE. So why am I talking about errors? A few days ago, I was scrolling through my social media feeds and came across a post saying, "One of the worst things that could happen in the middle of an argument is when you realize that you are actually wrong." A bit funny, right? Nobody wants to be wrong. Even if one already realizes that, he/she will find clever ways to justify that he/she is right. Whatever it takes. Some are talented like that. You might have done this at one point as well. No matter. It does not change the fact that, sometimes, we can be absolutely wrong. Or, in other words, whether you like it or not, we just can never always be right. So, why are we so afraid of being wrong, when it is normal? I will give you some possible reasons why. 1) We associate it with failure. 2) We start thinking that we might have some mental defects. 3) We are afraid of being judged in this "perfect" world. 4) We strive for perfection with our intergalactic standards. Who is guilty of some of these? No one? Maybe it is just me, then. Bottom line, being wrong feels TERRIBLE. And at times, we regard it as something permanent. Since childhood, every time we get a poor score on some test, it makes us feel defeated, ashamed, and insecure. I mean, you cannot even stand to look at it for too long, like your fear of looking at people with sore eyes. You just don't need that kind of negativity, right? So, you crumple it and, before you know it, it's travelling in a projectile motion until gravity finally rests it on — surprise, surprise — the GARBAGE CAN. We just throw it away and hope our depressing feelings will be gone with it as well. To some extent, that piece of paper sometimes even serves as our identity. We are being judged by numbers or, for others, letters. And so, we hide it together with our broken confidence. You see, we are programmed to do things perfectly. There is nothing wrong with that. But how about being programmed to handle otherwise as well? We also need that. I am not a psychologist, therapist, or some behavioral expert, but I can relate to what you are feeling simply because we are just the same — people who just want to make ourselves and our lives perfect and happy, whatever that means for every one of us. We want to think and do things right all the time. This fear of messing up that initially drives us to perform and be the best version of ourselves may go the other way and lead to egoism and arrogance instead, if we are not careful. Anyway, what I can share with you, as someone just like you, is that I have been in that position too, several times, with various people. Being exposed to the research environment gives you the privilege of meeting brilliant minds. Intimidating but, brilliant. It is just not hard to be in a position where you can be wrong. And it took several wrongs for me to actually feel right about it. Ironic, isn't it? Forgiving yourself for falling short of your own expectations and others' can be so difficult and depressing.
But, setting aside all the negatives for now (unless you want to keep them), let's talk about the positive juice of being wrong. Being wrong allows us to find again within us a sense of humility. It allows us to improve ourselves through every correction we welcome, so we can become better and wiser individuals for the next millisecond. Because for every wrong, there is a corresponding right. Thus, logically, every time we're wrong, we can subsequently gain something right, right? It is something we can take with us, but only if we allow it. It can be quite challenging to see the silver lining in the clouds of fear but, at least, being wrong is not something we experience alone, if that provides any consolation. Just like the song lyrics say, "Everybody makes mistakes. Everybody has those days." The moment you realize you're apparently wrong, you should probably just say it. If you can joke about it, then you've just become the coolest person ever. Nothing messes things up more than trying to prove you're right when everybody already knows you're wrong. It is time to listen. Committing mistakes is inevitable and universal. There are mistakes we can easily admit and some we admit only with difficulty. But the fact that you are open to it already says something exquisite about you, because thinking you are right all the time can be dangerous too, not to mention annoying. Do you agree? Your inability to receive new information may cause people to lose interest in talking with you, because you will probably make them feel as if their beliefs, no matter how true and sensible they are, are worthless. Imagine that with someone who thinks they are right all the time. You see, admitting you're wrong requires an act of bravery, and not everyone can do it. It takes a great deal of honesty, maturity, and humility to accept and utter these rather simple but beautiful words: "You're right. I was wrong." Not only are you able to acknowledge other people's contributions, but you also allow yourself to become a better human being. Just as another popular saying goes, "You learn nothing from life if you think you're right all the time." Lastly, before I end this, I believe that our hesitance to admit we're wrong stems from the struggle to maintain our already wavering self-confidence. But I also believe that true self-confidence isn't measured when everything is right. It is when something is wrong but you are able to receive, reflect, and renew your understanding with dignity, for you know that accepting that temporary state of wrongness would lead to a permanent state of rightness. Stay classy. "…accepting that temporary state of wrongness would lead to a permanent state of rightness."
https://medium.com/@my.self.ish.thoughts/positive-juice-something-classy-about-being-wrong-868af9509314
['M.B. Saniel']
2019-10-19 04:16:17.260000+00:00
['Self Improvement', 'Positive Thinking', 'Life Lessons', 'Positivity', 'Life']
I’ve Moved 4 Times In 4 Months During A Pandemic & I Nearly Fell Apart
By Vicky Carter PHOTOGRAPHED BY MEG O’DONNELL. When you’re a renter, moving house is exhausting any time of year but moving four times in four months during a pandemic nearly broke me. I am not alone. Renting has long been a traumatic experience for many but this year the global coronavirus pandemic has really brought home just how much uncertainty and instability we private renters face. According to a survey conducted by the housing charity Shelter in June, nearly a third of renters — that’s 2.7 million adults — said they felt more depressed and anxious about their housing situation than ever, and the same number said they were having sleepless nights. Anyone who has ever viewed properties in London (where I live) knows that it is the eye of the storm. With London being the sixth most expensive city in the world to rent in, not only are you bidding against a sea of interested and equally frantic people, you’re also bidding on an overpriced room that, if you are lucky, has average connections to public transport. Housing stress is real at the best of times. It is defined as the experience of unstable or unaffordable housing. The Health Foundation states that “it is important for our health and wellbeing that our homes provide our needs, make us feel safe and allow us to stay connected to our community. Experiencing housing insecurity, including unaffordability, short and unstable tenancies, and overcrowding can also have a negative impact on our health.” But doing this right in the middle of a global pandemic reached new levels of stress for me. I would look at every surface and fear that the deadly virus could be lingering there. What if I picked it up? Worse still, what if I had no symptoms and infected my vulnerable mother or elderly neighbour? On top of that, there’s now so much more to think about. Is there enough room to fit a desk for me to work from home? Will I get on with the housemates if we face another lockdown? Will they abide by the rules and not risk my health by breaking them? I would look at every surface and fear that the deadly virus could be lingering there. What if I got it? Worse still, what if I had no symptoms and infected my vulnerable mother or elderly neighbour? My first move in June was from somewhere I had been living for a year which was easy to commute to work from. Although the house was gorgeous, the ridiculously high rent coupled with a neighbour who harassed me throughout the first lockdown in March made me want to tear my hair out. As soon as my role changed and I could work from home, I looked at properties that were closer to friends and family. I chose a place that was conveniently on the same street as a few of my friends. Sold by its large rooms and cheap rent, I moved in. Despite having only two months left on its original tenancy, I was told by the housemates that we would renew the tenancy once it was up. I moved in and began buying houseplants, putting up pictures and making my room feel as homely as possible. It has always been important for me to be able to create my own space; to turn where I am living into a sanctuary so that I can rest in a safe environment. Integrative therapist Abbey Robb tells me that we cannot underestimate the importance of the human need to nest. “The ability to adapt work and living spaces to suit changing demand is an important part of personal autonomy,” she explains. 
“In a year where so many things have been taken out of people’s control, being able to rearrange and reorganise spaces to make them suitable to live and work in gained more importance than normal.” Sadly, a month into the tenancy, I was told by two of the housemates that they would be leaving in one month when the tenancy was up. Clutching at straws, I began to search for individuals who could replace them and fill their rooms. Two weeks later, 10 viewings had fallen through and my landlord told me I had a fortnight to find somewhere new to live. It was now August. I was in a constant state of adrenaline and anxiety. I would wake up at 5am, wide-eyed with stress and surrounded by my life packed up in boxes, searching endlessly on my phone for suitable properties. Notifications from potential flatmates and letting agents would disturb me throughout the day while I was at work. Thinking I needed to reply straightaway to have a chance of moving in, I would be organising new viewings constantly. After work I viewed properties, one after the other. It was a cycle of fake smiles, trying not to touch anything, repeating the same questions over and over again before passing out late at night, exhausted, with a takeaway next to me. Soon the 21 viewings merged into one as I tried to fight off an impending breakdown. After work I viewed properties, one after the other. It was a cycle of fake smiles, trying not to touch anything, repeating the same questions over and over again before passing out, exhausted. I moved into the next property — another house share — just as my old tenancy was up in August. It seemed to be a right fit but as I slowly unpacked my life and days passed, I realised that in my desperate and sleep-deprived state, I hadn’t noticed that one of the walls was missing. Where it should have been was a set of glass doors separating my room from the living room, hidden behind a pair of curtains. This meant I had little privacy. I called the estate agents out. They said I had to leave if I wasn’t planning to sign the contract. I had 12 hours to pack up my life once again. Three car trips and six hours later, I had crammed my possessions (now including a new desk and overgrowing plants) into my mum’s flat. Although she was delighted for me to be home, at the age of 28 I felt I had taken two steps forward and three steps back. I looked out of my bedroom window at the sleepy town I had returned to in the southwest of England and immediately felt claustrophobic. With my mum being a vulnerable person who needed to shield, I isolated myself from her for two weeks. Although living rent-free was wonderful, I felt overwhelmed with concern for her health and the psychological impact of losing the freedom and autonomy I had just weeks ago was taking its toll. One negative coronavirus test later, I continued the house hunt, albeit this time predominately virtually. I was able to be more specific in my choices as there was no clock ticking down against me. Eventually, in September, with my mental health hanging by a thread, I found somewhere which had no creepy neighbours, a signed tenancy of at least six months, access to green spaces, and all four walls. Now, as I write this on my new desk, surrounded by my overgrowing plants in a house with charming Irish nurses and their sassy cat, looking back I realise how unwell I was. My body was constantly running on adrenaline for an entire four months of 2020. My mental health, my wellbeing and my working life have all benefited from finding stability. 
For the first time since living in London, I feel like I live in a home rather than a house. Some things are worth waiting for but one thing has been made clear to me through this experience: the constant revolving door that is renting has a massive impact on people’s mental health and I wish there were more awareness of that.
https://medium.com/refinery29/ive-moved-4-times-in-a-year-i-nearly-fell-apart-4d9d3c6dfe20
[]
2020-12-27 16:27:33.161000+00:00
['Housing', 'Pandemic', 'Renting', 'Moving', 'Covid 19']
The Secret to Writing Texts That Sell
There’s a ton of advice out there on how to write “selling texts.” What we are offering you is not just another scheme or a set of guidelines. We will tell you what works in practice, and why it does. It’s very simple, but it can be revelatory for you — just as it was for us. Still, we do have to start with schemes and guidelines. This will help you understand which paragraph contains the information that makes sure your text sells. So let’s have some theory, but not too much: we don’t want to bore you. A selling text is basically advertising copy built according to a specific algorithm. It showcases the advantages and benefits of the product (or service), helping the customer solve their problems and ease their pains. A selling text turns a random visitor into a client. Selling text algorithm Stop! The headline grabs the reader’s attention and makes them want to learn more about the subject. 2. Emotions The part of the text that contains the most important focus: the customer’s problem and “pain,” as well as ways of solving it. This should be presented in a way that triggers an emotional response. 3. Logic Logically laid out reasons why your client needs to solve his or her problem. 4. Facts & proof The part of the text that demonstrates and proves the necessity of purchasing the product or service. 5. Forestalling objections The text that foresees and overturns any objections the client may potentially have. 6. Guarantees A promise of excluding the risks. 7. Limited options Restrictions such as “Discounts effective only through December 25,” “Stocks are limited,” and similar tricks telling the customers to hurry up. 8. Call to action The final chord of the text instructing the client what exactly they have to do. Selling text formula The basic formula of the marketing copy has been unchanged since 1898. It’s the widely known AIDA model: In theory, it looks nice and simple: you see, you desire, you buy. (Or you write a selling text, publish it, and wait for the customers to show up in droves.) Piece of cake, right? Not really: in practice, lots of people just go to the website, skim through your immaculately written text and… leave. So what’s wrong? And can you do anything about it? Before you start calling all the marketing experts, copywriters and analysts, make sure you haven’t missed the most important thing. Visualization People naturally tend to “try on for size” everything they encounter in the world around them. If we can’t picture something, we ignore it, as it doesn’t really make a dent in our thinking and doesn’t affect us emotionally. Reread your text carefully. Does your description of the offer plant an image in the reader’s mind? Can the reader picture him- or herself owning your product? Here’s a good example. The same watch is offered by two similar online stores at the same price. Here’s the first offer: And here’s the second one: The second store sells almost twice as many watches as the first one! Why? Because when a customer sees the hand wearing the watch he visualizes his own image, picturing the watch on his own hand. And it looks good! The same watch in the first photo doesn’t trigger these feelings. People still buy it, but mostly because they have already seen someone else wearing it or because they’ve encountered a more visual ad. But what if your product is new and not yet popular? And no, this is not about adding a photo to your text. This is about the text itself. 
If you can make it graphic enough so that your customers can picture themselves enjoying your product, a high conversion rate is practically guaranteed. Like a smart salesman’s speech, a good selling text is full of vivid, convincing imagery. Listening to it or reading it, customers can literally see “before” and “after” pictures of themselves in their mind. A visual image pushes people to make the decision. Research by Wolfgang Köhler, one of the founders of Gestalt psychology, has shown that people form associations and images even if they don’t understand the gist of what they’re being told. We constantly create pictures in our minds. A selling text is a tool that helps customers see the right picture, one that’s desirable and attractive (or revolting, as the case may be). Images form at the D level of the AIDA pyramid. Or they don’t. All the proclamations of “Hurry while stocks last,” “Time-tested quality,” “We care about you,” and so on will leave your audience cold and uninterested unless you manage to create an attractive image. Conversely, if you’ve been successful in creating a vivid image and the customers have “tried it on,” there’s a lot they’ll be willing to overlook in your text: typos, filler passages, and other assorted sins. Nothing will stop them from getting what they want if you’ve managed to implant a craving for it by using visualization. Creating images There are some selling propositions that don’t seem, at first glance, to be conducive to vivid imagery. For instance, a store that sells auto parts or components. But even in this case, it’s a good idea to use images. “Our bearings roll like cheese in butter” may sound ridiculous, but it’s much more effective than “Our bearings are the best.” Even a silly-sounding advertisement can be a memorable attention grabber. Emotional language To make someone interested and willing to go the distance, you must affect their emotions. An emotional response is instantaneous and intuitive, with logic and reasoning always lagging behind. Only strong verbs, associations, and metaphors can carry an emotional charge. Adjectives are pointless filler that will only drag your text down. Sensual language This is a universal language that’s understood by nearly everyone. It’s based on associative thinking and can always be counted on to work unless you overdo it. A good example is the Woodbury Soap Company’s slogan: “Skin You Love to Touch.” Coupled with a touching picture of two young people locked in an embrace, it was a brilliant marketing ploy. Soap sales rose tenfold. Sensual language should be used appropriately and sparingly because vulgarity doesn’t help sales. Vulgar language may attract 10 people but scare away 100. Logical language You might think it’s about being logical in your wording. While that’s an obvious requirement, it’s not what we mean here. A selling text should engage the reader logically. It should be logically convincing to make them do what you need them to do. Target audience language A selling text is written for a specific target group (core audience). And every group speaks its own language. Before you start writing your copy, take some time to study your audience, its problems, desires, ambitions, and customs. Otherwise, you will be neither heard nor understood, even if you follow all the rules for writing “texts that sell.” This also applies to the image you’re creating. If you know your target audience well, you won’t have any trouble creating an image that speaks to it directly. 
Keep in mind, for example, that a female audience is more emotional while a male audience is more logical. What’s exciting to young people may be shocking to the elderly. And so on. The image you’re creating should be based on marketing analysis, not on your writing ambitions. Only then will you get a gratifying response from your audience. This is a key point, so beta testing would be a great idea at this stage. A well-chosen image gets you high conversion rates and vice versa. This is the most important part of copywriting. You don’t need to be a good writer to write a selling text. But it’s essential to get a measure of your audience and offer it exactly what it wants. A selling text is not an end in itself or a collection of pretty words; it’s a tool for creating a bond with your specific target audience. Highlighting advantages Every target group has its own values, desires and ambitions, which means they have different motives for performing the desired action. A middle-aged conservative man will not be interested in things that fascinate a young and curious novelty seeker. So the text should highlight different advantages depending on the specific audience. Understanding your target audience’s values and desires helps you offer specific advantages to each group. How to Improve Conversion Rate? Secrets of Neurolinguistics Reviews When choosing reviews of your product, it pays to follow the same visualization principle. Some people confuse emotional reviews with graphic ones. “Wow, these shoes are so cool!” is emotional — and also completely useless. “My feet don’t get tired in these shoes, even though I’m a courier and have to walk a lot” is much better. Here, we have a clear image that anyone can appreciate by imagining themselves wearing uncomfortable shoes. Graphic reviews always elicit a better response from users. Be reasonable Being reasonable means you don’t have to blindly follow the rules. Your business may be special. Textual visualization will be pointless if you need to describe the product’s technical specs. In this scenario, your audience expects concrete facts and numbers. So, describe the product’s features using logic and proof instead of pictures. Creating mind images and pictures should be appropriate. In this case, a photograph of your product accompanied by a list of technical specs is the best image. Text placement When writing your selling text, remember it will be broken down into blocks and thus read in separate chunks. It’s essential to maintain a logical connection between the chunks/blocks to ensure uninterrupted flow of information. Here’s a sample text placement in blocks: Whatever template you use for your selling text, the copy itself must maintain internal logic, structure, and consistency. Namely: Headline A striking, attention-grabbing, informative headline should ideally showcase the advantages of your offer. Lead This is the introduction that contains triggers to instantly interest and intrigue the user. Offer (main body of text) An enticing offer with rich imagery. Reasoning and forestalling objections. Price, guarantees, incentives. This is typically a psychologically unified block visually divided into chunks. Create an image that can be owned! Limitations / Bonuses The information that your offer is valid for a limited time only should encourage the customer to make the decision. Incentivizing bonuses, discounts, and giveaways should also go in this block. Call to action An explanation of what steps the user has to take to get what they want. 
The layout problem Unfortunately, a bad template, i.e. a poor website layout, can be very detrimental to the content. Always try to place the text in keeping with the optimal sequence, and make sure designers and developers are on the same page. Ideally, the content should be created before design and development, but this is not always viable in practice. So you’re forced to fit your text into an existing layout. Bad zoning hampers readability. An unprofessional design does not take text readability into account, which can make even the best-written selling text a commercial failure. A text can be ruined if its logical structure is broken up to fit the design, if the font is unreadable, or if there are distracting design elements. Summary
https://medium.com/outcrowd/the-secret-to-writing-texts-that-sell-a3600eca9425
[]
2020-10-15 08:04:41.068000+00:00
['UI', 'Internet Marketing', 'Selling', 'Marketing', 'Web Design']
How Can Text Marketing Help Your Business in a Post-Pandemic World?
How Can Text Marketing Help Your Business in a Post-Pandemic World? Pandemics threaten the global economy and kill some businesses. For instance, many companies laid workers off or went bankrupt a few months after the declaration of COVID-19 as a pandemic. One reason is that COVID-19 control measures discourage human-to-human contact, reducing foot traffic in stores. For businesses to survive pandemics, they have to change their mode of operation and marketing strategies. Text marketing is an excellent way to build and personalize customer experiences in a post-pandemic world. We'll explain how it works. Personalized Conversations for More Sales Smartphone use is continually growing, with the average American checking the phone about 96 times a day. Theoretically, you have 96 chances to get in touch with customers through text marketing for business. This opportunity is enormous, but does using text marketing for business bring any unique benefits? Text messaging lets you foster relationships with consumers without sounding pushy. Those who have interacted with your brand are more likely to embrace the offers you present and eventually buy what you pitch. Use personalized text messages to build a rapport with your customers and sell your products through friendly conversations. 💬 Communication About Order Status In e-commerce, communication is vital. Clients can get apprehensive if they can't get information about their orders after buying products online. Text marketing for business is a perfect way to keep customers updated from the moment they purchase a product until they receive it. Once you receive an order, update the buyer at every stage of the fulfillment process. Let them know when you ship, when they can expect it, and when it reaches their address or collection point. Additional alerts, like unexpected delays, are also crucial. The customer will rest easy knowing the product is on the way. ✅ Offers, Promotions, and Discounts Text marketing for business is a useful tool for integrating your physical shop with online stores. Through geofencing, you can send relevant and timely promotional text messages to consumers within a given region. Geofencing will help you target potential customers close to stores with active offers. Automated Sales and Reminders Companies that offer subscription services can utilize text marketing to alert customers about their renewal dates. You can use the same marketing strategy to remind consumers who buy particular products regularly to order replenishments. Lens Direct, an online retailer that sells contact lenses, has a marketing program that utilizes text messages. Customers can take pictures of the items they want to buy and place orders via SMS text messages. On the other hand, the merchant can send low-supply text alerts to customers when running out of stock. Feedback Via Text Marketing for Business An unquestionable way to gauge customer satisfaction is to ask the customers themselves. Text marketing for business gives you an ideal method to collect reactions. Run surveys asking your customers to comment on your service and rate it. Collecting feedback helps you to improve your products and services and shows clients that you care about them. They cultivate a positive perception of your brand if you implement or respond to their recommendations, heightening customer loyalty.
Improved Customer Service If your marketing strategies are deficient in customer service, you cannot achieve excellent customer experiences. Many customers don't call support desks, especially if they anticipate being kept on hold. Text marketing for business can enhance your customer service by allowing clients to send queries via text messages instead of calls. What's more, texting is affordable and quick, and senders can use it anywhere. It reduces the time for resolving issues, subsequently increasing customer satisfaction. 💙 Reliable Text Marketing Platform for Businesses Are you looking for a versatile text marketing platform to build customer experience and engagement? Are you keen on discovering more about how text marketing can help your business in a post-pandemic world? Pony Express HQ facilitates personalized marketing through SMS templates, MMS/picture text messages, scheduled alerts, custom auto-respond messages, cross-posting on social media, and so much more. Sign up for free today and get 100 bonus text credits! The Pony Express HQ Team
https://medium.com/@ponyexpresshq/how-text-marketing-help-your-business-in-a-post-pandemic-world-fc2f27d38983
['Pony Express Hq']
2020-12-26 07:50:03.835000+00:00
['Marketing Strategies', 'Covid 19', 'Sms Marketing', 'Business Strategy', 'Small Business Marketing']
Worms, Snail Divers, and Scar Tissue: Tackling the Mystery of Liver Fibrosis
Bob Shaban, 37, holds a handful of freshwater snails pulled from the banks of Lake Victoria in Uganda. These small snails often harbor schistosomes, parasitic flatworms that thrive in bodies of water contaminated by the feces or urine of an infected host. These worms can penetrate skin, leaving those who swim, bathe, or wash their clothes in contaminated water vulnerable to infection. Photo credit: WHO Lurking in freshwater rivers and waterways throughout the world are small worms, called schistosomes, that live with and rely upon specific species of aquatic snails for survival. Highly prized for food, these snails are often harvested by divers, exposing the divers and anyone else who bathes or washes clothes in the river to the worms' larvae. The larvae can easily penetrate skin, and even a short exposure to infested water can lead to infection. Infection by schistosomes is called schistosomiasis or bilharzia, and it is particularly prevalent in tropical and subtropical areas of the world, primarily in communities without access to safe drinking water and adequate sanitation. Schistosomiasis affects almost 240 million people worldwide, with more than 700 million living in endemic areas. Around 90% of those requiring treatment for schistosomiasis live in Africa.¹ Schistosomiasis can usually be treated successfully with a short course of medication called praziquantel that kills the worms. The treatment is unfortunately not readily available or sought after in many parts of the world, due to a lack of awareness.² Schistosomiasis affects almost 240 million people worldwide, with more than 700 million living in endemic areas.¹ Different types of schistosomes are present in different geographical regions, and drive different types of disease. Urogenital schistosomiasis is caused by Schistosoma haematobium and intestinal schistosomiasis by any of the organisms S. guineensis, S. intercalatum, S. mansoni, S. japonicum, and S. mekongi. Our study focuses on S. mansoni infections. Image credit: Nature Once an individual is infected, the worms mature in the bloodstream and lay eggs, which sets off a chain of events that can culminate in a type of permanent scarring of the liver, called fibrosis. Fibrosis, the laying down of an excessive extracellular matrix, is a body's healthy response to an injury — and is required to heal wounds. But when fibrosis does not resolve, the excessive build-up of scar tissue in any organ can contribute to an inability of that organ to function. Fibrosis can occur in many different types of disease, and across many different organs — with lungs, skin, kidney, heart, and liver being some of the most problematic. Liver fibrosis is the primary pathology in various forms of liver disease — alcohol-induced cirrhosis of the liver, non-alcoholic fatty liver disease (NAFLD), and non-alcoholic steatohepatitis (NASH), and is also associated with various viral infections of the liver (e.g. Hepatitis B, C). To date, no treatments have conclusively shown the ability to reverse or halt fibrosis of the liver. The life cycle of S. mansoni. Image credit: Nature To tackle this area of unmet medical need, Variant Bio has partnered with Bilhi Genetics, founded by Dr. Alain Dessein, a pioneer in the genetics and treatment of fibrosis. Together, we aim to identify genetic risk factors for severe liver fibrosis among a particular community in Uganda. The participants were recruited from two settlements in the West Nile region with a very high prevalence of schistosoma parasites.
They were recruited based on years of exposure to the parasite (bathing, fishing) and their livers were evaluated for fibrosis progression. But fibrosis does not impact everyone equally — why do some infected with schistosomiasis develop severe, debilitating liver fibrosis, while others do not? Are some individuals more genetically predisposed to fibrosis than others? Intriguingly, a subset of the population we are studying in Uganda seems to be resistant to rapid fibrosis progression, despite being infected by schistosoma parasites. The underlying biology of fibrosis progression is known to be similar in schistosomiasis and diseases such as NASH. Therefore, our hope is that the genetics of these resistant individuals will help us uncover why some appear to be protected from an ongoing fibrotic response. Our goal is to identify novel, large-effect size, genetically validated targets for liver fibrosis, which will allow us to develop new drugs for diseases such as NASH and NAFLD. Dr. Alain Dessein and his team have spent years working with communities along the banks of Lake Albert in Uganda and across the world, including in China, Sudan, Mali, and Brazil, setting up annual clinics to diagnose, treat, and monitor schistosomiasis. As he has noted, “Schistosomiasis, and the fibrotic response that accompanies it, is a profound problem in these communities. Understanding factors for genetic susceptibility to liver fibrosis will undoubtedly lead to more and better therapeutic options for them.” Variant Bio will apply our expertise in analyzing whole-genome and whole-exome sequencing data to the samples sequenced by the Bilhi Genetics team. By including individuals with similar levels of exposure to the parasite, who were recruited by Dr. Dessein and his team, we can identify those with rapidly progressing and severe fibrosis and compare them to those with slowly progressing fibrosis. Analysis of the genetics of these two groups will enable the identification of genetic variants that put individuals “at risk” of severe fibrosis, as well as those that protect from fibrosis. We expect these findings to help us understand some of the fundamental biology underlying the pathogenesis of liver fibrosis, and also to enable a drug discovery research effort, driven by human genetics. The common goal for Variant Bio, Bilhi Genetics, and the individuals taking part in the study is the development of better therapeutics for liver fibrosis, and better outcomes for the patients affected by this disease. As someone who has worked in drug discovery focused on fibrotic diseases for many years, the importance of research into fibrosis is difficult to overstate — up to 45% of deaths in the industrialized world can be attributed to fibrosis.³,⁴ We believe that understanding the genetics of schistosomiasis-induced liver fibrosis will direct us towards a cure not just for schistosomiasis, but for many different forms of fibrosis.
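To make the comparison of the two groups concrete, here is a hedged sketch in Python of the kind of per-variant case-control test such an analysis might start from. The variant, the allele counts, and the group sizes are entirely hypothetical; a real whole-genome analysis would involve far more careful quality control, covariate adjustment (age, sex, ancestry, years of exposure), and multiple-testing correction than this toy example shows.

from scipy.stats import fisher_exact

# Hypothetical 2x2 table for one variant:
#   rows    = carriers / non-carriers of the alternate allele
#   columns = rapid progressors ("cases") / slow progressors ("controls")
table = [[30, 10],   # carriers
         [70, 90]]   # non-carriers

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3g}")
# In a genome-wide scan, a test like this (or a regression adjusting for covariates)
# would be repeated across millions of variants, with p-values corrected for multiple testing.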
https://medium.com/variantbio/worms-snail-divers-and-scar-tissue-tackling-the-mystery-of-liver-fibrosis-d46f9faee45f
['Heather Arnett']
2021-09-01 16:11:37.036000+00:00
['Genetics', 'Uganda', 'Variant', 'Liver Fibrosis', 'Schistosomiasis']
Extreme Learning Machines
Extreme Learning Machines III Part III: Is it better? Well, it depends. As mentioned, ELM's main advantages are its very short training time, low training error, and better generalization performance. ELM also has the simplest algorithm, as we do not have to decide the number of hidden layers, the learning rate, and other hyperparameters. Despite being simpler, ELM can outperform other algorithms in terms of accuracy, precision, and recall. But ELM architectures mostly end up with a larger number of hidden nodes in the first layer, which increases test time. If your application does not demand a low training time but requires fast inference as its priority, then ELM should not be your first choice. For example, ELM has not performed well in real-time image classification. Comparison from source [3] Comparison with LSTM and HTM OR-ELM, used here, is a type of ELM algorithm for online recurrent time-series data. This experiment was done on predicting faults in a cloud environment. The metric used here is NRMSE, the Normalized Root Mean Squared Error. In Fig. 1(a), we can see that the ELM algorithm has better overall performance, as it has a lower NRMSE. Fig. 1(a) Prediction error for the 40 days, from source [1] Fig. 1(b) Prediction error when rapid changes of inputs occurred, from source [1] Time comparison: the time taken by the ELM algorithm is less than 10% of that of the compared methods, along with an overall lower error. Fig. 2 Comparison by (I) NRMSE, (II) MAPE, (III) computational time (in sec), from source [1] Comparison with Support Vector Machine and Random Forest The dataset is randomized and divided into three parts: full samples, half samples, and 1/4 samples. The full dataset consists of 65,535 samples, the half dataset includes 32,767 samples, and the 1/4 dataset consists of 18,383 samples. Accuracy, precision, and recall are used as evaluation metrics. The dataset is then split into 80% training and 20% testing. ELM performs better than SVM (Linear), SVM (RBF), and RF on the full data samples, whereas SVM (RBF) shows improved accuracy over RF and ELM on the half data samples. SVM (Linear) outperforms the other techniques on the 1/4 data samples. Fig. 3(a) Accuracy of SVM, RF and ELM The precision of ELM is better than all the others on the full data samples. On the half data samples, the precision of SVM (Linear) is higher than that of SVM (RBF), ELM, and RF, and SVM also shows better precision than ELM and RF on the 1/4 dataset. Fig. 3(b) The recall of ELM is better than that of all the other algorithms on the full samples and drops as the sample size is reduced. This indicates that SVM performs better on small datasets, while ELM outperforms the other approaches on large datasets. Fig. 3(c) Conclusion We can conclude that ELM performs better than the other algorithms given a large amount of data, and it shortens training time from days (spent by deep learning) to several minutes on the MNIST OCR dataset, traffic sign recognition, 3D graphics applications, etc. Along with that, ELM even performs better when there are rapid changes in the input data, as seen in Fig. 1. The only drawbacks of using the ELM algorithm are a somewhat longer testing time in some cases, and the need for a large enough dataset.
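For readers who want to see what the closed-form training step actually looks like, here is a minimal single-hidden-layer ELM sketch in Python/NumPy. It is only an illustrative sketch of the general technique described above; the synthetic data, the number of hidden nodes, and the sigmoid activation are assumptions chosen for brevity, not the settings used in the experiments cited here.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def elm_train(X, y, n_hidden=100, seed=0):
    rng = np.random.default_rng(seed)
    # Input weights and biases are random and are never trained.
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = sigmoid(X @ W + b)            # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ y      # output weights via the Moore-Penrose pseudo-inverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    return sigmoid(X @ W + b) @ beta  # linear combination of hidden activations

# Tiny usage example on synthetic data (assumed, for illustration only).
X = np.random.rand(200, 5)
y = (X.sum(axis=1) > 2.5).astype(float)          # a simple binary target
W, b, beta = elm_train(X, y, n_hidden=50)
pred = (elm_predict(X, W, b, beta) > 0.5).astype(float)
print("training accuracy:", (pred == y).mean())

Because the output weights come from a single pseudo-inverse rather than iterative backpropagation, training stays in the seconds-to-minutes range; the flip side, as noted above, is that a large random hidden layer makes each prediction more expensive.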
https://medium.com/datadriveninvestor/extreme-learning-machines-ef3b229d63c5
['Prasad Kumkar']
2020-08-04 18:40:16.805000+00:00
['Artificial Intelligence', 'Machine Learning', 'Data Science', 'Research', 'Extreme Learning Machine']
Emotion Detection from Hindi Text Corpus Using ULMFiT
Figure 1: Source- Impact Written by Ankit Singh, Dhairya Patel, and Kaustumbh Jaiswal Introduction Deep Learning has charged up the space of image recognition and speech processing for some time now. We are witnessing a similar trend in Natural Language Processing. Deep Learning for NLP was less impressive at first, but with the introduction of techniques like ULMFiT, ELMo, Transformers, BERT, etc., it has become an impact driver, yielding state-of-the-art (SOTA) results for common NLP tasks. Named entity recognition (NER), part-of-speech (POS) tagging, sentiment analysis, etc., are some of the problems where neural network models have outperformed traditional approaches. The progress in machine translation is perhaps the most remarkable of all. In this blog we will showcase a ULMFiT model and use it for Emotion Detection. ULMFiT is the technique of using transfer learning for text classification tasks. Let's begin! Transfer Learning Transfer learning is the technique of using weights from a pre-trained deep neural network and tweaking them a bit to suit our application. In other words, it is applying the knowledge of an already trained model to a different but related problem. Figure 2: Source- EverythingAi It is suited to applications having a small dataset and also reduces computation time. What is ULMFiT? ULMFiT stands for Universal Language Model Fine-tuning for Text Classification, a technique introduced by Jeremy Howard and Sebastian Ruder. It is a technique to incorporate transfer learning in NLP tasks. The USPs of ULMFiT are: Discriminative fine-tuning, Slanted triangular learning rates, and Gradual unfreezing. Discriminative Fine-Tuning Figure 3: Source- towardsdatascience Different layers of a neural network capture different types of information, so they should be fine-tuned to different extents. Instead of using the same learning rate for all layers of the model, discriminative fine-tuning allows us to tune each layer with a different learning rate. Slanted Triangular Learning Rates Figure 4: Source- ULMFiT The model should quickly converge to a suitable region of the parameter space in the beginning of training and then later refine its parameters. Using a constant learning rate throughout training is not the best way to achieve this behaviour. Instead, Slanted Triangular Learning Rates (STLR) linearly increase the learning rate at first and then linearly decay it. Gradual Unfreezing Gradual unfreezing is the concept of unfreezing the layers gradually, which avoids a catastrophic loss of the knowledge possessed by the model. It first unfreezes the top layer and fine-tunes all the unfrozen layers for 1 epoch. It then unfreezes the next lower frozen layer and repeats, until all the layers have been fine-tuned to convergence at the last iteration. For a detailed explanation of ULMFiT, we strongly suggest you go through this paper. Let's Code! Installation To run the code explained in the subsequent sections, make sure fastai version 0.7 is installed on your system. To install fastai, follow the instructions given here.
from fastai.text import *
import html
Getting Started We start by creating different folders for the classification and language models.
PATH = Path('')  # path to the data
CLAS_PATH = Path('emotion_hindi_clas/')
CLAS_PATH.mkdir(exist_ok=True)
LM_PATH = Path('emotion_hindi_lm/')
LM_PATH.mkdir(exist_ok=True)
Dataset The dataset is created manually as there's no pre-existing dataset for Hindi Emotion Detection. It comprises 5 labels: Angry, Happy, Neutral, Sad, and Excited.
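The fastai 0.7 calls above hide these mechanics, so here is a hedged, framework-level sketch in plain PyTorch of two of the three USPs: slanted triangular learning rates and discriminative fine-tuning. The stlr function follows the formula given in the ULMFiT paper, as does the divide-by-2.6-per-layer rule; the three-layer stand-in model and the iteration count are illustrative assumptions, not the exact configuration used for the Hindi corpus.

import torch
import torch.nn as nn

def stlr(t, T, cut_frac=0.1, ratio=32, lr_max=0.01):
    # Slanted triangular learning rate from the ULMFiT paper:
    # a short linear warm-up for cut_frac*T steps, then a long linear decay.
    cut = int(T * cut_frac)
    if t < cut:
        p = t / cut
    else:
        p = 1 - (t - cut) / (cut * (1 / cut_frac - 1))
    return lr_max * (1 + p * (ratio - 1)) / ratio

# Hypothetical three layer groups standing in for the AWD-LSTM encoder plus classifier head.
groups = [nn.Linear(400, 400), nn.Linear(400, 400), nn.Linear(400, 5)]
model = nn.Sequential(*groups)

# Discriminative fine-tuning: each deeper group gets a smaller base learning rate (lr / 2.6 per layer).
base_lr = 0.01
param_groups = [
    {"params": g.parameters(), "lr": base_lr / (2.6 ** (len(groups) - 1 - i))}
    for i, g in enumerate(groups)
]
optimizer = torch.optim.SGD(param_groups, lr=base_lr)
base_lrs = [pg["lr"] for pg in optimizer.param_groups]  # snapshot of the per-group base rates

T = 1000  # total training iterations (assumed)
for t in range(T):
    scale = stlr(t, T, lr_max=1.0)  # use the schedule as a multiplicative factor in [1/ratio, 1]
    for pg, b in zip(optimizer.param_groups, base_lrs):
        pg["lr"] = b * scale        # keep the per-layer ratios while following the STLR shape
    # The forward pass, loss.backward(), optimizer.step(), and optimizer.zero_grad() would go here.
    # Gradual unfreezing (the third USP) would additionally start with requires_grad=False on every
    # group except the last and unfreeze one more group per epoch.

Gradual unfreezing sits on top of this: freeze everything except the classifier head, fine-tune for one epoch, then unfreeze one more group per epoch until the whole network is trainable.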
https://medium.com/saarthi-ai/emotion-detection-from-hindi-text-corpus-using-ulmfit-a151127581b7
[]
2019-03-04 11:56:51.332000+00:00
['Deep Learning', 'Sentiment Analysis', 'Text Classification', 'Developer', 'Naturallanguageprocessing']
Hot Pot Fund Weekly Report NO.18
Hi, all hot pot fans: Welcome to the 18th weekly report of the Hotpot Fund. We will continue to output the project’s operation and technology development progress for you, hoping to help you understand the first-hand market dynamics and better Defi investment decisions. NO.6.28-NO.7.4 Data Section(Monday) Operation Section The currency price rose slightly last week. Last Monday, the total present value of the fund was $55,4041.97. As of last Friday, the total present value of the USDT fund pool had risen to $565,913.39, with a yield of 8.17%, an increase of 2.27 points from the previous week. The advantages of users investing in DeFi projects in the hot pot fund will become more and more prominent over time. In the second half of the year, the hot pot fund will obtain higher value and income for everyone! Figure 丨 Uniswap V2 fund pool data last Friday Hotpot Fund V2 will be launched this month. Due to frequent incidents of recent DeFi projects being attacked by lightning loans, the new version deliberately introduces oracles to further upgrade and optimize the security of hotpot funds. In this way, everyone does not have to worry about the safety of their funds. Figure 丨 Interpretation of the main points of the Hot Pot V2 white paper In Hotpot Fund V2, the user’s capital utilization rate has been improved, and at the same time there is a chance to earn more income. At present, there are more and more DeFi investment enthusiasts joining through the community and media channels. Friends who have not joined yet hurry up and participate! Figure丨Hot Pot is active in mainstream media at home and abroad Technology sector The Hotpot Fund V2 version has entered the final testing stage and is expected to be launched in mid-July, so stay tuned! Figure丨Hot Pot V2 official website UI preview Hotpot Fund is the world’s first decentralized investment fund. If you want to know more details, please contact us through official channels. Official contact Twitter :hotpot.fund Medium :hotpot-fund Email:hpt@hotpot.fund Telegram:https://t.me/joinchat/4KRrzP38MWY2ZDU1
https://medium.com/@hotpot-fund/hot-pot-fund-weekly-report-no-18-26d93eea8be9
['Hotpot Fund']
2021-07-06 05:59:30.106000+00:00
['Uniswap', 'Defi']
Bill Gates' new book, "How to Avoid a Climate Disaster," lists proposals that fall short of any real change. It shows how much we need to get the 1% out of our way.
Bill Gates' new book, "How to Avoid a Climate Disaster," lists proposals that fall short of any real change. It shows how much we need to get the 1% out of our way. Dear Bill, After reading your new book on climate change, we feel a lot of confusion. You admit yourself that you aren't the perfect messenger for the cause. You say, "I own big houses and I fly on private planes." Let's be clear here, Bill: you spent 4.7 billion dollars last year on purchasing Signature Aviation, a private jet company that makes millions of trips a year. You and your wife own 269,000 acres of land across nineteen states, more farmland than anyone in the nation, and that does not count the thousands of acres you have privatized around the country with your friends. You vacation on super yachts. One of your houses is famously 66,000 square feet. Saying "I own big houses" is not cute or coy, it is disingenuous. Of all the statistics you mention in your book, there is one that you leave out. It's a common statistic to anyone who cares about climate change. It goes: the richest one percent of the world's population is responsible for fifteen percent of the world's carbon emissions. According to Oxfam, as recently as September 2020, "This is more than twice as much carbon pollution as the 3.1 billion people who make up the poorest half of the planet." This begs a pressing question. It's a math question, so you might know this one. How are we supposed to get to zero if you and your buddies who are responsible for 15% of all carbon emissions don't make drastic changes? Come on Bill. If you really believe what you say you do about climate change, your carbon footprint should not be 10,000 times the average person's. In the book, you explain how you think you can pay your way out of guilt, but as you say yourself, no amount of money will be able to compensate for the future lives lost by not taking action now. Your commitment to transition your private jets to more sustainable jet fuel is not only out of touch and a slap in the face (it's hard to believe you said that, seriously), but also symptomatic of bigger problems that you and your billionaire friends represent and impose on the rest of us. Let's review some of them here. As you mention, there have been activists pressuring you to move your billions of dollars of investments from fossil fuel to better causes, which you did not do until 2019. In the book, you say it's because you would feel terrible if you benefited from oil stock going up, but until 2019 it had been going up and you had been benefiting from it. You expect us to believe that right as oil prices plunge to despairing depths and are no longer a money maker, you, at the same time, had some Dickensian dark night of the soul over your oil investments? Come on Bill. Do not scapegoat climate change for what was clearly a coincidence. It's not a great look to assume we are stupid. In the book, you talk about the praise you received for giving 445 million dollars to coronavirus relief efforts. Your wealth also grew by 20 billion dollars during the pandemic. There is only one thing more magical than fertilizer, Bill: your philanthropy money. If your job is to give away money, how do you have more of it than ever? In the book, you throw around a lot of numbers that are supposed to impress us. Have you heard the story of the widow's mite?
It’s a biblical tale that makes all your donations annoying and moot but you probably aren’t the type to give credence to the everyman’s Christian sentiments. Like you, we like numbers. Us, Americans together, donated 449 billion in 2019. Unlike you, what we give requires an actual sacrifice. We don’t get leverage back, or public praise, or more returns than we gave in the first place. We get nothing. You have ill gotten gains, monopoly money, literally. The poor give more percentage-wise than you and your buddies do with your million dollar pennies that aren’t so shiny and really aren’t that much. In the book, you say, “Every four to eight years, a new administration arrives in Washington with its own energy priorities..but it takes a toll on researchers who depend on grant money and entrepreneurs who rely on tax incentives…” While we can only imagine how much our democracy is an impediment to your tax incentives, maybe it is a glimpse of how those 800,000 school children felt back when you scrapped funding for small schools in 2009. Remember when you were so certain it would be the answer to what is wrong with U.S. Education but it wasn’t? You used government money to experiment on children in over 2,606 schools. Many students on track to graduate dropped out when you dropped funding. You drove good teachers out of struggling schools and many of those schools never recovered. The way you feel about using government money for your experiments, is the way any charity feels that is dependent on your money for their cause. It’s one of the problems with charity right? Except for your income, it isn’t sustainable. You and Melinda have single-handedly changed our education system for the worse. You have lobbied, paid for and advocated for charter schools where our teachers cannot unionize, teachers get paid 10–15% less, schools are not accountable to the state because they are private and students do not perform better overall. And yet, you and Melinda refuse to acknowledge your privilege and power over our education system. In the book you say, “I’m funding a project that involves building a computer model of all the power grids covering the United States.” Why do you have access to our power grids? You aren’t a voted official. You aren’t a trusted, neutral, public entity. There is a problem with your access to information and data and people that you brag about so breezily in this book. Remember what happened in 2013 when you got together with Rupert Murdoch and used 100 million dollars for a database that stored the personal information of millions of school children? Remember when you got caught storing their addresses, social security numbers, test scores, any documented learning challenges and extra curricular activities and then sold that information to companies that make educational materials, remember? This would be fine with parental consent, but you didn’t acquire it. There’s another incident which you famously did not acquire a guardian’s consent. All those conspiracy theories are so crazy right? Let’s set this record straight, it’s important that you know we don’t forget things. You teamed up with a Non-profit in Seattle called PATH short for Program for Appropriate Technology in Health. PATH went to India with an experimental HPV vaccine which they injected into around 16,000–23,000 young women, ages nine to fifteen. (The numbers vary greatly because of all that mishandled paperwork.) 
Months later, scores of girls became sick and were hospitalized, and seven of them died, but because there wasn't a system for keeping track of adverse effects of the vaccine, it was hard to tell whether these girls died of natural causes or because of vaccine-related side effects. India's federal government got involved and called the vaccine trials "shockingly unethical"; they found that thousands of consent forms had been forged. You don't take accountability for things, even when your mistakes cost lives or cause harm. In Africa, you tried to eradicate polio with an oral medication that contained a live strain of the virus which, in unsanitary conditions, can mutate and cause a child to contract polio. More children died of the vaccine than died of wild polio in one year because of your negligence. You are not a doctor. You do not have a degree in medicine, and you are buying too much power over our health. In the book, you are very proud of all the work you've done in India and Africa. You say, "The plight of poor farmers, as well as the impact that climate change will have on them, is something I have learned a lot about over the past two decades through my work on global poverty. It's also a passion of mine because I get to geek out on the fascinating science behind plant breeding." Let's talk about your love for GMOs and Monsanto at the expense of small farmers. In the book you talk about the necessity of higher yields for more income as a way to rescue small farmers from poverty. You work with companies like Monsanto, though you never mention their name in the book, but it's a company that you also have a shareholder stake in. Of course, you are more than happy to go to poor places and give Monsanto seeds away, locking farmers into a vicious cycle of paying future royalties and putting them in a position where they have to buy new seeds every year. Farmers have to use copious amounts of fertilizer and invest in a lot of heavy machinery to grow those GMO seeds. In 2020, there was a study (this is critical, Bill), False Promises for a Green Revolution in Africa, which said that poverty has increased by 30% in the 18 countries you and your wife have worked in. There's another biblical phrase that comes to mind, 'By their fruits ye shall know them.' What have you done, Bill? It is common knowledge that your statistics on poverty are misleading. Hundreds of thousands of farmers have killed themselves in India because of how your GMO seeds have flooded the market at incredibly high prices. Just this week, a faith-based group in Africa wrote you a letter asking you to change your tactics on their continent. They want to plant their own local seeds. They want to continue a tradition of farming that has sustained them for years. Are you going to listen to them? Or, like every other white, imperialistic, paternalistic colonizer, are you going to keep pushing your materials, your technology, your way, for your money? We will be watching this closely, Bill; no one holds you accountable, yet. In your book you say something so jarring and hollow, it has to be read a couple of times to sink in. You say, "So if you want a measuring stick for which countries are making progress on climate change and which ones aren't, don't simply look for the ones reducing their emissions, look for the ones setting themselves up to get zero.
Their emissions might not be changing much now, but they deserve credit for getting on the right path.” While this statement might hold water among your TED-head friends and your Davos wannabe glam people, who are used to making empty promises for praise, in the real world we get credit after achieving something. There is a problem when people with power think their words are actions. You’re just a genius, Bill. You aren’t God. In your book, you tell Joe Biden to quintuple clean energy and climate-related research and development. You suggest going from a budget of 22 billion a year to 37 billion a year, and here is where you again get the order of things wrong. You say, “The government needs to make bigger bets on high-risk, high-reward R&D projects. This is especially true of scientific enterprises that remain too risky for the private sector to pursue.” You sound like Oliver Twist: “Please sir, can I have some more?” Have you seen the rocket ships explode at SpaceX? With their own money and everything! The government’s job is not to take risks. The government’s job is to do what works for the people. Your job is to take risks. Your job is to make something work and sell it to the government if it does work. I guess this poses a problem for you, though. Most of what you come up with hasn’t worked. You insist that “Without demand for motivation, inventors and policy makers won’t have any incentive to push out new ideas.” Bill, policy makers should be motivated by their constituents and the set of values they were voted in for. And we are talking about capitalism in America. You mean that if you aren’t incentivized with money, then you can’t push out new ideas? Because entrepreneurs and inventors like us run on a compulsive inner spark, which is a very different light from the glint of a golden coin. You clearly don’t know which is which anymore. You should not have written a book about climate change, Bill. You should not have felt comfortable enough to write a book on climate change, let alone fly around the world in your little jet to promote it and charge a personal profit for it! You are way too comfortable. Maybe it’s those cashmere sweaters (which don’t fool us; you are nothing like Mr. Rogers). When your best idea for everyday Americans fixing climate change is to buy electric vehicles (how can you say nothing of bikes and biking infrastructure?! Obvi!), or to buy synthetic meat that you happen to have stock in, your ideas are not good enough. We don’t want any more of your stale experiments; we don’t want any more of your failed attempts to fix things that leave them more broken than before; we don’t even want your charity money, Bill. We want you and your friends to pay your fair share of taxes. As you go around the country touting your hypocritical crap, you should be very careful. This week you’ve also been speaking out against raising the minimum wage, which some might consider the rhetoric of class warfare. Enjoy your cake. While you have it. Sincerely, Blue-collar workers who are sick of your shit.
https://medium.com/@unlimitedsuns/dear-bill-e43ec57d2a2e
['Unlimited Suns']
2021-03-08 16:50:06+00:00
['Climate Change', 'Book Review', 'Bill Gates', 'Philanthropy', 'Corruption']
What OnlyFans Can Teach Us about the Evolving Role of Pleasure in Work
If I were a betting person, I would guess that the most popular icebreaker question in any room of adults is, “So, what do you do?” This question has become a shorthand for “Who are you?” that people use to determine whether or not the other person is interesting enough to hold a conversation with. I’m not saying that this is right, I’m just saying that this is true. So much of our identity as adults has become wrapped up in how we make a living. We are, in some sense, where we work and who we work for. At least this was true in the “B.C.” times, the “before Coronavirus” times. These days it is, or it should be, rare to find a physical room full of strangers of any age holding a conversation about anything, let alone work. After all, where we work, who we work for, and whether or not we’re working at all has changed so much this year. It is an understatement to say that the Coronavirus (COVID-19) pandemic has changed the nature of work. A Pew Research Center survey released in September found that a quarter of U.S. adults said that either they or someone in their household was laid off or lost their job because of the Coronavirus outbreak. The survey also found that many workers who didn’t lose their jobs still had to reduce their hours or take a pay cut due to the economic fallout from the pandemic. COVID-19 has forced people to either consider who they are outside of their work or to find work that better reflects who they are. The social distancing measures brought on by the pandemic have inspired more workers of all backgrounds to supplement their income by becoming content creators who make a living by being themselves. The rising popularity of OnlyFans in 2020 [1] is one of the most prominent examples of how the nature of work has evolved in tandem with social media. As a scholar who examines the influencer marketing industry in the United States, I have been amazed to witness the exponential growth of OnlyFans this year, primarily because it is a platform where the line between “influencer” and “sex worker” is blurred. This doesn’t surprise me, because as sociologist Angela Jones explains in her book Camming: Money, Power, and Pleasure in the Sex Work Industry (released in February 2020), sex workers are motivated to perform labor for a myriad of reasons, many of which are not just about money. I have learned that this goes for influencers too. In pairing these materials — OnlyFans and Camming — I hope to highlight the overlooked role of pleasure in work, especially during these times of increased precarity. For the uninitiated, OnlyFans is a subscription-based content-sharing website where “Creators” (Users who upload content to be viewed by other Users) charge “Fans” (Users who follow Creators and view the Creator’s User-generated content) to see their photo and video content through paid subscriptions of generally $5 to $50 a month. Imagine, if you will, that Patreon and Instagram had an offspring that was only accessible on a desktop, and you would have a partial understanding of OnlyFans. I know that this reads like a description of a pretty “vanilla” content subscription service. That’s because OnlyFans is a pretty vanilla content subscription service. This is by design. I mean this quite literally. OnlyFans has a social media-oriented interface that mirrors the clean lines and minimalism of social media platforms like Instagram and YouTube. 
Marketing-wise, the OnlyFans company blog and social media pages promote mainstream celebrities and social media influencers who have accounts on the website, focusing primarily on lifestyle influencers in the fashion, fitness, beauty, and cooking realms while posting promotional videos similar in aesthetic feel to those put out by more “family-friendly” social media platforms like YouTube and TikTok. Its differentiator, and perhaps its unpromoted value proposition, is its loose terms-of-service guidelines, which make it easy for people to turn it into a sex work platform. Other social media sites like Instagram and TikTok are open to teens 13 years of age or older and specifically restrict pornographic material on their platforms. The primary restriction that OnlyFans has is that you have to be at least 18 years of age to make an account. The site’s design, along with its terms of service, may arguably help it to both humanize and normalize sex work by creating an environment where “influencers,” “celebrities,” “sex workers,” and anyone else who exists in between can be seen in the same space as just “Creator.” Sex work becomes destigmatized when the work is happening on a platform like OnlyFans that doesn’t market itself as a sex site. However, just because OnlyFans doesn’t promote itself as a platform for sex workers doesn’t mean that it isn’t benefitting from the selling of, or even the teasing of, sex. Take, for example, the case of Michael B. Jordan, People Magazine’s newly crowned “Sexiest Man Alive.” Jordan recently capitalized on OnlyFans’ association with sex when he announced on Jimmy Kimmel Live! that he plans to launch an OnlyFans account to raise money for a barber school. “Got an OnlyFans coming soon — eating fruit, all types of crazy stuff. It’s going to get wild,” he said to Jimmy Kimmel. Although People Magazine’s “Sexiest Man Alive” may never publicly promote a Pornhub account, it is socially acceptable for him to promote an OnlyFans account. Jordan used the site’s association with sex to his advantage even though it is likely that the content he will share on the platform will be PG-13 at best. His team understands that while sex sells, selling too much sex may tarnish Jordan’s generally “All-American Guy” image. He is the kind of Creator that OnlyFans relishes promoting publicly. He’s sexy, sure, but in a “safe-for-work” (SFW) way. In her book Camming, Angela Jones focuses on the erotic webcam industry, also known as “camming.” Camming is a genre of indirect sex work, first emerging in 1996, in which cam models sell interactive, computer-mediated sex online. This description of “camming” doesn’t veer too far from the sex work that is performed on OnlyFans, but unlike platforms such as Chaturbate and Streamate that Jones examined in her study of the erotic webcam industry, OnlyFans, again, does not market itself as a sex site. Jones doesn’t mention OnlyFans in her book, but the similarities between the cam models she follows and the content creators that I follow in my own research are notable. OnlyFans represents the reincarnation of sex work in the gig economy age. Cam models and content creators are both independent contractors who get paid to be themselves, or at least to play a version of themselves, online. Like all gig workers, their wages can be inconsistent and precarious due to the business structure of the platforms they work on. Gig work is not necessarily new. 
As Jones notes in Camming, strippers are also independent contractors who have to pay “house fees” to work. Creators on OnlyFans pay “house fees” too. Users can create accounts for free, but OnlyFans keeps 20 percent of their earnings as a fee when they start to make money on the platform. There are many reasons that people turn to gig work. In addition to the promise of better wages are the promises of more flexibility and greater autonomy. Jones explains that these are also among the reasons that motivate people to perform Internet-based sex work. She also adds the following to the list: safer working conditions, a decline in risk exposure, and a greater potential to experience various pleasures. Jones describes camming as a form of legitimate labor that “monetizes human desires for sex, intimacy, and pleasure.” It is the aspect of “pleasure” that serves as her intervention in the scholarship on sex work and the broader discourse on labor. She argues that the literature on sex work does not highlight the ways that an online environment may foster a space where the workers themselves have a greater potential to experience pleasure. “Scholars have focused too much on the regulation of sex and have missed the point that the underlying motivation for sexual regulation is a fundamental desire on the part of societies to control pleasure,” Jones writes. According to her analysis of the camming industry, “clients” are not only paying for their own pleasure, but they are also paying to watch the cam model experience pleasure. To be clear, “pleasure” in this context is not always sexual in nature. To clients, webcam models are simply real people broadcasting themselves on the Internet. Jones concludes that pleasure is not only an initial motivation for camming; it is also often the reason people stay in the industry. While OnlyFans is not a traditional camming platform, it does share elements with the kinds of platforms Jones studied. In fact, I argue that pleasure also motivates and mediates the social interactions between “Creators” and “Fans” on OnlyFans, even in the more general instances where the “Creator” is not a producer of not-safe-for-work (NSFW) material. By selling intimacy, not just sex (if they choose to sell sex at all), Creators deliver or perform what Jones refers to as an “embodied authenticity” for their supporters. In other words, they make their fans feel like they are having an authentic encounter even if the encounter is the product of economic exchange. In the kinds of computer-mediated interactions that take place on camming sites and on OnlyFans, all parties are made to feel safe and more willing to be themselves because of the physical and psychological barriers of their screens. Jones’ study of the camming industry shows how pleasure can be experienced as a fundamental part of labor. In more traditional workplaces within a capitalist economy, the pleasure of the worker is sacrificed to drive profits for the employer. However, in the online world of personality-driven work, pleasure becomes a social experience that benefits all. And, because this is still capitalism we’re talking about, more pleasure leads to more profits for the independent contractor and the platforms they work on. Given the flexible working conditions of online sex work, erotic labor may now appeal to people across social classes who previously were unwilling or unable to perform sex work that occurs offline for reasons related to their identity or embodiment. 
Additionally, OnlyFans’ social media-inspired interface and marketing efforts are contributing to the normalization of sex work. While the potential for pleasure prevails, the gig economy is no utopia. As Jones’ study of the camming industry reveals, the digital platforms sex workers perform on can reinforce existing systems of oppression by conditioning the precarious wages that they, as independent contractors, can earn. In one chapter of Camming, Jones uses the case of a platform called MyFreeCams to call out the intricate ways that race-, class-, and gender-based inequalities are perpetuated in the camming industry. Her statistical analysis of MyFreeCams found that Black women, Latina women, and women from outside of the United States, Canada, and the United Kingdom were significantly less likely to be successful on the site. Jones concludes in this chapter that White supremacy is embedded within the camming field and that race should never be separated from analyses of sexuality. Additionally, the survey from the Pew Research Center that I previously mentioned revealed that Black and Hispanic Americans were the most likely to have faced deep financial hardship as a result of the coronavirus outbreak. This indicates that race should never be separated from general analyses of work, either. Since the erotic webcam industry is arguably a predecessor to (and a still-competitive alternative to) the OnlyFans business model, it makes sense to insert Jones’ timely book Camming into the current conversation about OnlyFans as a space for a newer form of precarious online work: “influencing.” If anything, this pairing shows that many of the obstacles workers experience offline, such as discrimination, may also be replicated in online work environments. Although online platforms like OnlyFans may help to create new opportunities for workers to become entrepreneurs and to craft for themselves a more appealing labor environment that foregrounds pleasure as a social experience, they do not resolve all of the problems that workers may have encountered in the physical world when it comes to finding and keeping a job.
https://anuliwashere.medium.com/what-onlyfans-can-teach-us-about-the-evolving-role-of-pleasure-in-work-1f4e36df2373
['Anuli Akanegbu']
2020-12-18 22:48:37.344000+00:00
['Work', 'Gig Economy', 'Sex Work', 'Pleasure', 'Onlyfans']
[Word] I Thought I Didn’t Need to Learn Select, Cut, Paste
https://medium.com/%E6%97%A5%E5%B8%B8%E5%85%AC%E5%BC%8F/word-%E4%BB%A5%E7%82%BA%E4%B8%8D%E7%94%A8%E5%AD%B8%E9%81%B8-%E5%89%AA-%E8%B2%BC-30044848c476
[]
2020-12-26 12:38:53.162000+00:00
['School', 'Student', 'Office', 'Word', 'Work']