id | project_link | project_description
---|---|---
10,008 | https://devpost.com/software/how-well-do-you-know-your-world | Title Screen
Game Play, Selecting Countries
Explore Mode, Studying for the Next Round
Perfect Score, Fireworks!
Inspiration
Geography is foundational knowledge that helps us build a better understanding of our diverse world. Too often we forget where a country is when we aren't exposed to it frequently, and when we forget where a country is, we lose its context. A country's neighbors are culturally related, after all. Games can teach geography through play. This project experiments with that kind of play in augmented reality.
What it does
In this game you're given a timed challenge to find countries in the world. Planet Earth hovers above the floor in front of you. As you move around it, aiming your camera at its surface, countries light up below a cursor. You must point at each country you're challenged to identify before time runs out. If you get them all, you're treated to fireworks. If you miss any, don't worry. After a round, you're given some free time to explore and study the globe before you tap to play again!
How I built it
The Earth that serves as your gameboard is rendered with real NASA data, a day and night side, reflective oceans, drifting clouds, and an atmosphere.
I started off by watching the tutorial by Blender Guru on making a realistic planet Earth in Blender. It used a different technology, but there were principles I could use when building my Earth in Spark AR.
For Earth's texture, I used imagery from NASA's Visible Earth and Scientific Visualization Studio projects. They provide high-resolution images of the Earth in the correct projection, equirectangular. I downloaded images for daytime, nighttime, clouds, topography, and a black-and-white land/sea mask.
For the country data, I used Michael Bostock's world-atlas. I used D3 to generate both a GeoJSON file and a color-coded PNG of this map of countries. With d3 and the GeoJSON map imported into Spark AR, I was able to raycast from the player's camera to the globe and use geoContains calls to see whether they'd hit the correct country.
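As a rough illustration of that lookup (a minimal sketch, not the project's actual code; the UV-to-longitude/latitude math and file names are assumptions): convert the raycast hit's UV coordinates on the equirectangular globe into longitude/latitude, then test the point against each GeoJSON feature with d3.geoContains.

```javascript
const d3 = require('./d3.min.js');              // bundled d3 build, assumed import path
const countries = require('./countries.json');  // GeoJSON FeatureCollection, assumed asset

// Convert an equirectangular UV hit point (u, v in 0..1) to [longitude, latitude].
function uvToLonLat(u, v) {
  const lon = u * 360 - 180; // 0..1 maps to -180..180 degrees
  const lat = 90 - v * 180;  // 0..1 maps to 90..-90 degrees (top of the texture is north)
  return [lon, lat];
}

// Return the GeoJSON feature containing the hit point, or null if the cursor is over ocean.
function countryAt(u, v) {
  const point = uvToLonLat(u, v);
  return countries.features.find(f => d3.geoContains(f, point)) || null;
}
```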
Challenges I ran into
Country Selection Highlighting
I wanted to challenge the player to find countries on a bare planet without all their borders visible. Instead, I wanted to show the player just one country at a time as they scrubbed their cursor over the surface of the globe. I didn't have 3D geometry for all of the borders of the countries, and those I found online had too many polygons. I decided to try using a color-coded texture. I used D3 to generate a texture in which every country has a unique color. I then figured out a clever trick to mask out all but a single country from this map. Let's say I wanted to highlight just the country colored red, (1, 0, 0). I put the entire map through a shader that finds each fragment color's distance from red. All countries have a non-zero distance except the red one. Now if I subtract that value from 1, all countries' values are less than one, other than the red one. Finally, I can raise that value to a high exponent, and all the other countries' values drop to near zero, except the red one. From there, I can use that value as a mask to get my highlight.
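Written out as plain JavaScript purely for illustration (in the effect this math runs per fragment in the patch shader; the exponent here is an assumed tuning value, not one from the project):

```javascript
// Per-pixel highlight mask: ~1 near the target country's colour, ~0 everywhere else.
// color and target are [r, g, b] arrays in 0..1; exponent is an assumed tuning constant.
function highlightMask(color, target, exponent = 64) {
  const dr = color[0] - target[0];
  const dg = color[1] - target[1];
  const db = color[2] - target[2];
  const distance = Math.sqrt(dr * dr + dg * dg + db * db); // 0 only for the target colour
  const closeness = Math.max(0, 1 - distance);             // 1 only for the target colour
  return Math.pow(closeness, exponent);                    // pushes every other country toward 0
}
```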
Pausing a Signal
As a player is highlighting countries, searching for the correct one, the camera orientation signal is ultimately driving the position of the cursor and the country that is highlighted. When the player hits the right country, I wanted to pause the selection of that country to allow time for the correct answer to register. This was challenging because I knew of no way to pause a signal. Luckily, I came up with a solution using the "Offset" patch. Since this patch measures the difference between the current signal and the signal's value the last time the patch was reset, I was able to use this difference to calculate the signal from the past. I suspect I could have used signal histories or signal recorders to achieve the same effect.
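As a sketch of the same idea in script form (the project used the Offset patch, not this code; the patch names here are assumptions): the Offset patch outputs current − valueAtLastReset, so subtracting its output from the live signal recovers valueAtLastReset, i.e. the value frozen at the moment of the reset pulse.

```javascript
const Patches = require('Patches');
const Reactive = require('Reactive');

// Assumed patch outputs: 'selection' is the live scalar driven by the camera orientation,
// 'selectionOffset' is an Offset patch fed by 'selection' and reset when the player scores.
Promise.all([
  Patches.outputs.getScalar('selection'),
  Patches.outputs.getScalar('selectionOffset'),
]).then(([live, offset]) => {
  // offset == live - valueAtLastReset, so this difference is the frozen value.
  const frozen = Reactive.sub(live, offset);
  return Patches.inputs.setScalar('pausedSelection', frozen);
});
```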
Accomplishments that I'm proud of
I'm really happy with how the Earth came out. It has about as much detail, animation, and realism as I could pack into an AR application. In particular, I'm proud of solving the problems of lighting and atmosphere.
To light the Earth realistically, I used NASA's separate day and night textures. I supply the position of the "Sun" directional light in the scene as input to the Earth's surface shader. I then use it to transition from day to night in a realistic way.
Another feature that worked well was the atmosphere. I created a Fresnel shader out of patches and used it to create a blue haze that hugs the edges of the Earth and thickens as you look through the atmosphere at a lower angle.
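For reference, both effects boil down to simple dot products; here is a minimal sketch written as plain JavaScript (the remapping and the Fresnel power are assumed values, not taken from the project):

```javascript
// Day/night mix: ~1 on the sunlit side, ~0 on the night side.
// normal and sunDir are unit vectors [x, y, z].
function dayNightMix(normal, sunDir) {
  const d = normal[0] * sunDir[0] + normal[1] * sunDir[1] + normal[2] * sunDir[2];
  return Math.min(1, Math.max(0, d * 0.5 + 0.5)); // remap -1..1 to 0..1 and clamp
}

// Fresnel term: stronger haze where the view grazes the surface.
// normal and viewDir are unit vectors; power is an assumed tuning value.
function fresnel(normal, viewDir, power = 3) {
  const d = normal[0] * viewDir[0] + normal[1] * viewDir[1] + normal[2] * viewDir[2];
  return Math.pow(1 - Math.max(0, d), power);
}
```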
What I learned
Through completing this project I learned more about:
Advanced 3D rendering techniques in Spark AR
Algorithms and patterns in Patches and Reactive programming
Organization and re-use with Groups and Patch Assets
Bridging Patches and Scripts
Incorporating 3rd party libraries like d3.js
Animating UVs and 2D UI elements
Particle Systems
Audio
Effective debugging in Spark AR
What's next?
There are some countries that are too small to effectively select in AR. For now, you just aren't quizzed on the smallest countries. I'd like to correct that, as the small countries are very important.
There are some ways of optimizing textures I'd like to try. Some of the imagery I'm using is black and white, and could be combined into the RGBA channels of a single image.
Built With
augmented-reality
d3.js
spark-ar
Try it out
www.instagram.com
www.facebook.com
github.com
www.instagram.com |
10,008 | https://devpost.com/software/augmented-business-card | Creating the 3d Model
3D model in Spark AR
3D Model When tracked on the business card
Inspiration
I was in charge of designing the business card and creating various marketing materials for a local crafting company. I soon realized that it would be much easier to spread the word about the company's products if clients could have a glimpse of what they looked like before purchasing them. Especially during the pandemic, people are less likely to go out and would rather purchase products online, and sometimes images just aren't enough. In order to add more interactivity and help small business owners in this time of need, I created the augmented reality business card.
What it does
When the camera detects the business card, you are able to see the jar, which is identical to one of the company's products.
How I built it
I created the model in 3D modeling software and the business card in Adobe Illustrator. I imported both into Spark AR and attached a target tracker.
Challenges I ran into
I tried to make multiple UI components to link a second 3D model so that the client could see various other products which the company sells. Unfortunately, I wasn't able to figure it out, so I reverted to this simple target tracker. I also had trouble getting the jar texture to look like the glass texture in the original 3D model I created. Luckily, I figured out that an ORM texture can create a similar glassy effect.
Accomplishments that I'm proud of
I am proud to have created a simple AR project which I can apply directly to an existing business to help expand their online presence.
What I learned
I learned that there is a lot of work that can be done in augmented reality. I'm looking forward to learning about what else I can do in the future with some practice.
What's next for Augmented Business card
Next up, I will try to implement the other 3D models so the client can toggle through other products from the company.
Built With
blender
dimension
Try it out
www.instagram.com
github.com |
10,008 | https://devpost.com/software/synth-wheel | I wanted to create a musical visual effect in Spark AR Studio using the audio spectrum analyzer. For musical content in Instagram stories, it adds an interesting visual to encourage viewers to turn on the sound.
So I experimented with creating an effect that uses code-generated musical pitch tones from a sampled sound to drive the animated visuals of Synth Bot.
Some of the interactions are:
*Tap and hold the platform button to change the number of circling rods and alter the pace of the drum beat
*Tap and hold while dragging the bot to alter the tone pitch
*Tap on the bot to play a randomized scripted synth tone progression.
In the latest version of Spark AR Studio, I used audio analysis to drive various interactive visual outputs based on the pitch frequency of the sound clip. The procedurally driven tones are based on a note-progression dictionary of semitone numbers in the JavaScript.
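A minimal sketch of how such a semitone dictionary can be turned into pitches (the note lists and the A4 = 440 Hz reference are assumptions for illustration, not the project's actual data): each semitone step multiplies the frequency by 2^(1/12).

```javascript
// Illustrative note-progression dictionary of semitone offsets relative to an assumed A4 = 440 Hz.
const progression = { intro: [0, 4, 7, 12], hook: [0, 3, 7, 10] };

function semitoneToFrequency(semitone, baseHz = 440) {
  return baseHz * Math.pow(2, semitone / 12); // each semitone is a factor of 2^(1/12)
}

const hookFrequencies = progression.hook.map(n => semitoneToFrequency(n));
// -> [440, ~523.3, ~659.3, ~784.0] Hz
```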
I imported a 3D model of the Musical Bot that I created in Blender and textured in Substance Painter.
I then modeled the platform and vortex objects and rearranged the UV mapping in the model so I could apply animated shader effects.
With this project I learned about the Spark AR API changes since the previous version, as well as tips on creating procedurally generated tones from a sample sound file.
I hit some obstacles during development: organizing components into blocks, and the limited support for interconnecting audio data types in Patches. I also encountered corrupted project files while playing with the latest version; the same project wouldn't load after restarting the editor. Thankfully, I recovered after several restarts and rebuilt from older stable check-ins.
After the contest, I plan to modify the project to place the character as head decor for front-camera selfies, controlling the animation with the microphone and facial expressions. Perhaps I'll create a new AR instrument or a musical training game based on this project. The possibilities are endless.
Built With
blender
javascript
sparkar
substancepainter
Try it out
www.instagram.com |
10,008 | https://devpost.com/software/dance-bot | Dance Bot
Inspiration
Our team is a big fan of dancing. We loved all the new trends emerging on the internet, from viral TikTok dance challenges to professionals sharing their amazing choreography. We wanted to create a World Effect that would encourage us to dance a little more with our friends and capture memories.
Because of COVID-19 we understand that we have to remain physically distant, but we know that this World AR effect can work from a distance while we dance with our friends and loved ones.
Dancing and physically moving is a great way to improve health, especially while we need to remain six feet apart.
Our filter supports social distancing efforts to flatten the COVID-19 curve, while promoting health and exercise in a fun and easy manner.
What it does
This effect records people dancing along to our Dance Bot. The recorded video can then be shared online and encourage others to try our Dance Bot challenge. There are a variety of different dances in our filter so users can have fun dancing to each one.
Our filter requires a minimum of two people. One person holds the camera to record and reads out the dance instructions to the person being filmed while dancing. This allows people to connect and engage with each other while being active and healthy. We encourage users who are not quarantined together to dance with Dance Bot six feet apart.
How we built it
We used Spark AR to create the filter and Blender to create and animate Dance Bot. Spark AR is great as it allows an easy and accessible way for users to be able to try our filter.
Challenges we ran into
This was the first time either of us had used Spark AR to this capacity. We had to learn how filters are made, starting from new concepts like textures and materials. The duration of this hackathon gave us time to learn so we could build Dance Bot.
Some challenges we ran into were learning how to use patches, exporting Blender elements (especially to meet file size requirements!), and learning how to design a user's experience with AR.
Accomplishments that we're proud of
We're proud of being able to build Dance Bot. We're proud of being able to share it with our friends and now the world on Instagram. We hope people have fun dancing along with our Dance Bot!
What we learned
We learned more about 3D modeling and animation concepts like rigging and textures. We learned more about how we could use Spark AR to build filters, and we're excited to continue building and sharing filters. Spark AR is an amazing tool, and it's really cool to see how it can be used to create AR effects that can be shared and accessed by anyone with a phone and an Instagram account.
What's next for Dance Bot
We'd love to add more dances and interactions. We're thinking of adding dances based on different cultures and regions, as well as making the dances more inclusive by taking into account different physical abilities (e.g. users in wheelchairs).
We hope to see people engaging and dancing with Dance Bot and would love to go viral, #DanceBotChallenge anyone?
Built With
blender
figma
sparkar
Try it out
www.instagram.com |
10,008 | https://devpost.com/software/be-kind-c3sn0z | Logo
In Street
Inspiration
Whenever I am unsure of how to behave, no matter the situation, I confront it with love and kindness. I am not always successful but I try. I think the world needs more kindness and compassion. To recognize that we are more similar than different and all desire the same level of safety in our daily lives. I wanted to create a filter that reminded people to be kind, towards others and towards themselves.
What it does
This is a World Effect with 2D and 3D components. The text pulses towards the user and, upon a tap, the side hearts pulse to the sound.
How I built it
The 3D letters and heart were modified in Blender.
The 2D writing was made in Photoshop.
Everything was built within the Spark AR editor.
Challenges I ran into
I originally wanted to use emitters that react to the audio analyzer, but it is apparently not possible at the moment to modify the scale of particles while they are being emitted. Then I wanted to use hand tracking to emit hearts from the palm of the user's hand facing the 3D text, but Hand Tracking & Select are not compatible with Instagram.
Accomplishments that I'm proud of
This is my fourth filter to be published and my first Hackathon. I am proud to have pushed myself through to this hackathon. I am glad to have tested the audio analyzer and Energy meter - both new tools to me since my last filter. Can't wait to have Hand Tracking. I enjoy creating filters and this hackathon pushed me to dive in again!
What I learned
I would have liked to spend more time on creating the 3D assets and an animation that would have enhanced the filter. I also learned that sometimes simple is best: not every filter needs emitters and things flying all around (as I usually like to have); sometimes the words can simply resonate. The Spark AR Creators community on Facebook is my guide for all the filters I create. People are extremely helpful and supportive and I have learned a lot from them.
What's next for Be Kind
I'd love to update it with hand tracking so that people can put out their hand and have hearts flying out of their palm, giving love to the world. Whenever you give, you get it back, and you're more inclined to give again and again.
Built With
blender
photoshop
sparkar
Try it out
www.instagram.com |
10,008 | https://devpost.com/software/birthday-cake-bakar | Make a customized birthday cake celebration with AR
Inspiration
What's a good way to be there for someone's special occasion without actually being there? By sending them an AR cake customized just for them, of course. Especially during this COVID-19 pandemic, we're all so far away from some of our closest people. We figured it would be amazing to have something that would bind us all together and light up the face of your loved one. A customized AR cake seemed like the perfect way to do so, and the feature of adding media from your gallery makes it an amazing, sweet, and thoughtful way to celebrate.
"I sure can’t wait to have someone send me my AR cake!"
What it does
Birthday Cake BakAR is a World Effect AR filter that allows you to choose a birthday cake, with the bonus feature of adding a personal photo or video from your gallery - something close to your heart. This is a personalized message to a loved one for their birthday, or possibly even for other occasions that would be perfect with a cake. Melt your loved ones' hearts by lighting a candle and choosing a lovely AR cake for them. Once you put out the candle, a happy birthday song plays and some balloons and confetti fly over.
How I built it
The filter was built in Spark AR Studio. We used JavaScript to create the UI picker with a selection of cake options, and used the Patch Editor to control the screen interactions for cake size, cake position, tapping to light the candle, and tapping the candle flame to put it out.
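A minimal sketch of such a picker via Spark AR's NativeUI module (the texture names and the number of cake options are assumptions, not the project's actual assets):

```javascript
const NativeUI = require('NativeUI');
const Textures = require('Textures');
const Patches = require('Patches');

(async function () {
  // Assumed icon textures, one per cake option.
  const icons = await Promise.all([
    Textures.findFirst('cakeIcon0'),
    Textures.findFirst('cakeIcon1'),
    Textures.findFirst('cakeIcon2'),
  ]);

  const picker = NativeUI.picker;
  picker.configure({
    selectedIndex: 0,
    items: icons.map(tex => ({ image_texture: tex })),
  });
  picker.visible = true;

  // Forward the chosen index to the Patch Editor, which swaps the visible cake (assumed input name).
  picker.selectedIndex.monitor({ fireOnInitialValue: true }).subscribe(event =>
    Patches.inputs.setScalar('cakeIndex', event.newValue));
})();
```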
To make the celebration effect more personalized, we added a gallery texture and gallery picker so the user can upload media from their camera roll.
A celebration must come with a surprise, so we used the Patch Editor to create an animated transition for the balloons and confetti flying through the environment along with a happy birthday song.
All 3D models came from the Spark AR Library.
Challenges I ran into
Setting up the world-effect confetti and balloons to fly at the correct angle for the camera view was challenging; it took lots of trial and error to make it work.
Accomplishments that I'm proud of
We learned Spark AR Studio and created an AR effect in a very short period of time. Considering no one on our team had experience working with Spark AR before, being able to bring our vision to a working AR filter is one thing we are proud of.
What I learned
We learned a lot about creating AR effects in Spark AR Studio.
What's next for Birthday Cake BakAR!
We want to allow users not only to pick their cake option but also to decorate the cake by adding layers on top of it.
Built With
javascript
photoshop
spark-ar
Try it out
www.instagram.com
github.com |
10,008 | https://devpost.com/software/mesa-viejuna-6j0qng | Inspiration
Those who were born in the '80s know what it was like to be a gamer in the '90s. Let's go back a few decades in time!
What it does
Mesa Viejuna, or Retro Table, uses augmented reality to give us an immersive experience of a classic gamer's table. An old CRT tube TV, an NES, and a joystick were the only things we needed to enjoy ourselves!
How I built it
Thanks to many years of involvement in 3D games and rendering techniques, this filter shows a nice-looking effect. Together with pre-rendered wrapped textures and light emission from different sources, Mesa Viejuna provides an experience very close to reality.
Challenges I ran into
Augmented reality is an amazing technology, but it also needs a little bit of art to make things come to life. The right positioning of lights and textures compatible with the environment were challenges that only an artist could understand.
Accomplishments that I'm proud of
Very clean, simple, immersive work.
What I learned
Spark AR is a very complete system and its designer is quite simple to use. It has a very complete library of effects and processing.
What's next for Mesa Viejuna
The original idea was to be able to play a little bit with the objects in the scene. The next step is to take, for instance, the joystick and start to move Mario around.
SPECIAL THANKS TO NINTENDO: the images are artistic representations of the original copyrighted material for the Nintendo NES and Mario Bros.
Built With
art
blender
particle
texture
Try it out
www.instagram.com |
10,008 | https://devpost.com/software/undercover-escape-from-police | I escape from the police!
WHAT?? I am the POLICE!! ahahaha
I chase myself! ahaha
Me in a Lamborghini!
Logo for my filter (edited in Photoshop)
I started using and discovering the world of Spark AR about a year ago. I didn't think I would get this far and acquire such advanced skills with this software.
Using the Patch Editor was one of the biggest challenges because at the beginning I did not understand how it could be used and in fact I found it very difficult, but then over time I learned to use it well until I became dependent on it. In addition to this filter I have made many, many others, and I hope that going forward both the old ones and the new ones that I will invent and make will be more and more successful.
Built With
patch
sparkar
Try it out
www.instagram.com |
10,008 | https://devpost.com/software/maze-ar | Inspiration
I have always had a poor sense of orientation and a short attention span, and mazes help to solve this problem. I wanted to make this game to help other people, but adapted to augmented reality. As for aesthetics, I was inspired by early-90s retro games such as Doom (1993), Catacomb 3-D (1991), and Blake Stone: Aliens of Gold (1993), in which you were totally immersed in the first person in the level.
What it does
The game consists of a labyrinth you have to enter. To do this, place it on a fairly large flat surface and set it to your size, then tap on the door and you will be able to enter. At the entrance you will find a double screen; tap on it and you will discover the object you have to search for. Go around the maze until you find this object, and tap on it to find out whether it is the correct one. This game helps develop attention and concentration, because you have to remember where you have already gone so as not to return to the same place. It also helps develop spatial perception and orientation. You can play alone or share a device with friends, which makes it even more fun!
How I built it
I built it with Spark AR. I wanted to show the possibilities of this program without having to program or add JavaScript; everything is done using the Patch Editor. I made the 3D objects with Cinema 4D.
Challenges I ran into
I ran into problems making movement through the maze fluid, and also with the size of the file; that's why I decided to do it all with polygonal shapes and no textures, just playing with light and colors.
Accomplishments that I'm proud of
I am proud that, with no programming knowledge, I managed to make a game by myself that is suitable for all audiences and helps other people develop their skills. I am also proud of being able to adapt traditional games to new technologies such as augmented reality.
What I learned
I learned to carry out a large project while adapting to already-written rules such as Instagram's.
What's next for Maze AR
I would like to add new mazes and new challenges like finding the objects in a limited time.
Built With
cinema4d
photoshop
sparkar
Try it out
www.instagram.com
www.facebook.com |
10,008 | https://devpost.com/software/what-arthur-are-you | Inspiration
I have always loved Arthur growing up, so I wanted to take on the challenge and create my own Instagram Arthur filter.
What it does
It socially connects people virtually so they can send each other which Arthur character they got.
How I built it
Using the tools in Spark AR and watching YouTube tutorials to guide me.
Challenges I ran into
Figuring out how to connect the diagram of steps together on Spark AR.
Accomplishments that I'm proud of
Building my own filter to connect more people throughout the world.
What I learned
I get excited creating my own filters and I want to create more in the future.
What's next for What Arthur are you!
Adding another option to flip the camera so that the environment around each of the characters changes with each interaction as well.
Built With
sparkar
Try it out
www.instagram.com |
10,008 | https://devpost.com/software/jungle-friends | Inspiration
I was inspired by the project showcase on the official Spark AR website. For Jungle Friends, I was inspired by the love of nature and how animals are friendly and joyous around people who love them. People and kids who aren't as fond of animals are normally scared of them, and hence cannot pose with a real animal around or on them. So I used the capability of AR to bring that experience virtually.
What it does
This world effect brings your friends from the jungle to you! A cute baby elephant dances on the ground, and a happy monkey sits on your head and poses when you say cheese!
How I built it
I downloaded some models from the internet, then rigged the monkey and animated it with Mixamo. However, the elephant wasn't rigged or animated; it was just a static model. I used Photoshop to create textures for the models, and Blender to modify the mesh, rig the elephant, and animate it to dance. I placed the elephant and birds on a plane tracker so that the elephant shares a common ground with the person taking a picture or video with it, while the birds chirp and fly around above the elephant. I used a head-and-bust occluder to place the monkey on the user, and person segmentation for occlusion with the elephant, birds, and monkey. I added some audio effects for fun and immersion.
Challenges I ran into
The rigging and animation of a static model was a bit of a challenge, as I did not have much experience doing so. I spent around half a day learning and animating the elephant for the desired dance animation.
The monkey was previously placed on the shoulders. However, the head occluder occludes only a round, head-shaped region, so the person looked as if they were bald. I used person segmentation to avoid that problem, but this gave rise to another issue: now the monkey looked like it was placed behind you instead of sitting on your collar. I asked on the community page whether there is a way to make a custom-shaped mask around the person-segmentation mask, but it wasn't solved, so I placed the monkey on the head instead.
This was my first project in Spark AR, so it was a little tedious to get the hang of the Patch Editor. Spark AR also crashed a lot towards the end of the project.
Accomplishments that I'm proud of
I'm proud that I learnt Spark AR; it's a very powerful tool for developing AR filters, though some improvements are still needed. I'm proud to have completed my first project as part of this hackathon. I am also happy to have learned Blender and animation in it.
What I learned
Spark AR
Blender and animation
Using Patch editor
What's next for Jungle Friends
Jungle Friends could offer a variety of animal friends in an option picker, letting the user select an animal to share their ground. It could see a lot of users and their kids enjoying pictures with their animal buddies.
Built With
blender
photoshop
spark-ar
Try it out
www.instagram.com
github.com |
10,008 | https://devpost.com/software/virtual-gallery | My inspiration was the lockdown, when everyone had to stay home and be safe. Businesses lost money because nothing was open, but artists had a perfect opportunity to show their work.
You only need a phone and an Instagram account. This virtual gallery is a filter: you can place it in a room, scale and rotate it, and after that - welcome inside.
I built it using Spark AR Hub and Blender for the 3D assets.
I also had to find art which fits the whole design.
I am very proud of this project because it looks realistic. Usually people use IG filters only to change colors or smooth skin, but this kind of project helps people: it is fun, it is AR, and there are not many proper galleries like it. Also, people don't realize that it is possible to walk inside and turn around, and that everything is in your room.
I think my next project will be something similar. I want to make a frame that can be put on the wall, with pictures uploaded from the phone gallery, also through IG.
Built With
blender
sparkar |
10,008 | https://devpost.com/software/ar-school-for-kids | Inspiration
With no cure for COVID yet, most parents are considering having their kindergarteners skip this school year to avoid the risk. Being the teacher hasn't been any easier either. Most countries are using video apps to learn, but most schools here have come to a halt, mainly due to low internet connectivity, despite smartphone use greatly increasing. Since all one needs to do is load the effect, it doesn't incur a high internet charge, it makes learning interactive, and it makes a parent's job easier.
What it does
A kid's curriculum includes learning the alphabet, numbers, colors, etc. If today's lesson is the alphabet, the child can open the camera and tap on the letters in a world view, making it interesting, making the parents' work easier, and giving the child a fun, more memorable way to learn.
How I built it
I used Photoshop for materials such as the end screen and board, and Spark AR for transforming my idea into reality, with help from Sketchfab for the 3D letters.
Challenges I ran into
I had a couple of errors while scripting, and for the Patch Editor some options could be simplified. I also have a problem with the end screen: when I set its visibility to trigger on completion of more than three objects, it doesn't work.
I also had a UI picker with colors and numbers options, but the screen started flickering, maybe because of an incompatibility between the UI and a large number of 3D objects. For now it's just the alphabet, but we'll keep working on making the UI picker work in the future so there can be a variety of topics to learn.
Accomplishments that I'm proud of
Finishing the project and submitting it. Really proud. I used the Patch Editor despite being afraid it might be complex, and I will be using it more in the future.
What I learned
To use the patch editor and advanced features of Spark AR.
What's next for AR School For Kids
Introduction of more topics for kindergarteners, such as foods, numbers, and colors, and for other classes such as math, where there are equations and one taps on the right answer. Also adding more music to sync with the objects.
Built With
augmented-reality
Try it out
www.instagram.com |
10,008 | https://devpost.com/software/escape-room-y1egzb | Inspiration
Hello everyone! A few years ago I was involved in real-life escape quests, but then I decided that I wanted to focus more on design. Now, due to quarantine, there is no way to go somewhere to have fun. Therefore, I decided to make the ESCAPE ROOM quest directly on Instagram.
What it does
ESCAPE ROOM is an AR game in which you need to get out of a room within a certain time. You have exactly 90 seconds to open the door and grab the phone.
How I built it
I tried to make all the game logic using Spark AR patches. Only the timer is made using a script.
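As a minimal sketch of what such a 90-second script timer might look like (the patch input names here are assumptions, not taken from the project):

```javascript
const Time = require('Time');
const Reactive = require('Reactive');
const Patches = require('Patches');

const TOTAL_SECONDS = 90;
let remaining = TOTAL_SECONDS;

// Tick once per second, push the value to the Patch Editor, and stop at zero.
const timer = Time.setInterval(() => {
  remaining -= 1;
  Patches.inputs.setScalar('secondsLeft', remaining);   // assumed patch input for the countdown display
  if (remaining <= 0) {
    Time.clearInterval(timer);
    Patches.inputs.setPulse('timeUp', Reactive.once()); // assumed pulse that triggers the fail state
  }
}, 1000);
```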
To create 3D models, I used Cinema4D.
Some 3D models were purchased at https://3ddd.ru/
For the sound effects, I downloaded some free-to-use audio and converted it to M4A mono format to be used in Spark AR Studio.
Challenges I ran into
It was difficult to fit within the size limits suitable for Instagram, but I dealt with it.
Accomplishments that I'm proud of
I am very proud that I managed to do this project in Spark AR. I hope that in the future I will improve it.
What I learned
I learned a lot about creating AR effects in Spark AR Studio.
What's next for ESCAPE ROOM
Now I am working to ensure that this effect adapts to any language. I made a script that detects the language on your phone and automatically applies it in the mask, and I am working on implementing this in my filter.
Built With
cinema4d
javascript
sparkar
Try it out
drive.google.com
www.instagram.com |
10,008 | https://devpost.com/software/wayfarer-portal | Inspiration
One of the biggest issues in the current COVID pandemic season is traveler's block, as travel ports and stations are closed to prevent the spread. The importance of maintaining social distancing, of not letting the spirit of travel die, and the fascination with what AR could achieve were the key inspirations behind the project.
What it does
The Wayfarer filter lets you port into various tourist destinations (currently 3 and expanding) via portals with just a tap on your screen. It then lets you look around and move around the area to view the surroundings. It also shows a 2 m boundary ring around you to emphasize the importance of social distancing whilst travelling.
How I built it
The filter was built for Facebook and Instagram integration using Spark AR and Blender. The 360° images of the tourist destinations were captured from Google Maps Street View and further optimized in Photoshop and Spark AR. The radius rings were rendered in Blender.
Challenges I ran into
The generation of the 360° images, their quality management and optimization, as well as their placement in world space caused a few delays, but we largely coped with them.
Accomplishments that I'm proud of
It was our first major Spark AR project, and it currently works almost as neatly as expected and is user friendly.
What I learned
The benefits of AR, the integration of 3D models and their rendering with Blender and Spark AR, as well as image optimization.
What's next for Wayfarer Portal
We plan to expand the number of tourist destinations. We also plan to incorporate 3D models of famous places in the area using Blender GIS, which would provide further information on the tourist destination when the user interacts with it. Currently, the filter works in world space; it will be expanded to work in camera space as well, so the user feels as though they are in the environment around them when using the front camera.
Built With
blender
sparkar
Try it out
www.instagram.com
github.com |
10,008 | https://devpost.com/software/sphere-virtual-tour | Inspiration
I love to travel but can't due to the worldwide pandemic, and I'm also enthusiastic about augmented reality technology.
What it does
It displays a sphere that contains three different places; tap on the sphere to view the places virtually.
How I built it
Using Spark AR with patches.
Challenges I ran into
Deciding what to do (ideas), being new to Spark AR since I only use it as a hobby, and time constraints.
Accomplishments that I'm proud of
Being able to finish my first project related to a world effect.
What I learned
There are a lot of resources and tutorials provided by Spark AR creators in the Facebook group and on YouTube.
What's next for Sphere Virtual Tour
Adding audio for each virtual place.
Built With
ar
particle
Try it out
github.com
www.instagram.com |
10,008 | https://devpost.com/software/namaste-1ob5hl | Namaste
Namaste frame for adding a photo or video on a surface, with a "tap to place" instruction.
Namaste frame for adding a photo or video on a table as an object, with a "tap to place" instruction.
I added marigold flowers as particles in the Namaste concept.
Inspiration
NAMASTE
INSTAGRAM AR WORLD EFFECT : NAMASTE
INSPIRATION OF NAMASTE
1) The inspiration for NAMASTE came from my country, India.
2) NAMASTE is about "the light within me honours the light within you".
3) My country, India, has endured with Namaste because people always deal with circumstances with positivity, care, honour, and a smile.
4) When we do Namaste, the first thing that comes up is the positive intention that "I bow to you", and we win people's hearts by realising that they are our own people.
IDEA OF NAMASTE
- The main idea behind NAMASTE comes from seeing the current situation of the global pandemic year of 2020.
- I want to apply the idea of Namaste by showing that we care about you, winning people's hearts, and putting a smile on their faces.
IMPACT OF THE GLOBAL PANDEMIC ON SOCIETY AND COMMUNITY
Negative impacts have occurred in society, communities, organisations, and the world, as follows:
1) A rise in suicide cases
2) Negative financial impact
3) Depression
4) Violence and protests
5) War
6) A rising death rate due to panic about the future: what will happen next, and how will we survive?
BENEFITS OF USING NAMASTE
1) Creates positivity
2) Strong relationships with your friends, family, organisation, society, and community
3) Unity
4) Honour
5) A sense of connection with unknown people
6) Caring, love, laughter, and feeling alive
One thing I can guarantee about this Namaste filter is that it can have a good impact on society and the community by spreading the happiness, peace, and positivity that are now missing due to the pandemic of 2020.
What it does
**Working of the Namaste Instagram filter as an AR World Effect**
Presentation of the Namaste Instagram filter:
1) I used the concept of an Indian celebration where people dance to show their happiness and gestures. I added a gallery feature so that users can get involved in the celebration.
2) I want the user to add a photo or video as a sweet moment and share their Namaste filter with their near and dear ones; it puts a smile on faces and creates a strong connection.
3) I added marigold flowers to give a sense of presence in the moment. It builds a good connection to relationships, society, and the community.
4) It can have a good impact on society and the community, because everyone needs honour, recognition, love, and a positive life.
How I built it
I built the Namaste Instagram AR World Effect filter.
Software:
1) Spark AR
2) Adobe Photoshop
3) Blender
Construction of the NAMASTE filter:
1) I used nine animation sequence series for the dancing, frame, and designs. I used Adobe Photoshop for creating and manipulating textures.
2) I used the plane tracker template, which has built-in instructions and patches: switch to the back camera, point at a surface, tap to place, and you can zoom, pan, and resize the image.
3) I used the switch on/off patches a lot for the design and the marigold flower emitter.
4) I used a 3D cylinder object made in Blender as a looping 360-degree rotation animation.
Challenges I ran into
1) Reducing the size of the textures: the file size of an Instagram AR filter should stay under 4 MB.
2) I had difficulties with the 360-degree null object on the surface, which was quite challenging. The cylinder 3D model partly worked, but the 360-degree plane I wanted to use for the dance performance with an animation sequence did not; I was unsuccessful in that area.
3) I faced a challenge completing the marigold flower emitter in the Namaste filter, and I am not satisfied with it, because I only started late.
Accomplishments that I'm proud of
I am proud of making the stage of the NAMASTE filter where you can add a photo or video. I also like showing the dance, which I am proud of, and I want to do more. I got addicted to creating, which I am also proud of. Half of my vision for Namaste got completed, which I am proud of; the other half I will surely finish when I republish the Namaste filter.
I am proud that I am enjoying Spark AR.
I would love to do more.
What I learned
While making the Namaste Instagram filter, I learned to follow my imagination and try to create whatever I can from what I know.
I didn't know 3D model objects before. During the creation of the Namaste Instagram filter, I learned how interesting it is to make and manipulate 3D models using Blender. I would like to thank the Spark AR community for sharing their knowledge of 3D models; for me it was difficult.
What's next for NAMASTE
- I am planning to enhance the Namaste Instagram AR filter by removing the current marigold flower emitter and using a different technique for it.
- I will make dance performances using a 360-degree rotation of the moment in the Namaste Instagram filter, and I will update this filter in 2020.
- I will add Namaste yoga video classes, so that people can take care of their health, using native UI.
Built With
adobe
blender
particle
photoshop
Try it out
www.instagram.com
github.com |
10,008 | https://devpost.com/software/red_ant | Inspiration
I just wanted to participate in this hackathon and gain some experience.
What it does
It shows a black ant across the whole view.
How I built it
Spark AR Studio and Facebook.
Challenges I ran into
At first I didn't get how to use Spark AR.
Accomplishments that I'm proud of
Finally, I got to learn how to use the Spark AR software, and it was quite interesting.
What I learned
I have learned Spark AR and some basic Photoshop.
What's next for Red_Ant
Nature will be covered up with black ants.
Built With
instagram
photoshop
spark-ar
Try it out
github.com |
10,008 | https://devpost.com/software/traffic-3e2fhb | aeroplane
Inspiration
In this lockdown I wanted to see planes flying, hence I came up with this idea.
What it does
Move it around and make it fly.
How I built it
Using Spark AR.
Challenges I ran into
I took on modelling in Blender as my challenge.
Accomplishments that I'm proud of
I could make it move around.
What I learned
To use Spark AR and build AR effects.
What's next for traffic
Built With
ar
particle
Try it out
www.instagram.com |
10,008 | https://devpost.com/software/the-castle-of-knowledge | Inspiration
We made this game to create an AR world effect that gives users general knowledge by having them answer different questions across different domains.
What it does
The Castle of Knowledge is a simple AR game that revolves around being a player inside a cage above a pit full of arrows and scorpions. There is also a knight who asks you questions; you must answer them correctly to win prizes and finally gain your freedom by getting out of the castle, but if your answer is wrong you will fall down.
How we built it
The game was built in Spark AR Studio, using JavaScript to write the game logic and the Patch Editor to control object animations.
To create and customize the 3D models, we used Blender.
Challenges we ran into
Spark AR does not have the ability to create custom 3D objects, and the scripting API needs more functions (e.g. speech recognition for audio interactions).
We also found that Facebook disabled the Networking module, so we decided to change some of the game logic, because we had planned to fetch questions and answers from our servers.
Accomplishments that we're proud of
Creating the game scripting and concept from scratch, using Spark AR Studio for the first time, is what we are proud of.
What we learned
We learned a lot about creating AR effects in Spark AR Studio.
What's next for The Castle Of Knowledge
Next, we plan to add more complicated questions, build a backend app to store and fetch questions from the internet using the Networking API (currently disabled by Facebook for security reasons), and also update the game's UI.
Built With
blender
javascript
sketchfab
spark-ar
Try it out
github.com
www.facebook.com |
10,008 | https://devpost.com/software/where-to-go-in-paris-spark-ar-world-effect | Cover
Previews
Patch editor
Inspiration
The main inspiration behind this effect was the popular randomizer effects and AR maps that already exist on different platforms.
What it does
I wanted to build a World AR effect in Spark AR Studio using a still very popular Randomizer mechanic. To use this effect you need to switch to your back camera and record a video to reveal your destination.
How I built it
It was pretty easy to build this effect thanks to Spark AR Studio's built-in templates and free assets. As a base I used the World Object template. I changed a couple of things, then added the map overlay that I created in Photoshop. To make my AR map more interesting and fun, I used sound effects and 3D models that you can find in the AR Library in Spark AR Studio. To build the randomizer mechanic I used the Patch Editor, which is really simple; after some time you can build even more complex effects with it.
Challenges I ran into
While creating this effect I found that I have to push the effect to my device each time I need to test the "record to reveal" feature; I wish I could test it inside the Spark AR simulator.
Accomplishments that I'm proud of
This is my first time creating a World AR + randomizer effect. It is also my first time showing my work at a Facebook hackathon :)
What I learned
I learned more about the Patch Editor (the "Round" patch is underrated).
What's next for Where To Go In Paris – Spark AR World Effect
To learn more about it:
GitHub repository: https://github.com/maximkuzlin/fbhack_wheretogoinparis
Try it on IG: https://www.instagram.com/ar/315215792846855/
In the future I want to extend this idea, add more places, and reduce the size of the effect.
Built With
blender
photoshop
sketchup
sparkar
Try it out
www.instagram.com
github.com |
10,008 | https://devpost.com/software/simplesparkar | Inspiration
Always wanted to do mobile applications on AR. Tried Unity and Babylon. Intrigued by Spark AR.
What it does
A very simple application that just tries to demonstrate Spark AR.
How I built it
Using Spark AR PlaneTracker and Animations. Unfortunately, many modules like LiveStreamingModule, HandTrackingModule are not supported on Instagram.
Challenges I ran into
Unfortunately, many modules like LiveStreamingModule and HandTrackingModule are not supported on Instagram, so I was constrained to using a very limited set of modules.
Accomplishments that I'm proud of
Augmented Reality and my first instagram filter.
What I learned
Spark AR
What's next for SimpleSparkAR
Waiting on support for live streaming; trying to integrate AI with live streaming.
Built With
sparkar
Try it out
www.instagram.com |
10,008 | https://devpost.com/software/black-mirror-uc97xf | Online memories
Inspiration
Hello, my name is Hila Peled and I'm from Israel.
As a girl, I always heard beautiful stories from my mom about my great-grandmother and grandmother, and in the end she would always say that towards the end of their lives they did not recognize her because of Alzheimer's disease. I saw on my mother's face that it was sad and traumatic for her, and for my grandmothers too.
I recently entered the AR world as a hobby and slowly fell in love with it :)
When I saw that the hackathon offered solutions and support to communities, I thought it could be very nice if we could bring in the AR world and maybe help even one person.
What it does
The filter recognizes the user's face and opens up virtual memories that the user has selected (pictures and videos).
How I built it
I built the project with Spark AR software.
Challenges I ran into
I tried to make it possible to link several different pictures, but I didn't succeed.
Accomplishments that I'm proud of
I am proud that I managed to submit my own idea
What I learned
I started learning object-oriented code writing, though I could not include it in the submission.
What's next for Online memories
Next for Online Memories is the option to save your time tunnel via facial recognition, so that all the pictures will appear on the display.
Built With
ar
c++
javascript
particle
Try it out
www.instagram.com |
10,008 | https://devpost.com/software/distancar | demo
Inspiration
When going out, we noticed that in many places social distancing was not maintained. These are really tough times; if we don't follow basic guidelines we will be in grave danger. We thought: why can't AR help with this problem?
What it does
It creates a 3D sphere at the bottom of the screen, so when the phone is held up with the back camera on, you know what distance is safe and what is not.
How I built it
I created the 3D sphere using Blender, then added everything to Spark AR to create this back-camera effect.
Challenges I ran into
Getting the Spark AR camera's x and z positions.
Accomplishments that I'm proud of
My first Spark AR hackathon submission.
What I learned
SparkAR, Blender
What's next for DistancAR
Thinking of adding more utility for the post corona world.
Built With
blender
sparkar |
10,008 | https://devpost.com/software/football-kick | GIF
Inspiration
Well, I am a tech and football lover, so I try to learn both and be good at both, and now I have tried to combine them. I created this so that I could help young people with fewer resources learn how to play properly. It starts with kicking, and later I will add further effects to help kids learn. When I played and tried to kick like Messi or Ronaldo, it was very difficult. I wasn't going to any academy, so it was hard to know what I did wrong even after watching their videos. I am trying to solve this problem.
What it does
I have built something very simple with minimal coding. It is very simple to use because it is aimed at sporty people who may not like using difficult technology. Here we allow players to play with Ronaldo: we replace the whole background, and within it Ronaldo kicks his knuckleball, so kids can kick along with him, see what the difference is between the two kicks, and correct it quickly. They are placed in a professional surrounding to give them the feeling of playing at a higher level. This is a very good form of tutorial; I feel that if this had been for Oculus VR we could have done much more and it could benefit kids even further, but it is good for kids with minimal resources.
How I built it
I didn't have much experience with Spark AR, so I used YouTube for many things. I built it by finding the best kick, taking the frames from it, and using them to animate the background of the video. Spark AR works great for the effect: it knows which parts of the foreground to show and which not to, which is really great.
Challenges I ran into
I didn't really run into many challenges with this effect.
Accomplishments that I'm proud of
I created my own effect for Instagram and Facebook. This is really awesome, and I can help kids learn how to play football; later, some of them may become great footballers.
What I learned
I learnt to use Spark AR and now I am exploring it to do more things.
What's next for Football Kick
Next up are other players and other kicks, and then other features to help kids play better football and become star footballers.
Built With
segmentation
Try it out
github.com |
10,008 | https://devpost.com/software/truth-or-dare-wz7ske | Coin Spin
Jenga 1
Dare 7
Dare 6
Dare 2
Dare 1
Dare 3
Dare 5
Jenga 2
Dare 4
Spin Wheel
Truth
Inspiration
We intended to connect people during this difficult time more than ever, and what better game to connect people with than a fun truth or dare game? We acknowledge there are quite a number of truth or dare games out there, but we have our own unique touch: we bring the familiarity of offline games such as Jenga, a spin wheel, and a coin toss to an online experience.
What it does
Here is the basic flow of the effect:
User finds a plane using the rear camera.
User chooses which type of object to place.
User taps screen to place the object.
User taps on the object to get a truth or dare card.
User is asked to find a face for the task.
User can see task and record and send it to friends or save it.
User can long press the screen to reset the effect and pick a new truth or dare.
We have 10 custom dares and 20 custom truths.
Each dare is a filter in a filter. We intended to make performing dares as fun and simple as possible. Here are the dares:
Sing a song. (Mic shows up)
Do a fake cry. (Fake Tears)
Pretend to be sick. (Sick Mask)
Do your best impression of a baby. (Pacifier)
Describe your crush. (Blush)
Record a Hi with a person beside you. (BFF Glow effect)
Speak about yourself in Third Person. (Mic shows up)
Talk without your lips making contact. (Shows when user touches lips)
Laugh out loud continuously for 30 seconds. (Spray of laugh emoji)
Break an egg on your head. (Tap the screen to break an egg on the head)
For the truths we used the audio analyzer, which serves as a polygraph that changes as the user speaks. This makes answering truths even more fun.
Here are the Truths
What do most people think is true about you that is not?
What was the last thing you googled?
If someone gave you a million dollars, what would you do with it?
What is your favorite R rated movie?
What would you do if you switched genders for a day?
Describe your guilty pleasure.
What is the stupidest thing you have ever done?
What would be the theme song of your life?
What have you seen that you wish you could unsee?
Describe your favorite meme by facial expressions.
If you could be a celebrity for a whole day, who would you be?
What is your biggest regret?
What was the most defining moment of your life?
What was the highlight of your day/week?
Describe your most drunk experience.
What is one thing you are glad that your mum doesn’t know?
What book/movie character has influenced you the most?
What was your dream job growing up?
Whose Instagram account would you want to manage for a day and what would you do if you managed it?
What is your happiest memory?
We also introduced a scoring system to encourage people to connect with each other more frequently. The more a user shares the effect, the higher their score. We have three score categories: Gold, Silver, and Bronze. Share more than 5 times for Silver, and more than 15 times for Gold.
How we built it
We used Spark AR to come up with this effect.
We used both the Patch Editor and scripting. The patch graph uses SDFs, audio patches, animations, instructions, and much more; there are about 100+ patches in total.
We used scripting mainly to design the logic of the randomiser. We designed it so that the user will not get the same truth or dare twice in the same session of usage; a rough sketch of that idea follows below.
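A minimal sketch of one way to get that no-repeat behaviour (the card list and patch input name are illustrative assumptions, not the effect's actual data): shuffle the deck once, deal from the top, and only reshuffle once every card has been seen.

```javascript
const Patches = require('Patches');
const TouchGestures = require('TouchGestures');

// Illustrative deck; the real effect has 10 dares and 20 truths.
const truths = [
  'What was the last thing you googled?',
  'What is your biggest regret?',
  'What is your happiest memory?',
];

// Fisher-Yates shuffle so cards come out in a random order without repeats.
function shuffle(cards) {
  const deck = cards.slice();
  for (let i = deck.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [deck[i], deck[j]] = [deck[j], deck[i]];
  }
  return deck;
}

let deck = shuffle(truths);

// Deal the next card on each tap and hand its index to the Patch Editor (assumed input name).
TouchGestures.onTap().subscribe(() => {
  if (deck.length === 0) deck = shuffle(truths); // reshuffle only after every card has been seen
  const card = deck.pop();
  Patches.inputs.setScalar('truthIndex', truths.indexOf(card));
});
```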
To build the 3D models, we used Blender and to build the custom textures we used Photoshop and Adobe XD.
Challenges we ran into
There were a lot of challenges we ran into.
Size of the assets. We designed specific assets which would occupy less amount of space. So designing each asset with that in mind was quite a constraint in our development process.
Number of patches. As we kept implementing filters within a filter (Inception vibes much? :P), our patch graph kept growing to a staggering size. Organizing and managing the patches was a particular challenge, which we solved by commenting appropriately and using a few blocks.
Animating. We wanted the experience to feel familiar to the users. So we had to animate every Jenga block individually to achieve that.
Patch-Script bridging. This always had to be organised with top priority because of our heavy reliance on their links.
Accomplishments that we're proud of
Creating a good performance filter after integrating all this content in it.
Keeping the size of the effect under 3 MB.
Integrating filters upon filters.
Organizing code and patches very well.
Being able to adapt to new technologies and implement them.
Performing seamless animation.
Achieving what vision we set out to do.
What we learned
It isn't easy optimising effects to keep them under 4 MB.
Learned about the audio analyser and patch-script bridging.
Learned how team work can truly create something amazing.
Learned how we can be a positive influence on our society.
Learned 3D Modeling.
And so much more.
What's next for Truth or Dare
This effect has a lot of potential to last a long time without people getting bored of it. We can update it with new dares and truths occasionally, and we can add more advanced features as they become available to keep the effect up to date.
Overall, it was a great experience building this effect. And we are proud of this.
Built With
javascript
particle
patches
Try it out
www.instagram.com |
10,008 | https://devpost.com/software/safety-net-game | Safety Net Game
Hello, my name is Andrew Baker, and I'm excited to share with you my entry, the Safety Net Game. I thought really hard about how I could get across a positive message to my community, especially during such a negative time across the globe. I was inspired by the Safety Net Fund, a local grant in the San Francisco Bay Area that is doing a really great job providing support to artists in the community who have needed assistance during the pandemic.
The concept of the Safety Net Game is to raise awareness and support for small businesses. The mom-and-pops are really important to a thriving community, and many of them, as I'm sure you know, are struggling. That is the basis of my AR game; now let's get into the actual content of the game.
When opened, the viewfinder displays the safety net at the bottom of the screen. Tapping to start the game initiates play as the elements rain down from the sky. The user's score goes up each time they successfully catch a falling small business. But be careful: unfortunately, as we all know too well, COVID is still very much present and caution is necessary, so we must avoid catching it or else it's game over. Tapping lets you replay.
In Spark AR, the score is displayed on a canvas using a script and a counter. The 3D elements are a looping animation with a random function applied to the duration, so their movement speeds up and slows down, making them trickier and less predictable. The catch pulse is triggered by carefully interpreting the device's gyroscope motion and sending the pulse through a series of and/or patches when the phone is pointed at one of the 3D objects. Finally, the ending card is a small texture-sequence animation.
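As a rough sketch of the score-counter part (the patch names are assumptions, not taken from the actual project): a script can listen for a "caught" pulse coming from the patch graph and push the updated score back for display.

```javascript
const Patches = require('Patches');

let score = 0;

// 'caught' is an assumed pulse fired by the patch graph whenever a business lands in the net;
// 'score' is an assumed scalar consumed by the canvas element that displays the count.
Patches.outputs.getPulse('caught').then(caught => {
  caught.subscribe(() => {
    score += 1;
    Patches.inputs.setScalar('score', score);
  });
});
```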
Thank you very much for your time and consideration, I hope you enjoy it!
Inspiration
The Safety Net Fund
What it does
A catch falling objects game
How I built it
Spark AR Studio
Challenges I ran into
Scripting and my first game
Accomplishments that I'm proud of
I come from VFX and a pretty visual background and the structure of Spark AR is very similar in most terms to 3d software, but I don't know any coding. I saved the Devpost page the day I saw it, knowing that I wanted to participate in what would be a big step for me and I am proud that I was able to follow through and achieve what I set out. It has been a life goal of mine to make a game and I honestly never expected it to happen here in 2020 during a pandemic, but here we are - and I think it's been a great opportunity for me to grow.
What I learned
Scripting for a score count and location targeting for when objects are caught.
What's next for Safety Net Game
Spreading awareness for all the small businesses out there!
Andrew Baker
owner ID: 1090467035
effect ID: 3286420908088842
Built With
sparkar
Try it out
www.instagram.com
github.com |
10,008 | https://devpost.com/software/flowerz | Inspiration
Darkness is bad
What it does
It changes darkness.
How I built it
Spark ar
Challenges I ran into
The original idea was to have the flowers rotating.
Accomplishments that I'm proud of
It works as expected
What I learned
A 3D object can't be used as a source view.
What's next for Flowerz
I'll make a version where the user can use an image from their gallery.
Built With
sparkar
Try it out
www.instagram.com |
10,008 | https://devpost.com/software/sunny-beach | legs
Sand and sun
Book, glasses, water
Due to everyday life, domestic fuss, bad news around the world, and no opportunity to travel to another country, a person wants to escape, get distracted, and forget for a while.
That is why I created my own little world, with a clear sea, a deserted beach, the sun, a favorite book, and the sound of waves.
I made a 3D model of a girl, an umbrella, little table, books, ball.
Took a sphere from the library Spark ar.
Used the picture for the environment.
Set the desired lighting.
Added the sound of waves.
Set up the location of objects. Done!
I ran into a limitation when trying to add a video environment, because the size of the effect is limited to 4 MB and the project to 40 MB.
That is, you can have either a 3D model or a video effect.
The essence of this filter is to relax and enjoy the surroundings and the sound of the waves. No politics, problems in the world and bad news. Enjoy!
P.s.: Thank you for your attention and forgive me for bad English :)
Built With
blender
particle
photoshop
Try it out
www.instagram.com
www.instagram.com |
10,008 | https://devpost.com/software/vrsnow | Snow Flake
Inspiration
I live in an environment where you can only dream about snow, so I wanted to at least have a video memory of my surroundings with snow. This led me to create VRSnow.
What it does
It creates a virtual snowy environment, with realistic snowflakes falling in front of you in all directions.
How I built it
I used Spark AR, the coolest software for creating these world effects.
Challenges I ran into
I was a total newbie; I only learned this during the hackathon, so I pretty much didn't know how to use the Spark AR interface. I learned it from the docs and YouTube, and after the struggles everything was a cakewalk.
Accomplishments that I'm proud of
I am very proud to have created this filter as a new user of Spark AR.
What I learned
I learnt to use overlays, spin transition etc.
What's next for VRSnow
I want to add some other animations for more realistic experience.
Built With
sparkar
Try it out
www.instagram.com |
10,008 | https://devpost.com/software/randome | Find the Random Me to see whats inside Randome.
Inspiration
I am creative (idea-wise) and, like many other people, a big fan of randomness, so I thought of going along the lines of randomness and giving some importance to _enrichment_ in this project as well.
I just thought of diving into AR, and I realized the following:
**Spark AR + Social Media + Creativity = Great results**
Trust me, it's been just 3 days and I have learned a lot, have plenty of ideas, and have lots to present.
What it does
Randome
- _It's a world AR effect: you have a random mystery box with some good, enriching, or realistic phrases._
You start with the front camera and get a request to switch to the back camera.
When you start the back camera, _there are a variety of animated and static objects and a particle system_. The user has to search for the animated mystery box and get the reward. I have included a little audio for some interactions. The reward (text) appears randomly; there are 20 such text objects to improve the randomness.
When you find the box and tap on it, it breaks open and the text appears.
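As one illustration of how that random reward could be picked, here is a small script sketch. The author built this with the patch editor, so treat the object names ('mysteryBox', 'reward0'..'reward19') as assumptions rather than the project's actual setup:

```javascript
// Sketch of a random-reward pick in script form (requires the Touch Gestures capability).
const Scene = require('Scene');
const TouchGestures = require('TouchGestures');

const REWARD_COUNT = 20;

Promise.all([
  Scene.root.findFirst('mysteryBox'),
  Promise.all(
    Array.from({length: REWARD_COUNT}, (_, i) => Scene.root.findFirst('reward' + i))
  ),
]).then(([box, rewards]) => {
  // Hide all rewards until the box is found and tapped.
  rewards.forEach((r) => { r.hidden = true; });

  TouchGestures.onTap(box).subscribe(() => {
    box.hidden = true;                                  // "break open" the box
    const pick = Math.floor(Math.random() * REWARD_COUNT);
    rewards[pick].hidden = false;                       // reveal one random phrase
  });
});
```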
I have included
Trees
which people might wonder about: why? Just to give them the feel of the environment, and to remind users that there is something known as trees and that we should pay attention to our environment as well.
How I built it
One
day went into reading about Spark AR, its materials, and its tutorials.
The next
two days
I worked on my idea and implemented it, although not that fully.
Some assets I downloaded from the Unity Asset Store, and most of them are from the AR Library.
I built the project step by step: first the starting scene, then the searching scene, then the breaking scene.
I first built a basic structure, then went on to animations and materials.
I used the patch editor entirely for the coding part.
Challenges I ran into
_As mentioned, I have only been into this for the past 3 days and had no time to practice a lot,_
so I jumped directly into the project and started working and learning side by side.
I still don't know the entire software and all its utilities.
I felt there was very little content available for learning.
Accomplishments that I'm proud of
**I had a great idea**, which I presented in a somewhat OK manner.
What I learned
I learned some AR, how to think about projects, what approach to take, and how to proceed with building up an idea.
What's next for Randome
I have
HUGE HUGE PLANS
I want to take randomness to the next level.
Even randomize the scene around people.
I want the texts to be unlimited, or let users submit their own inputs that are stored somewhere so the project automatically picks one and shows it. Some classier animation. Increase the difficulty of finding the box while making people enjoy the effort, and make many more projects along this line.
Built With
andmyownidea
english
unity
Try it out
github.com |
10,008 | https://devpost.com/software/interdimension | Basic appearance of the effect
Inspiration
I wanted to create the kind of things that everyone commonly uses but not everyone commonly creates. I could easily see, just after my first filter was approved, that a lot of my friends were very much amazed by it. That led me to create more and more filters.
What it does
Interdimension is a visual effects filter made for instagram using Spark AR. It has effects such as sparkles, bokeh and red&blue shades altogether that makes the effect look very cool.
What's next for Interdimension
I can make the effect customizable and interactive as the next step.
Built With
augmented-reality
sparkar
Try it out
www.instagram.com |
10,008 | https://devpost.com/software/boombox-teymf6 | Inspiration: I wanted to create a new effect which is kind of upbeat and fun
What it does: A world effect which the user can control freely using gestures
How I built it: Using Spark AR and JavaScript
Challenges I ran into: Was new to Spark AR platform and it took time to understand the platform
Accomplishments that I'm proud of: Created a working prototype of the world effect
What I learned: Using Spark AR and coding in JavaScript
What's next for BoomBox: Customizing it and adding new cooler features on top of it
Built With
javascript
particle
Try it out
github.com |
10,008 | https://devpost.com/software/guess-the-answer | The history behind the development of this project is that we got inspired from a co-creator in Instagram who developed
'Guess the gibberish' filter. This gave us the motivation to build a filter that is much more interactive with the user. The project that we developed is an Instagram filter which asks a question and displays the answer after a particular time.
This increases the interaction between the user and the app environment . We used Facebook's SparkAR software to build our project which made our task a bit easier. While coming to the challenges part that we faced at the beginning stage of our project was confusing on the usage of the software, but later after studying the documentation in the SparkAR website it was quite easy and understandable on the functions and its usage. This particular project of ours took a massive amount of time as investment and we did a lot of trials to make it work as we expected to do so. Finally this project was submitted and accepted on Instagram after all the necessary testings are done. Our filter also has reached almost 100K+ (impressions and opens) views on Instagram. During this journey of developing this project we have learnt to be patient and consistent in what we want to achieve . Currently we are also focusing on our next venture on Augmented Reality and will soon be published in Instagram.
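As a hedged illustration of the core "reveal the answer after a delay" mechanic, a script version might look like the sketch below; the object names and the delay are assumptions, not necessarily the team's implementation:

```javascript
// Minimal sketch of a timed question-and-answer reveal in Spark AR.
// Assumes two 2D Text objects named 'questionText' and 'answerText'.
const Scene = require('Scene');
const Time = require('Time');

const REVEAL_DELAY_MS = 5000; // hypothetical 5-second delay before the answer shows

Promise.all([
  Scene.root.findFirst('questionText'),
  Scene.root.findFirst('answerText'),
]).then(([question, answer]) => {
  answer.hidden = true;              // keep the answer hidden at first
  Time.setTimeout(() => {
    answer.hidden = false;           // reveal the answer once the time is up
  }, REVEAL_DELAY_MS);
});
```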
Built With
javascript
patcheditor
sparkar
Try it out
www.instagram.com
www.instagram.com
github.com |
10,008 | https://devpost.com/software/holocard | HoloCard Template
Demo Screenshot
Inspiration
The inspiration for creating this effect came from my daily schedule and my interest in augmented reality as a technology, and how it is changing the way we perceive and present things.
What it does
This effect presents the idea of easily getting information about a credit/debit card (here, the HoloCard) without going into any app.
How I built it
I built it using the "Fixed Target Tracker" object in Spark AR and photoshop.
Challenges I ran into
The major challenge I ran into was with the "Fixed Target Tracker": it was not able to detect many of the designs I made, so I had to go through multiple designs to reach the final HoloCard design that the tracker could detect properly.
Accomplishments that I'm proud of
Developing AR effects that can be showcased as MVP in very minimal time.
What I learned
Learned a new way to develop AR that takes much less time and far less coding than the other approaches I had been using until now.
What's next for HoloCard
The next for HoloCard is to provide new features as:
Customizability by the user
Give the ability to target multiple target images and not just one
Connecting with API to get more real-time data
Built With
sparkar
Try it out
www.instagram.com
github.com
www.dropbox.com |
10,008 | https://devpost.com/software/lockdown-covid19 | Inspiration
The pandemic induced me to make a cool game during the lockdown.
What it does
In this game-based effect, you have to dodge all the coronaviruses you encounter and make your own way out of the havoc caused by the pandemic. The effect is based on a face tracker: you tilt your head from side to side to control the movement of the bike. Also make sure you don't stay off the road for 3 seconds or more. Everybody is bored during the lockdown, so this cool filter is one way you could engage in some fun activity and battle the viruses. ENJOY!!!
How I built it
Gameplay Patch:
The gameplay patch is the main patch for simulating the gameplay. It contains the road animation, moving objects, and character animation patches. The road animation is used to simulate a vehicle running on a road. It contains about 45 frames of road images, which were extracted from a GIF found online and are played back using the Loop Animation and Transition patches connected to the current frame of the road. The start and end values run from 0 to 45.
The character animation patch controls the movements of the character as he makes his way along the road. Here I have used the Face Finder, Face Select, and Face Tracker patches in my graph to capture the user's face and control the character's position as the user tilts his/her head from side to side. I used a -1 multiplier to move the character laterally in the direction of the user's head during the tilt. The patches used here are the character position, 3D head (face mesh) rotation, and character 3D rotation, all of which are synchronised with the face tracker.
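A script sketch of the same head-tilt steering idea is shown below; the author implemented it with patches, so the object name 'character' and the scaling factor are assumptions:

```javascript
// Sketch of head-tilt steering: the character's x position follows the
// (inverted, scaled) roll of the tracked face. Requires face tracking.
const Scene = require('Scene');
const FaceTracking = require('FaceTracking');

const STEER_SCALE = 0.2; // hypothetical lateral movement per radian of tilt

Scene.root.findFirst('character').then((character) => {
  const tilt = FaceTracking.face(0).cameraTransform.rotationZ; // side-to-side head roll
  // Bind the character's x position to the mirrored, scaled tilt signal.
  character.transform.x = tilt.mul(-STEER_SCALE);
});
```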
Besides the user's movement, the character animation also enforces that the user must not stray beyond the width of the road for more than three seconds; violating this kills the character and shows game over.
The moving objects patch controls the on/off, duration, and reset parameters of the various colliding objects in the interaction. The interacting objects include coronaviruses, trees, the street, and the cityscape at night; some occur on the ground and some in the air.
Challenges I ran into
I had to overcome various challenges like placing and moving the character in 3D space, simulating the road animation, increasing the speed of the gameplay, making the colliding objects appear at random intervals of time and, of course, making the animations and assets, reactive scripting, and optimising the effect to be compatible with the Instagram platform.
Accomplishments that I'm proud of
I am proud that I was able to set the right targets and achieve them. I got the opportunity to learn to make amazing effects using Spark AR and understand how it can be used to promote a message and engage audiences. I improved my understanding of 3D geometry and coordinate space.
What I learned
I learned various features and functionalities of SparkAR and its scripting using reactive programming and also 3D model creations using blender and animations with sparkAR.
What's next for Lockdown-Covid19
I would like to add a blinking feature to make the bike jump over obstacles, increase the frequency of the obstacles when the user opens his/her mouth to indicate danger, and add more music effects and better assets to improve the reach.
Built With
graph
javascript
patch
sparkar
Try it out
www.instagram.com
github.com |
10,008 | https://devpost.com/software/donation-visualizer | The project logo
A donation of one dollar buys one N95 mask
10 dollars buys 14 N95 masks
A close up of the masks
Inspiration
During this pandemic, we see all the wonderful opportunities to donate to a COVID-19 relief fund. But what are these donations being used for? The most common answer is PPE - Personal Protective Equipment, namely, N95 masks.
What it does
Donation Visualizer is an Augmented Reality Instagram effect that showcases how many N95 masks a specific donation amount ($1, $10, $25, $50) can buy. It shows the life-sized masks piling up virtually in real space!
How we built it
The first step was to calculate the cost of an N95 mask, which varies by region, and the pandemic causes prices to fluctuate based on supply and demand. We assume that charities will buy in bulk to save money, so we found the cost of a large pack of N95 masks, and divided that by the quantity of masks in the pack. This gave us an average cost of $0.71 per mask.
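For reference, the arithmetic behind those numbers can be sketched in a few lines of plain JavaScript; the pack price and quantity below are illustrative values chosen to reproduce the roughly $0.71 average, not necessarily the actual pack used:

```javascript
// Illustrative per-mask cost arithmetic (assumed pack values).
const PACK_PRICE = 142.0;   // hypothetical price of a bulk pack, in dollars
const PACK_QUANTITY = 200;  // hypothetical number of masks in that pack

const costPerMask = PACK_PRICE / PACK_QUANTITY;   // = $0.71 per mask

// Masks a given donation can buy, for the amounts offered in the effect.
[1, 10, 25, 50].forEach((donation) => {
  const masks = Math.floor(donation / costPerMask);
  console.log('$' + donation + ' buys about ' + masks + ' masks');
});
```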
Then, we 3D modelled the masks in Blender and animated them using a rigid body simulation so the masks fall as they would in real life. We baked the animations and exported them as .fbx files.
Lastly, we imported the .fbx files into SparkAR and used the Native UI slider to toggle the animation, visibility, and added 2D stickers showcasing the dollar amounts in the top right corner.
Challenges we ran into
The textures on the mask in blender were not importing properly into SparkAR, so we had to make the materials in SparkAR. In addition, the slider was a learning curve and we are happy to have gotten it to work.
Accomplishments that we're proud of
We are happy with creating an augmented reality effect that can make a positive impact on the world.
What we learned
We learned about User Experience Design through our usage of the Native UI Slider, and the 2D stickers!
What's next for Donation Visualizer
Next, we would like to include other forms of PPE, such as face shields and gloves.
Built With
blender
sparkar
Try it out
www.instagram.com |
10,008 | https://devpost.com/software/ghost-effect | Ghost Effect Cover
Inspiration
Horror movies excite many of us, right ? So, here I am with my Ghost Effect filter. I got this idea of making a ghost filter while watching a
horror show
What it does
This effect uses
face tracker
to place the ghost mask on the user's face. Meanwhile, a scary background track plays continuously. The effect instructs the user to open their mouth; while the mouth is open, a horror sound effect plays. As soon as the mouth closes, the horror sound stops, but the background sound keeps playing.
How I built it
I built this effect using
Spark AR
. I used a face tracker to put the ghost mask onto the user's face, and added two background sound effects: one plays continuously while the other plays only when the user opens his/her mouth.
This makes it feel scarier!
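A minimal sketch of this mouth-open trigger in script form is below; the playback-controller name and threshold are assumptions, and the author may have wired it differently (for example, entirely in patches):

```javascript
// Sketch: play a looping horror sound only while the mouth is open.
// Assumes an audio playback controller named 'horrorSound' exists in the project.
const FaceTracking = require('FaceTracking');
const Audio = require('Audio');

const OPEN_THRESHOLD = 0.3; // hypothetical openness value that counts as "mouth open"

Audio.getAudioPlaybackController('horrorSound').then((horror) => {
  horror.setLooping(true);
  const mouthOpen = FaceTracking.face(0).mouth.openness.gt(OPEN_THRESHOLD);
  mouthOpen.monitor().subscribe((event) => {
    // Play the horror sound only while the mouth stays open.
    horror.setPlaying(event.newValue);
  });
});
```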
Challenges I ran into
The only challenge I faced was adding a sound effect triggered by the user's mouth gesture. Well,
thanks
to
Spark AR Documentation
which helped me a lot in overcoming this challenge.
Accomplishments that I'm proud of
I'm proud that I created my first filter for Instagram using Spark AR. And the best part of it is that it's approved by Spark AR Hub and is now live on Instagram. So, why wait, go and try this Ghost effect
here
What I learned
I learnt how to create
cool
filters for Facebook and Instagram using Spark AR. And this
excites
me to dive in further into the world of AR.
What's next for Ghost Effect
I'm planning to make this effect more interactive by adding some
flying bats and scary trees
in the background.
Built With
sparkar
Try it out
www.instagram.com |
10,008 | https://devpost.com/software/test-x8g46r | Inspiration
I created this AR resource inspired by the pursuit to create an equal society, inspired by black lives (which matter). Being anti-racist results from a conscious decision to make frequent, consistent, equitable choices daily. These choices require ongoing self-awareness, self-reflection, and education. In my own learning journey, I created an Anti-Racism learning experience to support the education of others as well.
What it does
In 4 parts, the experience is designed to explain what racism is, acknowledge how racism appears in modern life, define the anti-racism learning journey, and show additional resources for becoming anti-racist.
How I built it
I created 3D models of anti-racism learning visuals that can be experienced in space and imported them in SparkAR. Starting with the World Space template, I used the patch editor and scripts to create more Picker options and customize objects. I used a glass patch by
Mate Steinforth
. I used Blender, Adobe Photoshop, and 3D Paint to make the objects. For the background sound, I downloaded free audio from Youtube's media
library
, and converted it into m4a mono format for Spark AR.
Challenges I ran into / Accomplishments
It was my first time creating 3D models, creating a story in AR, and creating a world space effect. I spent several weeks learning how to make 3D objects, learning Spark AR, and creating/curating the content. I had some trouble with file sizing to meet the 4mb requirements, although I'm glad I was able to make it work! I learned a lot and am thrilled with the result.
What I learned
I learned that being racist or antiracist is not about who you are; it is about what you do. I learned that becoming anti-racist is an ongoing practice and process. I also learned how to bring ideas into reality in augmented reality!
What's next
I would love to be able to link out to the additional anti-racism resources mentioned within the experience! For example, inside the effect, a user would be able to tap on an organization shown and be taken outside the effect to the organization's web page for more information.
Built With
sparkar
Try it out
www.instagram.com |
10,008 | https://devpost.com/software/mydev | Inspiration
Since I was a teen I have always noticed the little things in people's behavior, and I was disappointed by rude manners. So, along with my love for the sci-fi genre and recent events, I wanted to bring a futuristic touch to our daily routine through a filter of a robot, MyDev. The name MyDev stands for "my development", meaning that its main purpose is to urge people to be more kind and accepting of others, to value the simple things in their lives, to take action against injustice, to love themselves, and to be more supportive.
What it does.
Using world tracking on the back camera, the MyDev robot is displayed with a unique mantra floating above it each time, which can be changed by tapping the robot's face. It also reads the current mantra out loud in a cute robotic voice.
It includes a LUT filter to bring a more sci-fi look to life, as well as an audio-analyzer-driven RGB shift shader that suits someone who wants to discuss recent issues.
How I built it
It was built with SparkAR and Blender.
Challenges I ran into
The biggest challenge was making the robot and the mantras look like a single realistic object. The
patch editor gave the floating 3D mantra the touch I wanted it to have.
Accomplishments that I'm proud of
I'm really satisfied that I accomplished to make a meaningful filter for a good cause. I hope it really makes people feel more optimistic and to remind them that they are loved and cared for.
What I learned
I realized that you can use such an interactive tool to bring awareness about critical issues. Since it was my first time creating such a filter I learned how to use world-tracking, how to reduce 3d objects, and how to connect them as a single looking object.
What's next for MyDev
I hope that some activists will want to use my filter to discuss injustices and to raise awareness and understanding of the big issues that occupy our modern life. I think such an interactive experience would attract more people to join in.
Built With
blender
sparkar
Try it out
www.instagram.com |
10,008 | https://devpost.com/software/virus-mask-instagram-ar-filter | Virus Mask AR Effect
Inspiration
While trying out the Spark AR interface, I decided to create an AR Instagram effect to promote mask wearing to my relatives abroad.
What it does
It's an AR filter advocating the wearing of protective masks during the pandemic.
How I built it
I used 3 software: Blender to edit the 3D models, GIMP to edit the textures & Spark AR Studio to create the AR filter.
Challenges I ran into
This is my first AR effect for Instagram, it was very interesting process. It allowed me to test the Spark AR platform.
Accomplishments that I'm proud of
My AR creation could help some people in the real world to avoid contaminations.
What I learned
The Spark AR interface is very intuitive and easy to use. Only the animation weight caused some difficulties; I solved it by compressing the 3D models and textures before importing them.
What's next for Virus Mask Instagram AR filter
I will update this AR effect when I receive feedback.
Built With
blender
gimp
sparkar
Try it out
www.instagram.com
github.com |
10,008 | https://devpost.com/software/ar-social-distancing | Inspiration
We are required to follow the lockdown guidelines and social distancing norms to stop the spread of the coronavirus. The United Nations made an app called 1point5 to help you achieve a distance of 1.5 metres from your fellow human beings. However, the app requires people around to have Bluetooth turned on to notify you if they enter your vicinity. We want to come up with an idea of developing an AR tool which makes this process easier. Neither does it require you to install any kind of app nor does it need the people around you to have their phone on them.
What it does
It uses Camera to visualise a two-meter radius ring around you to help you maintain social distancing. The app superimposes the 2-metre or a 6.5 feet virtual ring on the viewfinder and moves with the user in the form of a circle. Users can open the Instagram Camera and go to this Filter to launch the AR tool. The interface is simple and is similar to the smartphone game PokemonGo which uses the perimeter augmented reality technology.
How I built it
I built it using Spark AR Studio, which has amazing tools that make it super simple to create a 3D world filter.
Challenges I ran into
The major challenge I ran into was adjusting the scale to match the real-world distance of 2 m. Later, using the official documentation, I was able to achieve it easily.
Accomplishments that I'm proud of
I learnt about various features of Spark AR Studio. Apart from this World Filter I also built a Quiz Game using Face Tracking which is really amazing.
What's next for AR Social Distancing
I wish to add a feature which warns the users whenever an object enters into the 2m circle
Built With
sparkar
Try it out
github.com
www.instagram.com |
10,008 | https://devpost.com/software/ocean-cleaning-force | Inspiration
Everyone loves marine life, but we end up throwing a lot of trash into the ocean, which affects the beauty of nature. We wanted to promote cleaning up the ocean from a first-person view. Since everyone loves FPS games, we wanted to promote cleaning using that technique.
What it does
Our filter is a simple game where the user places the scene on the floor, then walks around underwater and picks up trash by tapping on it. People walk around the water, collect trash, and see the marine life up close.
How we built it
We used 3ds Max for modelling and the Adobe suite for texturing. We kept the number of models low and built a map the user can walk around like an open world.
We made a simple marine world where they can see fish, marine plants, and more as they collect the trash. We used plane tracking to place the world so the user can walk around.
Challenges we ran into
Getting all the elements under the file size limit and reducing their size without affecting their quality.
When making the game mechanics, the rope was difficult at first, and during testing the plane tracking was a bit buggy, as we only had ordinary devices where the tracking was not great.
Accomplishments that we're proud of
We created an open world within the 4 MB size limit and made a first-person trash-collector game at that small size.
What we learned
How to optimise stuff for the low size and high quality
What's next for Ocean Cleaning Force
We can make a task-based first-person world where people compete over who cleans more, and add more life forms so it will be a good experience for those who try the filter.
Built With
3ds-max
blender
sparkar
Try it out
www.instagram.com |
10,008 | https://devpost.com/software/digitruck-more-pixels-than-a-real-truck | PITCH
Configure the world’s most advanced truck with DIGITRUCK - More Pixels than a Real Truck. DIGITRUCK’s mere existence causes the world to DIGITIZE as well. Available in several brilliant colors.
INSPIRATION
In simple terms, I can be described as a car guy; I like cars, automotive design, technology, and generally things that go fast and look good doing so. I am also a bit of a designer and artist myself, I am a Senior 3D Artist for a design and marketing consultancy studio in Portland, Maine. It is my job to design and visualize various computer-generated visual elements for a number of automotive, healthcare/medical, and technology companies around the world (still/motion animations, videos, interactive experiences, custom apps, AR and VR experiences, etc.). I like what I do and it is that which served as my primary inspiration for this project - what would a client think was cool?
Spark AR is a unique piece of software that opens up a wealth of interesting marketing opportunities to various brands with products to display to the world in creative ways. I wanted to take a compelling product (in this case, the thought-provoking Tesla Cybertruck) and integrate it in some way into a fun and approachable/sharable augmented reality experience.
But here's the challenge - there is a limit to what can be effectively done in Spark AR. Under the constraints of this particular hackathon, the effect must be publishable to Instagram, this means a hard project size limit of 4MB. I know 3D data. I have painstakingly crammed a relatively-detailed automotive effect down to Spark AR's Facebook limit of 10MB but getting down to 4MB is likely out of the question for what I had in mind - if I was going photorealistic with my integration of the vehicle. So, why not avoid photorealism entirely. What options might that open up? If for the sake of the data limitations I can not reasonably make my subject photorealistic, why not make reality non-photorealistic?
I do not recall exactly how I got on this train-of-thought, but the Nintendo Game Boy was great. The original Game Boy had a tiny dim 8-bit display; the Game Boy Color after it had something like 15-bit color. I am a child of the 90s, I love Japanese culture and technology, so it was only natural that I gravitated towards Pokemon games and played just about every one until life and responsibilities got in the way of continuing that habit. That being said, the immersion that came with even a tiny dim screen like those Game Boys, playing Pokemon Yellow day in and day out, it was great and had a definite feel to it; I wanted to employ a bit of that for this effect.
Back to the subject for a bit, the product, the truck. My typical product/car configurator is simple but effective. The user gets an interactive view of the product. They can orbit around it, zoom it, look at it. Depending on the product there may be multiple finishes or colors and other bits to swap out. Because the experience is interactive and not just a picture, the user can view this changes immediately and in the context of what they are already viewing. The impending Tesla Cybertruck is slated to come from factory in only one color - bare steel. I am ignoring that but I will get back to that in a minute.
Although unreleased at the moment, the Cybertruck is already an iconic design; it is immediately recognizable even when disguised, distorted, or modified. However, because I want to get away from using exact branding I chose to use a bit of modified data for my DIGITRUCK.
So, to define scope, this experience will:
Feature a pixelated rendition of a vehicle (Tesla Cybertruck modified to DIGITRUCK)
The pixelated vehicle will be augmented into a pixelated view from the user's/device's rear-facing camera (the pixelated truck lives in the user's pixelated reality)
Retro/Game Boy reminiscent graphics
Some basic customization of the vehicle - probably color
Sketchfab to the rescue - I found a usable model of an already low-fi/pixelated Tesla Cybertruck. I came across it when I was looking into what vehicle I wanted to use, and it was part of the reason I chose the Cybertruck in the first place. I am used to creating photoreal/manufacture-ready CAD models from scratch, but something in this style is not my forte; having this Sketchfab model to start off with was a great help. I downloaded the model and imported it into Autodesk Maya for some quick modification and cleanup. The textures that came with it were modified a bit as well for the sake of integrating into real-world color values.
Model done, I headed back into Spark AR to begin prototyping the pixellation effect that would be applied to the real world. I am far from a Spark AR pro - but I have made some things when testing effects on behalf of my studio as well as for my general curiosity. I do have one basic published effect that I created for a local cafe and ice cream shop. I was not aware that Spark AR has a library of sorts that has a lot of add-ins available to just drop into any Spark AR project. The shaders in this library were particularly interesting and gave me just what I needed to Game Boy-ify the world. I am essentially combining the TriTone Shader and the Pixelate Shader to replace the real-world view - making the view from the camera appear pixelated and mimicking low-bit color. From there I imported my final DIGITRUCK model. I am using the basic world-space effect controls to place and manipulate the position, orientation, and size of the digital asset. I am also using a quick system of randomizers attached to another Pixelate Shader to manipulate the DIGITRUCK's texture. The effect that this has helps it integrate a bit better into the pixelated real-world, which inherently has a bit of flicker that the truck's flat texture did not have. Interaction wise, the user can tap the screen to place the DIGITRUCK wherever the plane tracker finds a surface. Two-finger-twist will rotate the truck, and pinching will scale it. Simple. The rest is UI/UX.
I wanted really integrate this digitization thing so I decided to make the UI as reminiscent of an old 8-bit game as I could. Using Spark AR's ability to use custom fonts, I imported a nice digital/pixelated font I had and used it throughout the UI. The next trick was getting the color picker working. Those early Pokemon games I referenced earlier used a cool device for naming the various towns the player would come across; they were named after colors: Cerulean City, Lavender Town, Cinnabar Island, etc. Because of the low color depth achievable on Game Boys at the time, these games utilized color to make these locals impactful and immersive. Cerulean City was largely blue, Lavender Town was lavender-colored, Cinnabar Island took on the reddish-orange hue of a piece of mercury-sulfide (AKA Cinnabarite). I wanted to sample these hues directly from the game and use them to tint my augmented world and define my vehicle's color as well (like it was a car paint color). Thus, Cerulean City becomes Cerulean Tintcoat. I am using the new Picker UI patch to control the color selection. While it is great to have a picker like this built directly into the patch editor, I would love to have the option to increase the number of inputs at my discretion (make it more than 10 if I want). Also, having a picker UI-style patch that accepted other values aside from textures (like text) would have been a great addition as well. For the time being, I am using a system of "Equals Exactly" patches connected to a series of "If Then Else" nodes that checks the Index from the Picker UI to display on the UI the current color selection. It works. The only other thing to add was a simple toggle-able "Specs" screen for a bit of wit and interaction.
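As a small aside, the same picker-index-to-label mapping could also be done in script; the sketch below assumes a "To Script" scalar named 'colorIndex' and a text object named 'paintLabel', which are illustrative names rather than the effect's actual ones:

```javascript
// Script-side sketch of the Picker UI index -> paint-name label mapping.
const Scene = require('Scene');
const Patches = require('Patches');

const PAINT_NAMES = [
  'Cerulean Tintcoat',
  'Lavender Tintcoat',
  'Cinnabar Tintcoat',
]; // assumed subset of the available colors

Promise.all([
  Scene.root.findFirst('paintLabel'),
  Patches.outputs.getScalar('colorIndex'),
]).then(([label, colorIndex]) => {
  colorIndex.monitor({fireOnInitialValue: true}).subscribe((event) => {
    const i = Math.round(event.newValue);
    label.text = PAINT_NAMES[i] || PAINT_NAMES[0]; // fall back to the first color
  });
});
```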
In the end, all of this comes together as a witty marketing piece for a largely fictional product - the DIGITRUCK. It was fun and interesting to work on and I have learned a great deal more about Spark AR and hope to continue using in the future as it continues to grow and develop into a serious content development tool.
Built With
maya
photoshop
sketchfab
sparkar
Try it out
www.instagram.com |
10,008 | https://devpost.com/software/gratitude | Low Light
Inspiration
Gratitude for front-line workers
What it does
It creates a rotating globe, which one can set up with whatever background they want, to express their gratitude to the front-line workers from the comfort of their home.
How I built it
The globe is a 3D model I found on open-source platforms; I added it as a base 3D object in Spark AR and set it to rotate infinitely. Then there is an emitter with a customized material, set to 1000 births at a vertical angle, which makes it feel like something streaming towards the globe.
Challenges I ran into
Light sources, fewer tutorials on Spark AR, time management
Accomplishments that I'm proud of
Very first try on AR and publish an effect on Instagram
What I learned
Spark AR, some basic 3D modeling using Blender, and how to create a good video.
What's next for Gratitude
Adding 3d models of front-line workers
Built With
blender
sparkar
Try it out
github.com
www.instagram.com |
10,008 | https://devpost.com/software/minesweeper-ar-4omxj2 | I'm in love with old games and always wanted to re-make some for Instagram in Spark AR. Though a lot of people are not aware of real retro games, I decided to make something almost everyone knows — the Minesweeper game from Windows.
Concept and gameplay
This is a classic logic game with a strategy that is simple to understand, yet sometimes hard to win. Every opened cell in the field is either empty or numbered. The number shows how many mines are around the cell. A player has to analyze the numbers, determine where the mines are, and flag those cells. Sometimes a player has to open cells at random because it's impossible to determine. The UI allows the player to move, rotate, and scale the field. A single tap opens a cell; a long tap flags it.
Code
The game is built using 99% pure JavaScript and only a small number of patches. During the development of this game, I made a simple animation library for Spark AR and a fast object/material/texture loader to avoid juggling promises and keep the code simpler.
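For anyone curious about the core rule, a plain-JavaScript sketch of the neighbour-count logic (illustrative only, not the effect's actual code) looks like this:

```javascript
// Each opened cell shows how many of its (up to eight) neighbours hold a mine.
function countAdjacentMines(mines, row, col) {
  // `mines` is a 2D boolean array; true means the cell contains a mine.
  let count = 0;
  for (let dr = -1; dr <= 1; dr++) {
    for (let dc = -1; dc <= 1; dc++) {
      if (dr === 0 && dc === 0) continue;            // skip the cell itself
      const r = row + dr;
      const c = col + dc;
      if (r >= 0 && r < mines.length && c >= 0 && c < mines[r].length && mines[r][c]) {
        count++;
      }
    }
  }
  return count;
}

// Example: a tiny 3x3 field with mines in two corners.
const field = [
  [true,  false, false],
  [false, false, false],
  [false, false, true ],
];
// The centre cell touches both mines, so this logs 2.
console.log(countAdjacentMines(field, 1, 1));
```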
What's next?
The next step will be harder. I'm planning to build Minesweeper as a cubical surface and put a player inside it or just leave a cube on a plane tracker.
Built With
javascript
particle
Try it out
www.instagram.com
rossiev.pro |
10,008 | https://devpost.com/software/arabrun | smoke
Inspiration- Multiplayer interactive games
What it does- Lets 2 people compete in a race just by blinking their eyes
How we built it- Using animation sequences and script (see the blink-gesture sketch below)
Accomplishments that we're proud of - Being able to tackle randomization
What we learned- how to develop interactive games
What's next for ArabRun- Make it more interactive and appealing to the users
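A hedged sketch of how blink-driven movement can be scripted in Spark AR is below; the object names and step size are assumptions, not the team's actual implementation:

```javascript
// Sketch: each tracked face's blink pushes its racer forward.
// Assumes two scene objects named 'racer0' and 'racer1'.
const Scene = require('Scene');
const FaceTracking = require('FaceTracking');
const FaceGestures = require('FaceGestures');

const STEP = 0.02; // hypothetical distance gained per blink

['racer0', 'racer1'].forEach((name, i) => {
  Scene.root.findFirst(name).then((racer) => {
    let progress = 0;
    FaceGestures.onBlink(FaceTracking.face(i)).subscribe(() => {
      progress += STEP;                    // each blink pushes the racer forward
      racer.transform.x = progress;
    });
  });
});
```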
Built With
animationsequences
javascript
Try it out
drive.google.com |
10,008 | https://devpost.com/software/exppace-ar | I've always wanted to blend my love for astronomy with technology in some way and surely this is the main inspiration behind this work
ExPace AR lets you explore our solar system up close and from different angles. It also contains a few asteroids.
I built it using Spark AR Studio on Windows. I created 3D models of the planets, sun, rings, and asteroids using Blender, and I also added a few bits of JavaScript to control their rotations.
I'd say the main challenge for me was controlling those rotations; it took me a while to get it done.
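One common way to drive such a rotation from script is sketched below; the object name and spin duration are assumptions rather than the actual project values:

```javascript
// Sketch: spin a planet continuously around its Y axis.
const Scene = require('Scene');
const Animation = require('Animation');

const SPIN_MS = 12000; // hypothetical time for one full revolution

Scene.root.findFirst('earth').then((earth) => {
  const driver = Animation.timeDriver({
    durationMilliseconds: SPIN_MS,
    loopCount: Infinity,
  });
  const fullTurn = Animation.samplers.linear(0, 2 * Math.PI); // radians
  earth.transform.rotationY = Animation.animate(driver, fullTurn);
  driver.start();
});
```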
I finally made it work with each planet having its own specific rotation; that's a huge accomplishment for me. Yay!
It was the first time I used Spark AR Studio, plus I'm also not very good at Blender, so yeah, I learned how you can make amazing stuff with Spark AR. Apart from learning, I also got to know some awesome people from the community and learned a lot from them.
I started this work with the thought of allowing the user to travel to different planets and explore them even closer up, maybe with options to visit different solar systems. But I think those are quite far off for me right now. My next step is to allow users to explore individual planets.
Built With
blender
javascript
sparkar
Try it out
www.github.com |
10,008 | https://devpost.com/software/cat-efects |
Cat Effects
Inspiration: Cats and quarantine
What it does: Puts a lot of cats in the world
How I built it: With Spark AR world effects
Challenges I ran into: Creating something new
Accomplishments that I'm proud of: Drawing the cat
What I learned: World effects; I had never done anything like that before
What's next for Cat Effects: Pet effects on the user's head
Built With
photoshop
sparkar
Try it out
www.instagram.com |
10,008 | https://devpost.com/software/black-lives-matter-visualization | Housing Inequality
Candidate Funding
Injustice
patch editor
Inspiration
We rewatched a TED talk _ Beauty of Data Visualization _ highlighting how turning complex data sets into simple diagrams tease out unseen patterns and connections. Good design, the speaker suggests, is the best way to navigate information glut – and may just change the way we see the world. This
inspired
us to take a stab in showcasing inequality data with immersive visuals that help people perceive how big the discrepancy really is. Since we've been exploring AR recently, this seemed like a good place to start.
What It Does
This AR world effect visually demonstrates how black people face extreme inequality in multiple areas of life. We specifically focused on housing, police violence, and politics.
How I built it
We built this AR effect with the 3D sticker template provided by Spark AR. We also downloaded some free 3d assets from Sketchfab and designed the written pieces using photoshop. The template already provided a helpful JavaScript portion, but we added extra interactivity through the patch editor. We specifically used the animation patch to create transitions that help make the visuals more immersive and communicate the inequality data more vividly.
Challenges I ran into
A
big challenge
was visually depicting a 1000% difference on screen. We purposefully decided to move away from traditional best practices and leave it as is (users can still manipulate size by pinching) so people can truly visualize how ridiculous this difference is.
Accomplishments that I'm proud of
Being able to contribute to a powerful discussion during quarantine. I was very pleased to have the filter featured in some blogs and articles.
What I learned
During our research we learned about some shocking data points. During the process of creating the visuals, we were able to learn many things about creating AR effects in Spark AR Studio.
What's next for Black Lives Matter Visualization
Moving forward, we plan to keep updating the visuals with more data that highlights inequality and moves us to act from changing our behaviors and speaking up to donating and contributing to minority owned businesses.
Built With
animationpatch
blocks
javascript
patch-editor
photoshop
plane-tracker
sketchfab
sparkar
Try it out
www.instagram.com
www.instagram.com |
10,008 | https://devpost.com/software/greenar-one-small-step-to-a-greenar-world | .
Built With
sparkar
Try it out
github.com |
10,008 | https://devpost.com/software/jokerface | my2021vacation
Inspiration- I am a travel lover, and this year was going to be the year for travelling as much as I could.
What it does- Randomly lets you decide where you could travel in 2021
How I built it- using Spark AR
Challenges I ran into- It was a little difficult to randomize the pictures
Accomplishments that I'm proud of- I made it into a filter I wanted it to be.
What I learned- Hardwork pays off.
Built With
javascript
Try it out
www.facebook.com
www.instagram.com |
10,008 | https://devpost.com/software/global-warming-or-climate-change | Filter image
Inspiration
Nowadays global warming is at its peak, yet nobody is looking into it. It is a great honor for me that Facebook gave me a great platform to spread awareness.
What it does
It is a simple awareness filter in which a polar bear says that his house is melting and asks you to please save it.
How I built it
I used a 3D model and Spark AR to build this awareness filter.
Challenges I ran into
It was difficult to build the 3d model.
Accomplishments that I'm proud of
I am proud that this platform gave me a chance to spread social good, and I succeeded in spreading awareness of the UN sustainable development goals.
What I learned
I learned to build different filters and publish them publicly.
What's next for Global Warming or climate change.
I will make more filters on global warming and climate change.
Built With
spark-ar
Try it out
www.facebook.com |
10,008 | https://devpost.com/software/feel-the-nature-effect-id-555160601835287 | Inspiration
During this lockdown nobody can go out and enjoy nature. But with this effect you can share your pictures with a healthy and refreshing forest in the background and make memories.
What it does
Everyone loves pictures in an exotic jungle, with a river flowing in the background, butterflies roaming around, and a cool breeze. This effect gives you exactly that: you feel like you're taking your picture in a relaxing environment.
How I built it
We used Spark AR and published it on the Spark AR Hub. It's an exciting no-code AR building platform, which is perfect for AR beginners.
Challenges I ran into
The size of the project. If we add sounds it becomes heavy, so I removed the sound. But if a waterfall sound were playing in the background, it would make your videos perfect for feeling the forest and its beauty.
Accomplishments that I'm proud of
I accomplished the effect itself and used AR to reduce the lockdown feeling and the stress it is causing. Now people can share their pictures in a refreshing environment even while sitting safely at home.
What I learned
I learned Spark AR and explored its features from the guides and community at facebook.
What's next for Feel the Nature Effect ID=555160601835287
I will make few more exciting and thrilling outdoor effects so that people do not feel stuck in their homes and enjoy nature at home.
Built With
sparkarhub
sparrkar
Try it out
github.com |
10,008 | https://devpost.com/software/healthy-smile-effect-id-234131754390394 | Inspiration
To appreciate the smile and spread it.
What it does
It reads your facial expression and tells you which expression suits you the most. It recognizes happy, sad, surprised, and relaxed gestures and praises the happy one, to make you smile longer and kick the stress out.
How I built it
We used Spark AR and published it on the Spark AR Hub. It is a great platform for low-coders to create augmented reality.
Challenges I ran into
The supported file size is very low. I embedded sound, but due to its large size I had to remove it.
Accomplishments that I'm proud of
Making people smile for longer, which releases their stress and reduces depression and anxiety, especially for introverts who cannot even smile openly. With the help of this filter they too can feel appreciation and love.
What I learned
We have to give everything a try. Spark AR was new to me, but with the tutorials I learned a lot and created this simple smiling effect.
What's next for Healthy Smile Effect ID: 234131754390394
I will try to improve it by introducing more characters instead of a smiley, especially for kids who love a cartoon superhero, so that they can feel special when they get appreciation from their favorite characters.
Built With
sparkar
sparkarhub
Try it out
www.facebook.com |
10,008 | https://devpost.com/software/black-lives-matter | Using the filter on 4 people
Hey! That's me!
Inspiration
Seeing the current state of the world in 2020, it was so saddening to see a major racism crisis arrive in the USA due to the death of George Floyd. Seeing the riots and the protests, I was very moved and wanted to show my support for the
Black Lives Matter
cause. Thus, the will to support the cause and stand against this inhuman act acted as an inspiration for me to create a filter that educates people about the cause and lets everyone participate in the cause without even leaving their homes in this pandemic.
What it does
This is a face filter that displays text like
Black Lives Matter
,
Power to People
, and more on the faces of people. The filter works on up to four people and shows different supportive text for different people. I also added colour-changing hearts which work as a symbol of love and support for the cause and the affected people. There is a ColourLUT filter which boosts the colour of the scene along with a dust filter which makes the scene a pretty sight to behold. The overall idea of this filter is to support the cause along with making the process creative and colourful.
How I built it
I built this filter primarily using SparkAR. The SparkAR platform is very friendly and feature-rich for my needs. I also used Photoshop and Illustrator for my textures and assets for the filter. I would thank various YouTubers who helped me in making this filter as their tutorials were great and also the Facebook Developer Community for their support and guidance.
Challenges I ran into
Well, there were quite a few challenges I ran into. Being a beginner who was still exploring Spark AR, the idea of making this filter was already puzzling me. The biggest challenge for me was animating the text and also setting the appropriate size, as the text disappeared a lot of the time even if I backed off a little. When I got my assets ready, it was confusing for me to assemble everything together. A lot of the time the Patch Editor threw an error and I spent hours finding the culprit. Nonetheless, I enjoyed these challenges and didn't give up. I wanted to make this filter and there was nothing that could stop me.
Accomplishments that I'm proud of
I am really proud that I was able to make such a good filter all by myself! All the hard work and planning I spent was worth it in the end. That sense of accomplishment when I overcame the challenges that stopped me was amazing!
What I learned
I learned a lot of things in this hackathon, specifically:-
How to manage your time in the hackathon
How to come up with ideas that suited your capabilities and to work on those ideas
I learned a lot about SparkAR and its capabilities
My creative skills improved thanks to this hackathon
I learned how to ask for help and search for solutions
What's next for Black Lives Matter
As of right now, I don't have future ideas for this filter, but this serves as a base for creating more filters based on social causes. I would love to cater to the betterment of the world through the medium of AR and give my best shot at the future filters I make.
Built With
adobe-illustrator
photoshop
sparkar
Try it out
www.instagram.com
github.com |
10,008 | https://devpost.com/software/it-s-called-me | icon
Inspiration
I was inspired by the LGBTQ+ flag and by my love for glitter and shiny stuff.
What it does
Basically, it is a shiny-lights effect applied to the screen.
How I built it
I duplicated a plane into 64 pieces in a program called "Blender", then imported it into Spark AR and fitted it to the screen height and width. I added a flat material and used the "Add" blend mode to get the heavenly shiny lights, and after that I used the patch editor to bring in a gradient and color the lights.
Challenges I ran into
Fitting more than one color on the screen and blending them.
Accomplishments that I'm proud of
I'm proud of being able to come close to creating what I had in my mind, and thankful for Spark AR.
What I learned
I have learned more about Blend mode and patches
What's next for ME
Next for ME: I might want to see the effect applied only to the person, not to the full screen.
Built With
sparkar
Try it out
www.instagram.com |
10,008 | https://devpost.com/software/ocean-cleanup | Ocean Cleanup
Inspiration
This filter was made to bring awareness to man's impact on our oceans. Our trash is not only killing our reefs but killing the wildlife that live and feed off it! Change the world by changing mindsets!
What it does
This filter brings the bottom of the Ocean into your space. The filter opens up to a reef that is polluted and it is your job to clean up the mess.
Tap on the garbage to clean it up and watch the coral and environment come to life.
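A minimal script sketch of this tap-to-clean interaction is shown below; the creators built the interactivity with the patch editor, so the trash object names and count are assumptions:

```javascript
// Sketch: tapping a piece of garbage hides it and counts it as cleaned.
// Requires the Touch Gestures capability; assumes objects 'trash0'..'trash4'.
const Scene = require('Scene');
const TouchGestures = require('TouchGestures');

const TRASH_COUNT = 5; // hypothetical number of pieces of garbage in the reef
let cleaned = 0;

Promise.all(
  Array.from({length: TRASH_COUNT}, (_, i) => Scene.root.findFirst('trash' + i))
).then((trashObjects) => {
  trashObjects.forEach((trash) => {
    TouchGestures.onTap(trash).subscribe(() => {
      trash.hidden = true;            // the tapped garbage disappears
      cleaned += 1;
      // When everything is cleaned up, the healthy reef could be revealed here.
    });
  });
});
```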
How I built it
This was built using SparkAR, 3ds Max and the Adobe suite.
Challenges I ran into
The biggest challenge was getting all the 3D objects in to make it feel alive. With only 4 MB to work with, we managed to fit an environment with a whole range of animated characters. We used the patch editor to bring in the interactive side and also to fade objects in and out.
Accomplishments that I'm proud of
I am proud that we managed to make a beautiful effect and be able to make something for a good cause. Hopefully sharing this effect will help change mindsets.
What I learned
I have learned that you can make something interactive and engaging out of serious topics. I have learnt that this medium can be a powerful tool to bring awareness.
What's next for Ocean Cleanup
Hopefully we can get a organisation to allow us to publish it on their page and influence their community. Would be amazing to get people to donate to them through our interactive experience.
Built With
3ds-max
photoshop
sparkar
Try it out
www.instagram.com |
10,008 | https://devpost.com/software/malsori | Inspiration
I wanted to look like my grandfather.
What it does
This is a simple effect that makes you look like a real traditional Albanian highlander man.
How I built it
spark ar
Challenges I ran into
not good in 3d modelling
Accomplishments that I'm proud of
learned a new tool
What I learned
spark is easy to use
What's next for Malsori
making it look better, maybe with better 3d models
Built With
particle
Try it out
www.facebook.com |
10,008 | https://devpost.com/software/halohex-choir | Halohex premise
What is halohex?
Halohex versions
Object sound profile
Create phase and effects list
Mix phase
Screenshots: Indoor
Screenshots: Outdoor
Screenshots: Outdoor
halohex by NEOKLAR
www.neokl.net
| Instruction Manual FREE download
Inspiration
As a move to curb the pandemic, quarantine measures have been applied globally. There is a major shift in working culture around the world, where more companies are now seriously looking at remote working as the way to move forward as a society. People are finding themselves with a lot of spare time at home, which used to be filled with working at the office and commuting. Calm isolation can turn into anxious energy if the mind isn't stimulated, and when boredom sets in, it can take a toll on one's mental and physical health. It is a global issue, and people are searching for more creative outlets to relieve them of stress, which will then directly increase work productivity.
What we do here at NEOKLAR involves design and music. We're a team of designers/musicians that work a lot on projects that combine the two elements together. So it was natural for us to create an audio visual experience that can help stimulate the creativity of people at home. Halohex was the result of this journey.
The story concept of Halohex
These objects appeared in our world out of thin air. Soon every human being on earth found themselves with these objects in their homes. The sounds emitting from these objects... are ethereal yet familiar. We're all trying to find answers to what these things are. Can we make contact?
What does it do?
Halohex is an AR sequencer that mixes abstract art with sound design.
Players use the world facing camera to view and manipulate these ethereal objects into sculptures. These objects emit sounds that can be manipulated through the act of positioning it or scaling its size.
How we built it
Halohex can't be achieved without the input of all our team members. Halohex is built with patch editor with zero javascript scripting/coding.
Arma Badrulsham: Creative direction, character design & modelling (Blender/ Illustrator/ Photoshop/ Premiere)
Ikram Hakim: User interaction & play experience (Touch Designer/ Spark AR)
Dylan Lee: Sound design (Logic-Pro X)
We started with an ideation meeting to discuss what we wanted to achieve for our first halohex sequencer, halohex-CHOIR. Ikram found that Spark AR could enhance our sound design through its sound-effects features. Our priorities were to maximise creative play and user intuitiveness within the scope of Spark AR.
The project started with Arma drafting the story and core concept of halohex-CHOIR. Once a solid backstory was created, he designed the objects and modelled them in Blender. Low-poly shapes were preferred to fit the filter size limitations. Halohex-CHOIR was picked as the name of our first sequencer to reflect its mysterious backstory, curvy fluid shapes and ethereal sound design.
Arma worked in tandem with Dylan, feeding him object images so he could create a sound design that fits the world of halohex-CHOIR. Dylan crafted a warm, ambient sound design that enhances the ethereal mood and story.
When the visual and sound designs were completed, everything was handed over to Ikram, who worked on the user interaction and interactive play of halohex. Along the way, everyone tested the effect and provided input to improve the experience. Lastly, we shared halohex-CHOIR with friends to get feedback before finalizing all its systems and publishing it.
Teaser and trailer videos were created by Arma and shared on our Instagram and other platforms to make our followers aware of halohex-CHOIR and its release date.
Challenges we ran into
There were a few challenges that we ran into while creating halohex.
A bug appeared where the objects were not "stackable", which Ikram fixed with some simple logic solutions.
Some sound-effect changes were not audible, so we had to understand the effects and design sounds that produce a clearly audible change when the effects are applied. Ikram suggested adding a volume modifier driven by the object's scale, which opens up simple mixing possibilities in the sound loop (a short script-style sketch of this idea appears at the end of this list).
Making sense of the orientation of the objects by adding a subtle grid overlay as a main plane indicator.
Adding a UI button to lock/unlock the plane was a game changer. Something that we thought was trivial and an afterthought decision, proved to be the one thing that changed the whole experience. The plane lock system made the game much more manageable and organized whilst still keeping true to the experience of experimental gameplay for the user.
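To make that scale-to-volume idea concrete, here is a minimal script-style sketch of how an object's size could drive a volume value. It is purely illustrative: halohex-CHOIR itself is built only with the Patch Editor, and the object name 'soundOrb' and patch input 'orbVolume' are hypothetical.

```javascript
// Illustrative only - the published effect uses no scripting at all.
const Scene = require('Scene');
const Reactive = require('Reactive');
const Patches = require('Patches');

(async function () {
  const orb = await Scene.root.findFirst('soundOrb'); // hypothetical object name

  // Map the object's scale (roughly 0.5x - 2x) to a 0 - 1 volume value.
  const volume = Reactive.clamp(orb.transform.scaleX.sub(0.5).div(1.5), 0, 1);

  // Send the value into the Patch Editor, where it can drive a Speaker's volume.
  await Patches.inputs.setScalar('orbVolume', volume);
})();
```

The Patch Editor graph expresses the same mapping with a handful of patches, which is what keeps the effect completely script-free.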
"The challenge was packing a bunch of interactions to be as intuitive as possible- from making the object more visible when sound is played by using a subtle animation, to opening up the game with an intro. Placing everything in the right place was important." - Ikram Hakim
Accomplishments that we're proud of
We're proud of how the project grew organically. This was our first time working together on something like this- using a filter as a sequencer. All of us make music on electronic controllers and analogue sequencers, so we're very pleased to see that we've created something like this. We feel that we've created something special that can possibly grow into a community that uses Halohex as a game or as an instrument.
What we learned
We've always believed that the methods chosen for a project should serve to better the end product, and the experience of creating Halohex has definitely strengthened this philosophy. We now view Spark AR as a tool that enhances our audio and visual products as a team, and we've seen how other software can work in tandem with it.
What's next for Halohex
Halohex-CHOIR is the first sequencer in the halohex sequencer series. We are aiming to release 5 more halohex sequencers in the coming months. Each sequencer will have its own unique character and sound design. We're also working with another team member, Max Jala, on converting the Halohex series into its own app.
Built With
blender
logicpro
patcheditor
sparkar
Try it out
www.instagram.com |
10,008 | https://devpost.com/software/levitating-island | Inspiration
A world full of amazing opportunities, where any bold fantasy becomes reality, where the laws of physics do not rule, the world that children and adults dream of every day. The world of dreams.
What it does
This effect is an extension of the surrounding reality through the Instagram camera.
How I built it
To create this effect, I used: Spark AR, 3Ds max, Photoshop.
In 3ds Max, I created the model. In Photoshop, I painted its textures. In Spark AR, I put the interactive scene together. The materials provided by Facebook were enough to learn the Spark AR tools used in this work.
You can see the project files at the
link
Challenges I ran into
I wanted to create an image familiar to everyone that matches my inspiration. The search for this solution was the most difficult part of the work. This effect should resonate with a wide audience. My search resulted in this work.
Accomplishments that I'm proud of
I was able to make the effect of augmented reality. It was very nice to see how my work came to life in an exciting setting.
What I learned
I had no experience with Spark AR, though I did have experience with Unity. This technology was very interesting and eye-opening, and it will lead me to further develop my skills.
What's next for Levitating Island
I won't lie: if this work wins, I would like to spend the prize on creating a full-fledged game on a related theme. This work is only a direction in which I would like to move.
Built With
3ds-max
photoshop
sparkar
Try it out
www.instagram.com |
10,008 | https://devpost.com/software/social-distancing-mp5hqb |
Inspiration
Reopening countries is challenging in this pandemic as cases keep rising.
What it does
It helps the user maintain social distancing while having some fun.
How we built it
Using Spark AR studio.
Challenges we ran into
Estimating distance of persons from the user.
Lowering the size of the app as much as possible.
Filter should be user-friendly and fun to use.
Accomplishments that we're proud of
We got the job done without using heavy algorithms.
The goal was accomplished within a few megabytes.
The filter can be used for fun even after the pandemic.
Reaching a large number of users: in a country like India, where low-budget phones are common, people cannot use resources like Google's SODAR because of its ARCore requirements.
What we learned
A new way to explore the world using Spark AR.
What's next for Social distancing
It may not be this exact filter, but this project made us want to create more great AR effects without ARCore-type dependencies.
Built With
sparkar
Try it out
github.com
www.instagram.com
www.facebook.com |
10,008 | https://devpost.com/software/myfacemaskapp | Prototype model 1 "Dany" pink opaque
Prototype model 1 "Dany" white opaque
AiRFace.it (App)
A new customizable, economic and ecological mask tailored to you
Millions of disposable masks are thrown out every day; in addition to having a significant environmental impact on nature, they also represent a high monetary cost.
To meet all these needs, the
AiRFace.it App
team has developed a project that allows the customization of masks, or the creation of transparent masks that adapt to your features, by scanning the face in 3D.
This is
AiRFace.it App
, a mobile application that can be downloaded free on IOS and Android devices.
The scanning process is quick and easy: it happens through the use of the mobile phone camera.
In addition, the material used in the production of the masks is biodegradable and hypoallergenic and allows repeated use, since it can be sterilized by washing at high temperatures.
The mask is therefore ergonomic and environmentally friendly, because you just need to change the filters.
The 3D model can also be printed from the comfort of your home, without leaving it. The advantage is that, being custom-built, it reduces the marks left by normal masks after hours of use.
Considering the difficult and delicate situation we are experiencing,
AiRFace.it App
would ensure protection and prevention with zero impact on the environment and would meet individual needs.
It is an optimal solution: there is no need to go out and buy a mask if you already have a 3D printer; otherwise, we print it and deliver it directly to your home! The app is free, as are the various basic 3D models.
Ergonomic
My Face Mask, with its 3D scanning process, allows you to create ergonomic masks suited to every feature of your face. This custom fit reduces the marks left by normal masks after hours of use.
3d Printer
The realization of the 3D masks allows anyone to create them independently.
In fact, a 3D printer and a mobile phone are enough to quickly create personalized templates based on the desired quantity.
Ecological
The material of these masks is completely ecological, a very important feature considering the high daily consumption, especially in some sectors. In fact, these masks can be reused several times after washing at high temperatures, which allows sterilization.
Transparent
One of the main features of AiRFace.it is transparency.
The idea of making these masks with a transparent material was born mainly from the awareness of the importance of lip reading for deaf people.
Privacy
For us, your privacy comes first. That's why AiRFace.it complies with all GDPR (
https://gdpr.eu
) regulations and no personal data will be transmitted.
The problem solved by the project
With the AiRFace.it App we want to solve the problem of the availability of personal protective equipment for everyone. Thanks to this app, anyone with their own 3D printer can print a mask, following our guidelines for the use of safe and biodegradable materials, provided they are a certified member of our network; this ensures the quality and safety of the masks produced according to all applicable regulations, so that you and your family have masks that can last throughout this complicated period we are living through.
The solution you bring to the table
AiRFace.it can be installed for free on all compatible iOS and Android devices. In a simple, automated way, it lets you create your own 3D mask and send it directly to a certified 3D printer.
The impact of the solution on the crisis
The idea behind this project is to create a safe, cost-effective product that respects nature for us and future generations
The needs to continue the project
This project needs funds in order to create the best mask with the best materials, which requires a significant investment in research and development, in addition to our wish to make and print masks for those who do not have the opportunity to own a 3D printer.
The value of your post-crisis solutions
We believe strongly in this project because it is a valuable help to anyone who cannot always have a disposable mask on hand, and the mask can be reused even after this crisis (which we hope ends as soon as possible) in any sector that needs personal protective equipment.
The AiR net blockchain
We are creating a network of certified makers to be able to print masks for doctors and nurses for free to thank them for their valuable help.
We are working with other startups to create a network of certified makers using the Ethereum blockchain to create smart contracts in order to be able to transparently verify the entire network.
We are creating a decentralized system that is based on blockchain eos to ensure the total transparency of certification of the materials used to print the masks.
Currently our team also collaborates in the creation of protective equipment for doctors and nurses to thank them for their difficult work.
Built With
blockchain
c#
html5
java
javascript
kotlin
objective-c
python
swift
Try it out
airface.it
bitbucket.org |
10,008 | https://devpost.com/software/holographaugment |
An example of the effect.
Inspiration
Some people have made holographic or volumetric effects; this inspired me to make an effect that augments the hologram.
What it does
It shows a hologram of the world made of white lights, and when you tap, it grows.
How I built it
I built it in Spark AR Studio with an object made in Blender.
Challenges I ran into
It was difficult to smooth the effect at the moment it has to grow; I did a lot of tests with different objects to get it right.
Accomplishments that I'm proud of
I am really proud of this, because now I can develop more AR things in Spark AR Studio.
What I learned
I learned a lot from the people who made this kind of effect and from the different patches it uses.
What's next for HolographAugment
I don't know at the moment, but it will surely be a cooler effect if I can make the growth an infinite, never-ending loop.
Built With
sparkarstudio |
10,008 | https://devpost.com/software/saucy-pirates | x |
10,008 | https://devpost.com/software/insect-world | Inspiration
Seeing how much AR filters are being used these days, I tried to make an AR world effect where the surroundings fill with random insects.
What it does
The effect emits random insects into the environment.
How I built it
I built it using spark ar
Challenges I ran into
Mostly the time frame and keeping material sizes down.
Accomplishments that I'm proud of
It's my first effect that I have built, and I'm happy I did it.
What I learned
I learned to use Spark AR for Instagram filters.
What's next for Insect world
More AR effects are on the way to come.
Built With
sparkar
Try it out
www.instagram.com
github.com |
10,008 | https://devpost.com/software/covid-19-java-island-interactive-chart |
Playable information audio in English
COVID-19 cases in Indonesia
Interactive Map Chart about Covid-19 in Jawa Island
Inspiration
I got this inspiration from one of the COVID-19 statistics charts on the internet and tried to recreate it as augmented reality on Instagram using Spark AR.
What it does
This AR effect provides information on the number of people confirmed, recovered and deceased from corona. I made three chart forms: a map, bars and text.
How I built it
I took a map of Java island and converted it to 3D using Blender, then I imported the 3D file into Spark AR and added a lot of logic in the form of patches and JavaScript (a simplified sketch of that logic is shown below).
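As a rough illustration of the scripting side (not the project's actual code), case figures baked into the effect could be bound to text objects like this; the object names and numbers are placeholders, since Spark AR cannot fetch live data.

```javascript
const Scene = require('Scene');

// Spark AR cannot retrieve data from the internet, so figures are hard-coded.
const stats = { confirmed: 0, recovered: 0, died: 0 }; // placeholder values

(async function () {
  const [confirmed, recovered, died] = await Promise.all([
    Scene.root.findFirst('confirmedText'),
    Scene.root.findFirst('recoveredText'),
    Scene.root.findFirst('diedText'),
  ]);

  confirmed.text = 'Confirmed: ' + stats.confirmed;
  recovered.text = 'Recovered: ' + stats.recovered;
  died.text = 'Died: ' + stats.died;
})();
```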
Challenges I ran into
I had trouble adding content that can attract the user's attention; your input would be very meaningful for this project.
Accomplishments that I'm proud of
I was able to polish this AR effect, although it's a shame Spark AR doesn't have a feature to retrieve data from the internet, so it cannot show accurate, real-time figures.
What I learned
A lot, ranging from how COVID-19 is developing to chart design, and much more.
What's next for Covid-19 Java Island Interactive Chart
I will make a much broader distribution map; the next target is a map of all of Indonesia, divided into provinces.
Built With
blender
javascript
spark-ar
sparkar
Try it out
github.com
www.instagram.com |
10,008 | https://devpost.com/software/the-virtual-gallery | one of the exhibits
view from coronal plane
exhibits
left lateral view
I'm a foreign student in Tbilisi and I feel lonely sometimes, so every Saturday, or whenever I can find the time, I go to a gallery called Project ArtBeat. It gives me peace and sometimes inspires some of my paintings. When the lockdown started, I realized I wouldn't be able to go there anymore, sadly, so I came up with the idea of a virtual gallery and decided to work on it.
This
filter
simply projects a 3D gallery into the environment so people can walk in and view the pieces like they would at a normal gallery.
I used resources from the Spark AR app and library, then I went on Instagram and pitched the idea to every artist I could find. I put the works of those who replied to me into the gallery.
Instagram has a 4MB filter limit, which means every piece of the filter had to be important. I had to drastically reduce the pixel density of each artwork without losing quality. Although I wish I could make the gallery more realistic, the size limit does not allow the use of realistic textures.
Nothing really, I think it's a simple idea that other people must have had. Although the
artists
seem to be very excited about it, every time I look at it I always see something I can make better.
I learned how to prioritize detail without sacrificing user experience.
What's next for The Virtual Gallery: when I post it, I want to pitch the idea to as many galleries as possible. It'd be amazing if the Louvre could have one of these, or if Instagram itself had a monthly or weekly art exhibition; imagine how rad that would be, it would probably bring more artists and creatives to the platform. I also plan to keep reaching out to artists so I can update the gallery constantly and help them reach more people. Who knows, maybe someday these artists might even be able to sell their art through a virtual exhibition.
Built With
sparkar
Try it out
www.instagram.com |
10,008 | https://devpost.com/software/light-particles | Demonstration in Spark AR of the light emitters
Inspiration
Spark AR tutorials
What it does
Instantiates particle emitters that exist only within the field of view of the world-facing camera.
How I built it
Followed Spark Tutorials found at:
https://sparkar.facebook.com/ar-studio/learn/documentation/tutorials/using-the-gyrometer/
https://sparkar.facebook.com/ar-studio/learn/documentation/tutorials/using-particles
Challenges I ran into
Finding a way to optimise smartphone performance with emitter generation
Accomplishments that I'm proud of
Learning a new software tool
What I learned
How to manipulate textures and materials with Spark AR
What's next for Light Particles
Interactivity of the light emitters
Built With
particle
Try it out
github.com |
10,008 | https://devpost.com/software/the-human-skeleton | Inspiration
The desire we have during our studies to be able to observe things for better learning was a great source of inspiration for this work. Just as we use 3D printing to build educational objects, in this project it is augmented reality that has been implemented to enhance learning.
What it does
It allows you to drop a 3D human skeleton on flat surfaces of the real world through the camera. You can start and stop a rotation of the skeleton by tapping it, display general information about the skeleton during the rotation, and also display information on certain distinct parts of the skeleton by tapping those parts when the rotation is stopped (a rough sketch of the tap-to-rotate logic is shown below).
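A rough sketch of the tap-to-rotate behaviour described above, assuming a scene object named 'skeleton' and the Touch Gestures capability; it is illustrative only, since the published filter implements this with the Patch Editor.

```javascript
const Scene = require('Scene');
const Animation = require('Animation');
const TouchGestures = require('TouchGestures');

(async function () {
  const skeleton = await Scene.root.findFirst('skeleton'); // hypothetical name

  // One full turn every 10 seconds, looping while the driver is running.
  const driver = Animation.timeDriver({
    durationMilliseconds: 10000,
    loopCount: Infinity,
  });
  skeleton.transform.rotationY = Animation.animate(
    driver,
    Animation.samplers.linear(0, 2 * Math.PI)
  );

  // Each tap on the skeleton toggles the rotation on or off.
  let rotating = false;
  TouchGestures.onTap(skeleton).subscribe(() => {
    rotating = !rotating;
    rotating ? driver.start() : driver.stop();
  });
})();
```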
How I built it
I used Adobe tools for the design and creation of the textures, then I started from a model available in the Spark AR Studio library and consulted various Spark AR Studio educational resources to finalize the work.
Challenges encountered
Familiarizing myself with the Spark AR Studio software and using the Patch Editor to add interaction to the filter; ultimately I managed to get the job done thanks to the learning resources made available to developers.
Achievements I'm proud of
Now I'm proud to have contributed to making augmented reality accessible to the community, to learn more about the human skeleton and soon about medicine in general (in the next project updates).
What I learned
I learned how to use Spark AR Studio better, how to use the Patch Editor, and new things about the human skeleton, and I continue to learn more about medicine for the rest of my project.
What is the next step for The Human Skeleton
In future updates, I will add other medical learning topics (the brain, the heart, the respiratory system). I would like to update it constantly with more information so that this filter becomes an augmented reality medical book.
Built With
3dmodel
adobe
adobeaftereffects
photoshop
sparkarstudio
Try it out
instagram.com
github.com |
10,008 | https://devpost.com/software/water-1dja9n |
water
Inspiration - Johanna, an Instagrammer
What it does - Simulates water on your face
How I built it - Spark AR, After Effects and Photoshop
Challenges I ran into - It was an experimental test for me, because I had never done something like this before, so I tried a lot
Accomplishments that I'm proud of - Making an effort every day to do something new
What I learned - A lot: putting videos on the face and simulating reflections...
What's next for Water - Improve it further and try to make it more realistic
Built With
after-effects
photoshop
spark-ar
Try it out
instagram.com |
10,008 | https://devpost.com/software/speak-with-me |
Bocas
Inspiration - Johanna and a lot of Instagrammer filters
What it does - It takes your mouth and replicates it
How I built it - Spark AR and Photoshop
Challenges I ran into - Learning how to capture the mouth
Accomplishments that I'm proud of - Learning something new to try every day
What I learned - To cut out the mouth and place it in world space
What's next for Speak With Me - I want to improve further, work on creative systems and filters, and make something new and surprising
Built With
photoshop
sparkar
Try it out
instagram.com |
10,008 | https://devpost.com/software/castme | Main Menu
Motion capture streaming demo
Female avatar professor teaching
Male Avatar professor teaching
presentation screen
view from the back
View from the middle
Customize Character
castme.life website
Splash Screen
Inspiration
Video lectures are available in abundance, but the mocap data behind those lectures is ten times more valuable, in the form of precise motion data. High quality and a large amount of data are among the requirements of the best predictive ML models, so we have used mocap data here. Despite the availability of such promising data, the problem of generating bone transforms from audio is extremely difficult, due in part to the technical challenge of mapping from a 1D signal to 3D transform (translation, rotation, scale) float values, but also due to the fact that humans are extremely attuned to subtle details in expressed emotion; many previous attempts at simulating a talking character have produced results that look uncanny (for example, the companies Neon and Soul Machines). In addition to generating realistic results, this project represents the first attempt to solve the audio-speech-to-character-bone-transform prediction problem by analyzing a large corpus of mocap data of a single person. As such, it opens the door to modeling other public figures, or any 3D character, by analyzing mocap data. Text-to-audio-to-bone-transform synthesis, aside from being interesting purely from a scientific standpoint, has a range of important practical applications. The ability to generate a high-quality textured 3D animated character from audio could significantly reduce the amount of bandwidth needed in video coding and transmission (which makes up a large percentage of current internet bandwidth). For hearing-impaired people, animation synthesized from bone transforms could enable lip-reading from over-the-phone audio. And digital humans are central to entertainment applications like movie special effects and games.
What it does
Cutting-edge technologies like ML and DL have solved many of society's problems with far better accuracy than an ideal human ever could. We are using this tech to enhance the learning process in the education system.
The problem every university student faces is that they have to pay a large amount of money to continue studying at a college, and they have to interact with lecturers and professors to keep getting better. We are solving the problem of cost. Our solution is a machine learning model that maps e-text data to the sparse points of a human AR character, so our AI bots can teach the same material in a far more interactive and intuitive way than could ever be done with professors alone. Students can even learn by themselves with these AR characters.
How we built it
This project explores the opportunities of AI and deep learning for character animation and control. Over the last two years, it has become a modular and stable framework for data-driven character animation, including data processing, network training, and runtime control, developed in Unity3D, Unreal Engine 4, TensorFlow and PyTorch. It enables the use of neural networks for animating character locomotion, sparse facial point movements, and character-scene interactions with objects and the environment. Further advances will continue to be added to this pipeline.
Challenges we ran into
To build a studio-like environment, first of all, we had to collect a set of equipment, software, and their prerequisites. Some of them are listed below.
Mocap suite- SmartSuite Pro from
www.rokoko.com
- single: $2,495 + Extra Textile- $395
GPU + CPU - $5,000
Office premise – $ 2,000
Data preprocessing
Prerequisite software licenses- Unity3D, Unreal Engine-4.24, Maya, Motionbuilder
Model Building
AWS Sagemaker and AWS Lambda inferencing
Database Management System
Further, we started building.
Accomplishments that we're proud of
The idea of joining a virtual class, hosting a class, having real-time interaction with your colleagues, talking with them, asking questions, visualizing an augmented view of any equipment, and creating a solution for it is in itself an accomplishment.
Some of the great features that we have added in here are:
Asking questions of your avatar professors,
having a discussion with your colleagues,
learning at your own pace with these avatar professors,
and many more. Some of the detailed descriptions are given in the submitted files.
What we learned
This section is entirely technical: all of the C++ and Blueprint parts of multiplayer game development.
We have started developing some of the designs in MotionBuilder; previously we had all been using Maya and Blender.
What's next for castme
1. We are looking for a tie-up with many colleges and universities. Some of the examples are Galgotiah University, Abdul Kalam Technical University (AKTU), IIT Roorkee, IIT Delhi.
2. Recording an abundant amount of lecture motion capture data to better train our (question-answering to motion-capture) machine learning model.
Try it out here:
Intro Demo (2 min):
https://youtu.be/Xm6KWg1YS3k
Complete Demo:
https://youtu.be/1h1ERaDKn6o
Download pipeline here:
https://www.castme.life/wp-content/uploads/2020/04/castme-life%20Win64%20v-2.1beta.zip
Documentation to use this pipeline:
https://www.castme.life/forums/topic/how-to-install-castme-life-win64-v-2-1beta/
Complete source code (1.44 GB):
https://drive.google.com/open?id=1GdTw9iONLywzPCoZbgekFFpZBLjJ3I1p
castme.life:
https://castme.life
More info
For more info on the project contact me here:
gerialworld@gmail.com
, +1626803601
Built With
blueprint
c++
php
python
pytorch
tensorflow
unreal-engine
wordpress
Try it out
castme.life
www.castme.life
github.com
www.castme.life |
10,008 | https://devpost.com/software/eyesqualizer |
This is me using the effect saying: oooooo
this is my effect icon
Inspiration
I was inspired by the funny moments of my past effect, and by the people who shared it and found it funny.
What it does
This is a simple effect using 3D objects from the library: when you raise your eyebrows, the eyes scale up, and when you talk or play music (recording a video with the microphone), the eyes scale along one axis with the rhythm of the music or audio.
How I built it
I only used Spark AR Studio, with objects from the library and face occluders to make it look better. Then I used the Patch Editor to position the eyes, with transitions, and implemented the Audio Analyzer with another transition (a rough script-based sketch of the same idea is shown below).
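The patch graph described above could be expressed in script roughly like the sketch below. It is illustrative only: the real effect uses the Patch Editor, and the object name 'eyes' and the patch output 'audioEnergy' (an Audio Analyzer value sent "to script") are hypothetical.

```javascript
const Scene = require('Scene');
const FaceTracking = require('FaceTracking');
const FaceGestures = require('FaceGestures');
const Patches = require('Patches');
const Reactive = require('Reactive');

(async function () {
  const eyes = await Scene.root.findFirst('eyes');
  const face = FaceTracking.face(0);

  // 1 while the eyebrows are raised, 0 otherwise, smoothed for a soft transition.
  const raised = Reactive.ifThenElse(FaceGestures.hasEyebrowsRaised(face), 1, 0)
    .expSmooth(150);

  // 0..1 energy value exposed by an Audio Analyzer patch.
  const energy = await Patches.outputs.getScalar('audioEnergy');

  const baseScale = Reactive.val(1).add(raised.mul(0.5)); // up to 1.5x when raised
  eyes.transform.scaleX = baseScale;
  eyes.transform.scaleZ = baseScale;
  eyes.transform.scaleY = baseScale.add(energy.mul(0.5)); // bounce with the audio
})();
```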
Challenges I ran into
I needed to put the eyes in a null object to set their position, and then use the inner object to apply the values from the Audio Analyzer. It was a little difficult to use the transitions so it looked the way I had in mind.
Accomplishments that I'm proud of
I have made more than 50 filters with different features and I have learned so much from Spark AR. I am really proud of myself because I found a way to show my 3D models in my filters, like my IronManHelmet filter (at the moment my favourite). Now I am learning scripting to use it in Spark, but for now I am a rookie at it.
What I learned
I learned to use the Audio Analyzer and other features while doing this project, like using null objects to combine two positions, two rotations, or, in this case, two scales.
What's next for Eyesqualizer
This is an upgrade of my past filter, inspired by the people who used it. Maybe it could be extended with another feature or more details in the future.
Built With
sparkarstudio
Try it out
instagram.com |
10,008 | https://devpost.com/software/the-golden-ratio | the title image
GIF
GIF
GIF
Inspiration
As a nature lover and Mathematics educator, I have always wanted to show the relevance of the subject to my students. One of the easiest ways is through the Golden Ratio, which can be derived from the ratio of consecutive Fibonacci numbers as they tend towards infinity.
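A quick numeric illustration of that convergence: the ratio of consecutive Fibonacci numbers approaches the Golden Ratio φ = (1 + √5) / 2 ≈ 1.618.

```javascript
// Ratios of consecutive Fibonacci numbers converge to the Golden Ratio.
function fibonacciRatios(count) {
  let a = 1, b = 1;
  const ratios = [];
  for (let i = 0; i < count; i++) {
    [a, b] = [b, a + b];  // advance to the next Fibonacci pair
    ratios.push(b / a);
  }
  return ratios;
}

console.log(fibonacciRatios(10));
// [2, 1.5, 1.666..., 1.6, 1.625, ...] -> approaches (1 + Math.sqrt(5)) / 2
```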
What it does
This Instagram effect overlays a looping Fibonacci spiral GIF in world space, allowing users to see how the Golden Ratio exists in their surroundings.
How I built it
I used Spark AR Studio: I converted GIFs to sprite frames and cut them before putting them together as Animation Sequences in the software. Making the material semi-transparent helps its use case.
Challenges I ran into
I had a bit of difficulty getting the multiple frames cut evenly and compressed enough for upload purposes.
Accomplishments that I'm proud of
I take pride in creating educational AR effects for IG/FB so that my students can learn better using the social media platforms that they are already used to.
What I learned
I learnt that AR has a lot of educational potential.
What's next for The Golden Ratio
I hope to create other educational Insta/FB effects.
Built With
sparkarstudio
Try it out
instagram.com
devpost.com |
10,008 | https://devpost.com/software/save-the-planet-8xqo04 | A Mock-up of how the effect looks on Instagram.
Using planes to position the objects present within the Spark AR platform.
Previewing the effect within the Spark AR platform.
Various patches used for making the effect in Spark AR, giving the effect logic and control.
Inspiration
The world as we know is changing drastically, and over the past fifty years, the average global temperature has increased at the fastest rate recorded in history. 2019 was ranked as the second hottest year on record for Earth and NASA also found that 2010-2019 was the hottest decade ever recorded. Global warming is leading to the melting of glaciers, rising sea levels, forests, farms, and cities are facing troublesome new pests, heat waves, heavy downpours, and increased flooding. The list doesn’t end here as there is an upsurge in medical conditions as well, like allergies, asthma, and infectious disease outbreaks. These are becoming more common due to increased growth of pollen-producing ragweed, higher levels of air pollution, and the spread of conditions favourable to pathogens and mosquitoes.
Therefore, more than ever, it becomes imperative to raise awareness about global warming and climate change, and what better way than Instagram to raise this issue. The reach that Instagram provides is unparalleled by any other social media platform because of its large number of users. I have seen first-hand how quickly AR effects that I made for Instagram reach over 100,000 impressions within a month. Therefore, I truly believe that proper campaigning on Instagram about global warming through a World AR effect would help give this cause the much-needed attention it deserves.
What it does
This World AR effect showcases a 3D model of planet Earth burning over a fire, denoting the rise in global temperatures around the world. Since it's a World AR effect, it can only be viewed using the back camera. To use it, one places it on a flat surface and taps it to fix its position and start the sound of burning fire; at the same time, particles are emitted, denoting sparks of fire rising into the environment. To hide the effect, one long-presses it, which causes all the objects to hide under the plane (a rough sketch of this tap/long-press logic is shown below).
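The sketch below is illustrative only: the object names ('earthGroup', 'sparkEmitter', 'fireLoop') are hypothetical, and it assumes the Touch Gestures capability is enabled.

```javascript
const Scene = require('Scene');
const TouchGestures = require('TouchGestures');
const Audio = require('Audio');

(async function () {
  const [earthGroup, sparks] = await Promise.all([
    Scene.root.findFirst('earthGroup'),
    Scene.root.findFirst('sparkEmitter'),
  ]);
  const fireSound = await Audio.getAudioPlaybackController('fireLoop');

  // Tap: fix the effect in place, start the fire sound and the spark particles.
  TouchGestures.onTap().subscribe(() => {
    fireSound.setLooping(true);
    fireSound.setPlaying(true);
    sparks.birthrate = 60;
  });

  // Long press: hide everything under the plane and stop the sound.
  TouchGestures.onLongPress().subscribe(() => {
    earthGroup.hidden = true;
    sparks.birthrate = 0;
    fireSound.setPlaying(false);
  });
})();
```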
How I built it
To build this AR effect, the Spark AR platform from Facebook was mostly used. The 3D objects were made with Microsoft Paint 3D along with Adobe Photoshop.
Challenges I ran into
The biggest challenge I ran into was working out how to make the sound of the burning fire and the sparks it produces as realistic as possible without exceeding the maximum file size of 4MB allowed on the Spark AR platform. The resources available in the Facebook Spark AR group helped me a lot.
Accomplishments that I'm proud of
The biggest accomplishment for me is that I was able to make something that sends out a positive message to everyone, so that people start to understand the effects of global warming. At the same time, understanding how patches can best be used to add logic to the effect is also something I am proud of, as it helped me understand augmented reality at a much deeper level.
What I learned
The biggest learning experience for me was understanding AR and the vast possibilities that it provides in the social media space and how we can use AR to propagate a cause for the betterment of humanity.
What's next for Save The Planet!
Since this effect is now live on Instagram, I am working to make another World AR effect that showcases how the melting glaciers are leading to a rise in sea levels around the world.
Built With
blender
photoshop
polygon
sparkar
Try it out
instagram.com |
10,008 | https://devpost.com/software/innovative-effects |
Inspiration - Instagram effects, music.
What it does - It randomly plays a song.
How I built it - With Spark AR Studio.
Challenges I ran into - Learning new concepts of Spark AR.
Accomplishments that I'm proud of - The background independent of the face, the particles, the random songs...
What I learned
What's next for Innovative effects
Built With
ar
particle
photoshop
spain
Try it out
instagram.com |
10,008 | https://devpost.com/software/mask-tester-filter | Project_logo
project_screenshot
Inspiration
One day I was watching a news report on TV about people trying out different masks in shops and buying the best one for them after trying several combinations. This method creates a
problem in the COVID-19 situation: if an infected person tries on a mask and does not purchase it, and another person then tries on the same mask in the same shop, they might also get infected. So I created a solution to this challenging problem faced in shops during this epidemic.
What it does
It is a filter which helps you try out different types of masks virtually with the help of AR, and helps you buy the most suitable mask from shops.
How I built it
I used Spark AR Studio to create this filter, Adobe Photoshop to create the mask images used in it, and Figma, a UI/UX design tool, to create a rectangular selection bar containing thumbnails of the available masks. Each mask thumbnail in the bar corresponds to a button: when the user taps the button for a mask, that mask is shown on their face. The mask images are applied to the face as a face mesh driven by an animation sequence; I placed null objects over the thumbnails in the bar to act as buttons and wired up Patch Editor logic for each one, which takes the tapped button as input and switches the mask shown on the face accordingly (a simplified script-style sketch of this selection logic is shown below).
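The sketch below is illustrative only: the real filter wires this up entirely in the Patch Editor, and the object names ('maskButton0'..'maskButton2') and patch input 'maskIndex' are hypothetical.

```javascript
const Scene = require('Scene');
const TouchGestures = require('TouchGestures');
const Patches = require('Patches');

(async function () {
  const buttons = await Promise.all([
    Scene.root.findFirst('maskButton0'),
    Scene.root.findFirst('maskButton1'),
    Scene.root.findFirst('maskButton2'),
  ]);

  buttons.forEach((button, index) => {
    TouchGestures.onTap(button).subscribe(() => {
      // The patch graph uses this index to choose the animation-sequence
      // frame (i.e. the mask texture) applied to the face mesh.
      Patches.inputs.setScalar('maskIndex', index);
    });
  });
})();
```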
Challenges I ran into
Here is the list of challenges that I ran into during this hackathon.
One challenge was compressing the mask images that I created in Photoshop for the filter. The JPG mask images were a little bit large, so I had to compress them to a smaller size.
Another challenge I faced was finding the most suitable mask images for creating the filter as the idea required.
Accomplishments that I'm proud of
I am proud that I was able to learn Spark AR, try out its different tools while implementing this project, and provide a solution to a common problem faced by society in this COVID-19 situation. Another important thing is that the idea I implemented using Spark AR will be useful to a large number of users, to mask-selling shop owners, and for mask purchases on e-commerce websites like Amazon and Flipkart.
What I learned
The main thing is that I learned the basics of Spark AR, and also learned Adobe Photoshop for creating the mask images used in this Spark AR project. I also learned how to turn an idea into a solution that can be helpful for a large number of users during this COVID-19 pandemic.
What's next for Mask tester filter
Making this filter available to users.
Making this filter available as a feature on e-commerce websites such as Flipkart, Amazon, etc.
Reaching out to mask-selling shop owners and showing them how this filter can help their shops.
Built With
figma
photoshop
sparkar
Try it out
github.com
www.instagram.com
www.facebook.com |
10,008 | https://devpost.com/software/to-be-confirmed-6ezu0d | Inspiration
I have seen many amazing effects created by others with Spark AR. Hence, I wanted to create a fun and interactive game that could entertain users while they stay at home during this pandemic crisis. I started by building my first blinking game in Spark AR for pride2020. Soon after, my curiosity grew, and that's when I made this attempt to create an interactive world-effect Whack A Mole game. In addition, I wanted to test myself on how much I've learned and what other things I could create.
What it does
Whack A Mole AR simulates an arcade whack-a-mole game using the back-facing camera and a real-world effect. Upon starting the game, the player is given 20 seconds to hit as many moles as possible.
How I built it
I built my low-poly moles and mole holes using the Blender software.
Once that was done, it was time to make the characters move up and down at a fixed position and reset when they are tapped.
Following that, I created a countdown timer and a simple script to count the score.
Once that was done, it was time to place the environment to cover up the moles and to include audio.
Challenges I ran into
The score script only allows one input; however, I have five moles that need to be counted when tapped. I solved this by using the "Add" patch and the "Value" patch for verification
(i.e. (mole 1 + mole 2), (mole 3 + mole 4), and so on): after grouping the scores, I simply add the totals together until I get one single output, which I then connect to the display (a script-only version of this counting logic is sketched after this list).
I had never drawn or created anything using the Blender software, and I found it rather challenging.
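Here is a minimal sketch of what a script-only version of that score counting could look like; the object names ('mole0'..'mole4', 'scoreText') are hypothetical, and the real project combines a script with the Add/Value patches described above.

```javascript
const Scene = require('Scene');
const TouchGestures = require('TouchGestures');

(async function () {
  const moles = await Promise.all(
    [0, 1, 2, 3, 4].map((i) => Scene.root.findFirst('mole' + i))
  );
  const scoreText = await Scene.root.findFirst('scoreText');

  let score = 0;
  moles.forEach((mole) => {
    TouchGestures.onTap(mole).subscribe(() => {
      score += 1;                         // one counter shared by all five moles
      scoreText.text = 'Score: ' + score;
      // ...the tapped mole would also be reset below its hole here.
    });
  });
})();
```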
Accomplishments that I'm proud of
I am extremely proud of this game, as I was able to turn it into a working concept.
The fact that I was able to solve the scoring issue on my own.
Being able to create low-poly characters using Blender.
What I learned
Practice helps to improve my understanding and my ability to think through the logic of whatever I'm creating.
Patience: as frustrated as I get when I face an issue, I always stop for a while, look back at the other effects I have created, and figure out a way to solve it.
Curiosity: always keep a curious mind about how to create a new idea, or simply how to improve a certain way of doing things.
Learning new things: learning is part of our everyday life; it gives us experience and helps to improve our skills.
What's next for Whack_A_Mole_AR
As this game is a simple one, I believe there are many more things that could be done to improve it further.
An option for players to select their characters before starting the game.
Random characters appearing at random locations, each carrying different points.
A high-score board that displays other players' scores.
Built With
blender
low-poly
photoshop
sparkar
Try it out
github.com
instagram.com |
10,008 | https://devpost.com/software/inha-university-logo |
Inspiration
I was inspired by my friends who were also creating AR masks for their own needs.
What it does
My mask helps to distinguish IUT students from others
How I built it
I just followed some simple steps in Spark AR.
Challenges I ran into
I created my mask following some tutorials on YouTube
Accomplishments that I'm proud of
What I learned
I learned how to work in Spark AR.
What's next for Inha University Logo
I am planning to make new masks with more complicated options |
10,008 | https://devpost.com/software/soaper-5-fight-covid-19 | Who is your inner Soaper 5 hero?
Title banner to link up to another related educational game and learning experience.
Inspiration
As an educator, I feel that it is important to impart lessons to the young in engaging and exciting ways. This Instagram effect will thus be able to educate the young to do their part in the fight against COVID-19.
What it does
An effect where the user gets a randomised Soaper 5 character, which also reminds them of some ways to fight COVID-19 just like a hero (a sketch of how such a random pick could be scripted is shown below).
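The sketch below is illustrative only: the object names 'hero0'..'hero4' are hypothetical, and the published effect may implement the same idea with a Random patch instead.

```javascript
const Scene = require('Scene');

(async function () {
  const heroes = await Promise.all(
    [0, 1, 2, 3, 4].map((i) => Scene.root.findFirst('hero' + i))
  );

  // Pick one character when the effect starts and show only that one.
  const pick = Math.floor(Math.random() * heroes.length);
  heroes.forEach((hero, index) => {
    hero.hidden = index !== pick;
  });
})();
```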
How I built it
I used Spark AR, which I learned just for this hackathon. I looked for videos on YouTube to get an idea of how to start and then began putting the different parts together into this effect.
Challenges I ran into
It was not easy learning a new studio, but having dabbled in Unity3D previously, I found the interface relatively easy to use.
Accomplishments that I'm proud of
I was able to put together the Instagram effect by myself, with some support in the form of media files from the Ministry of Education.
What I learned
It is not that difficult to come up with a meaningful effect as long as some time and effort are put in, along with the incubation of creative ideas.
What's next for Soaper 5 Fight Covid-19
It is linked to a game at suhaimizs.itch.io/soaperfight or go.gov.sg/soaperfight and can therefore continue to engage the young, or the young at heart, through a game made using GDevelop as well. The game is also pending review as an FB Instant Game, but may not be reviewed soon given the new regulations requiring a business account.
https://instagram.com/a/r/?effect_id=269972187389902&ch=NjZhZDNlMmZkZTkzNjhmMWNhNzQ5MDQ4ZTUxYjk2Yjc%3D
is the test link and the tentative link being reviewed is here at
https://instagram.com/a/r/?effect_id=269972187389902
Built With
ar
gdevelop
particle
Try it out
instagram.com |
10,008 | https://devpost.com/software/ugly-face | Inspiration
I get inspired by seeing what other people like; I see they like augmented reality that can change their face into something unique.
What it does
This Ar filter changes the user's face
How I built it
I made it using Facebook's Spark AR Studio, and used a little Photoshop to make some GIFs for use on a canvas and rectangle.
Challenges I ran into
Accomplishments that I'm proud of
What I learned
What's next for Ugly Face
Built With
photoshop
sparkar
Try it out
instagram.com |
10,008 | https://devpost.com/software/the-universe-of-pokemon |
poke-bol
This project was created so that people could see for themselves that Pokémon live in a beautiful world inside the Poké Ball.
I was faced with the problem of optimizing a 3D object. But I figured it out quickly. Thanks!
Built With
3ds-max
Try it out
instagram.com |
10,008 | https://devpost.com/software/you-are-loved |
Example 1
Inspiration
I was inspired by what we are dealing with right now. We are dealing with COVID-19, and many workers have lost their jobs; it does not impact only workers but all of us. So I was inspired to make this World AR effect to make sure you know you are loved.
What it does
This World AR effect is for all of us, so we can feel that we are loved too.
How I built it
I built this World AR effect with Spark AR, using emoticons: a smiley that represents that we should be happy at any time, no matter how hard life is, and a heart that represents that we are loved. I mixed all of that to express the idea that we should be happy every day, no matter how hard life is, and that we are surrounded by happy, loving people.
Challenges I ran into
The challenge I ran into in this project was how to make it simple while still making people feel happy and loved.
Accomplishments that I'm proud of
I will be proud when this World AR effect brings positive energy to everybody.
What I learned
I learned about AR, how it works, and how it can make people happy. Right now I haven't learned that much, but over time I will learn more.
What's next for You Are Loved
Soon I will make more World AR effects, with more tools and more creativity.
Built With
emoticon
loved
particle
smiley
sparkar
worldar
Try it out
instagram.com
github.com |
10,008 | https://devpost.com/software/vee-ar | Inspiration
VEE - Virtual Environment EYE: To produce this IMMERSIVE EXPERIENCE, BIMaking has developed a seamless procedure with which a federated BIM model can be visualized and interrogated using game engines like Unreal Engine®. In a society where global collaboration is the new normal, AR allows the entire design team to work on the same project without travel from one place to another or take a flight to supervise the project, thus avoiding human carbon footprint. This could reduce the size of the offices and their energy consumption. With AR we have the possibility of creating, evaluating and seeing the project long before its realization; which will help to reduce the design and construction’s cost, optimizing the transport of materials and their disposal.
What it does
VEE lets the entire design team visualize and interrogate a federated BIM model in an immersive AR experience built with game engines like Unreal Engine®, so a project can be created, evaluated and reviewed remotely long before its realization, reducing travel as well as design and construction costs.
How I built it
Using BIM integrated design, Projects Tools and Unreal Engine
Challenges I ran into
Being invited to many events to present the product.
Accomplishments that I'm proud of
Developed and fully rescalable to new projects
What I learned
Time-consuming procedures that can easily be integrated in the future.
What's next for VEE AR
Transforming it into a B2C service for the AEC industry.
Built With
android
augmented-reality
oculus
unreal-engine
vr
windows |
10,008 | https://devpost.com/software/skeletons | Just a simple skull face wrapper
Built With
sparkar
Try it out
www.facebook.com |
10,009 | https://devpost.com/software/haylingo-your-new-language-practice-companion | Inspiration
My eagerness to learn and become fluent in English motivated me to make something that can help people like me easily practice their target language.
What it does
Main Feature
HayBot
a fast way to practice your new language with a conversational AI bot.
HayFriend
connecting you with real people across the world who are also enthusiastic about practicing a new language.
HayWord
play a guess-the-word game to enrich your vocabulary in a fun way.
Support Feature
Translate
you can translate words or sentences, powered by the wit.ai language identifier, NLP and NLU.
Pronunciation
listen to how the word is pronounced.
Quick Reply Feature in Conversation
Translate All
make it easier for you to understand the last chat text.
Change to Speech
a mode made to get you accustomed to listening to the new language.
How I built it
I used FB Messenger with Node.js as the backend and TypeScript as the language, processing user input with wit.ai to determine the intent (translate or pronounce) and to identify the input language so the user's parent language can be stored in MongoDB. For the translation service and text-to-speech I used AWS, and wordapi for the HayWord feature (a rough sketch of the text-to-speech step is shown below).
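As a rough sketch of the text-to-speech step (not the project's actual code), a phrase can be synthesized with AWS Polly and the resulting MP3 buffer attached to a Messenger reply; it assumes the aws-sdk v2 package, valid AWS credentials, and a placeholder voice.

```javascript
const AWS = require('aws-sdk');

const polly = new AWS.Polly({ region: 'us-east-1' });

async function synthesize(text) {
  const result = await polly
    .synthesizeSpeech({
      OutputFormat: 'mp3',
      Text: text,
      VoiceId: 'Joanna', // placeholder English voice
    })
    .promise();
  // result.AudioStream is a Buffer; upload it (e.g. to S3) and send the URL
  // to Messenger as an audio attachment.
  return result.AudioStream;
}

synthesize('How are you today?').then((audio) =>
  console.log('Synthesized ' + audio.length + ' bytes of MP3 audio')
);
```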
Challenges I ran into
Sending an audio file from AWS Polly to the Messenger bot.
Bridging users.
Creating a guess-the-word game in Messenger.
Accomplishments that I'm proud of
Using wit.ai for the first time and integrating it with Messenger.
Bridging users.
Sending text-to-speech audio to Messenger.
Making a guess-the-word game to enrich the user's vocabulary.
What I learned
A lot about wit.ai and the Messenger API.
Being patient when getting stuck and facing errors while coding.
What's next for HayLingo! - Your New Language Practice Companion
Iterating to reach product-market fit.
Expanding to other languages like Korean, Japanese, Spanish, French, etc.
Built With
aws-translate
cleverscript
facebook-messenger
mongodb
node.js
polly
s3
typescript
wit.ai
wordapi |
10,009 | https://devpost.com/software/karinderya-bot | Inspiration
Introduction
There's nothing more essential to the Filipino food culture than the carinderia. It is the go-to place of every Filipino for homemade food on every occasion: from workers getting their lunch fix, to families taking out meals to eat at home. It is Pinoy-style fast food at its best.
The COVID-19 pandemic has hit the food business here in the Philippines, with the small local eateries suffering the most; the entire industry is projected to lose 80% of expected revenue as the pandemic rolls on. While food deliveries are on the rise, most carinderias are left out of the top digital platforms, not able to reap the convenience and benefits they provide.
Now, there's a way for these food stops to be part of the new normal. Carinderia PH aims to solve this problem by providing an easy-to-use digital marketplace for small local eateries and businesses to be able to serve their customers.
With Facebook Messenger as one of the most used messaging applications in the Philippines with over 78.5 million users, small food businesses started creating their own Facebook Pages to be able to reach their customers and sell their products, especially during the pandemic. Customers can quickly engage with the Facebook Messenger chatbot without the need to download or sign up for a separate app or service. They can also find all the restaurants within their area, check all information about the stalls such as products, reviews, ratings, and map locations.
Dashboard
Through a dashboard that is viewable from both the web and inside the chatbot, carinderia owners or even food enthusiasts can set up a virtual eatery in a few minutes. With a few clicks, photos are uploaded and menus with prices are made. Instantly, these products are automatically learned by artificial intelligence so they can be shown whenever customers are looking for them.
Inside the dashboard, owners can monitor visual analytics of sales, visitors, orders, ratings, reviews, and geographic reports to help them track profits and make adjustments or data-driven business decisions. Their outstanding performance is rewarded through free promotions across all the customers within the platform.
Geographic reports
- give stall owners the ability to see how their products are being patronized in certain areas, which can be used for targeted marketing and business decisions.
Visitor Hit Rate
- metrics that show the trend of users as the stall goes live. This can be used by owners to schedule their promotions and see if their marketing efforts actually convert.
Feedback
- shows customers' satisfaction rates to help owners improve.
Customer and sales reports
- provide tracking of profit, customer acquisition, product quality, and store performance.
The products page enables stall owners to set up their online menus.
Menu and product creation.
Variant Management
- the capability of owners to add different variations of their products. They can also add separate images for each variant.
The orders page displays the list of all orders from chatbot customers. Owners can manage orders from this page: accepting, declining, and tagging them ready for delivery/pickup. By clicking the name of a customer, owners can also start a conversation with that customer.
The Stall Setting page is where owners fill in information about their online store, which is also visible to customers.
Chatbot
Kayenna is Carinderia PH's very own virtual assistant that lives in Facebook Messenger. She will help users find their favorite local eateries, get info about their business, and showcase their products. Using the power of Natural Language Processing and Artificial Intelligence, Kayenna will be able to give users results on eateries that serve their current cravings.
Some customers are very particular with the carinderias they patronize, so at the tip of their fingers, they can get notified if their go-to is ready to take orders or have a new menu to offer. Carinderia PH automatically sends these orders to the respective stalls while customers can track the progress of their orders in real-time or even start a conversation with the owner regarding their orders.
Users can also help out their favorite local eateries by giving them ratings and feedback that will help other customers find their new favorite carinderia.
Carinderia PH Chatbot in Facebook Messenger
After clicking 'Get Started' Kayenna will welcome customers with menu options to choose from
Customers can simply type in their cravings and Kayenna will find stalls that serve the food they're looking for. After clicking the 'VISIT' button, customers will be asked one time to set their delivery address by pinning their location on the map. After this, nearby stalls serving the searched food will be displayed.
Stall information such as delivery hours, location, reviews is viewable by tapping the info button.
Adding food to the cart and checkout process.
Customers can view the progress of their order after receiving each order status notification
Customers can type in their inquiries about their order.
Aside from tapping 'OPEN STALL' in the main menu, stall owners can also type in their request to register their store in our platform.
Conclusion
As everyone adapts into the new normal, Carinderia PH will seamlessly connect patrons with their favorite carinderias with safety and convenience. Carinderia PH, digitizing local eateries.
How we built it
Chatbot
- We created the chatbot using Microsoft Bot Framework V3 on Node.js and channeled it to Facebook Messenger using the Facebook Graph API. We integrated Wit.ai as our NLP and train product entities programmatically from the dashboard.
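As a rough illustration of that programmatic entity training (a sketch only, not Carinderia PH's actual code): whenever the dashboard saves a new menu item, its name could be pushed to Wit.ai as a keyword value. The entity name `product` and the helper name are assumptions; the endpoint is the one Wit.ai documents for keyword entities.

```typescript
const WIT_SERVER_TOKEN = process.env.WIT_SERVER_TOKEN!; // Wit.ai server access token (assumed)

// Add a keyword value (e.g. a newly created dish) to an existing "product"
// keyword entity so the NLP can recognise it in customer messages.
async function addProductKeyword(name: string, synonyms: string[] = []): Promise<void> {
  const res = await fetch("https://api.wit.ai/entities/product/keywords?v=20200612", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${WIT_SERVER_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ keyword: name, synonyms: [name, ...synonyms] }),
  });
  if (!res.ok) throw new Error(`Wit.ai keyword update failed: ${res.status}`);
}

// Example: addProductKeyword("Chicken Adobo", ["adobo"]) could be called from the
// dashboard whenever a stall owner saves a new menu item.
```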
Dashboard
- We built the dashboard using TypeScript, Apollo Client, AWS Amplify, GraphQL, and Base Web as the UI framework. We used Cloudinary as the image repository.
Backend
- We host on different providers: our Microsoft Bot Framework bot service is deployed on Azure, while the rest of the backend runs on AWS Lambda and AppSync.
Challenges we ran into
We initially had problems with how we were going to train each entity, such as products and stall names; fortunately, Wit.ai lets us create keyword values and feed them into the existing trained utterances programmatically. Since we want customers to re-engage with the bot, the one-time notification gave us the option to subscribe users to unavailable products or currently closed stores, then notify them once the product or stall is ready for orders.
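To make the one-time notification flow concrete, here is a hedged TypeScript sketch of the three steps involved: asking permission with a `one_time_notif_req` template, catching the opt-in token in the webhook, and spending that single-use token once the product is back. Function names and the payload format are illustrative assumptions, not the project's actual code.

```typescript
const PAGE_ACCESS_TOKEN = process.env.PAGE_ACCESS_TOKEN!;
const SEND_API = `https://graph.facebook.com/v12.0/me/messages?access_token=${PAGE_ACCESS_TOKEN}`;

// Step 1: ask the customer whether they want to be notified when the item is back.
async function requestOneTimeNotification(psid: string, productId: string): Promise<void> {
  await fetch(SEND_API, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      recipient: { id: psid },
      message: {
        attachment: {
          type: "template",
          payload: {
            template_type: "one_time_notif_req",
            title: "This item is out of stock. Notify you when it's back?",
            payload: `RESTOCK_${productId}`, // echoed back in the opt-in event
          },
        },
      },
    }),
  });
}

// Step 2: the webhook receives the opt-in; the single-use token should be persisted
// per product (storage is assumed; here it is just returned).
function handleOptin(event: { optin: { one_time_notif_token: string; payload: string } }) {
  return { token: event.optin.one_time_notif_token, payload: event.optin.payload };
}

// Step 3: when the stall restocks, spend the token (it can be used exactly once).
async function notifyRestock(token: string, productName: string): Promise<void> {
  await fetch(SEND_API, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      recipient: { one_time_notif_token: token },
      message: { text: `${productName} is back in stock – order away!` },
    }),
  });
}
```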
Accomplishments that we're proud of
We have combined some of the most modern and advanced software development tools, services, and frameworks out there to create a platform with an impact on one of the most crippled businesses today, local dine in.
What we learned
For the dashboard, our team explored and used some cool and convenient tools, such as Apollo GraphQL's codegen tool, and ran its interface entirely on Now serverless. It is also our first time implementing a one-time notification and the newer version of Wit.ai for a chatbot, while the backend services are also running on serverless Lambda.
What's next for Carinderia PH
Keep it live for the food stall businesses and help communities share more of their delicacies to their neighbourhood.
Should all go well, add cashless payment integration and a delivery service.
Built With
amazon-web-services
facebook-graph
graphql
microsoft-bot-framework
next.js
node.js
react.js
typescript
wit.ai |
10,009 | https://devpost.com/software/ai-commercebd | services it can provide (quick replies)
welcome screen
Get COVID19 information of Bangladesh & World wide from authentic website
Inspiration
Right now we are going through a very tough time. This
COVID19
pandemic has hit every field and damaged the health industry severely.
All efforts have been futile so far; now the best thing we can do is stay at home. But we also cannot deny the basic needs that we have to purchase on a daily, weekly, or monthly basis. So, does that mean we can't afford to stay at home?
What it does
Here comes our AI-driven solution which is a messenger chatbot
(AI-commerce)
, what it intends to do is this: the bot asks its users if they need anything, or, on the other hand, asks if they are outside and can deliver goods to those who are in their houses, so they don't need to come out.
Basically, our system divides users into two groups: the customer, who requests service, and the postman, who is already outside and wishes to provide service to those who are at home.
By doing this, our system minimizes the number of people going outside and helps prevent community transmission to a certain extent.
How I built it
Technologies:
ChatFuel
Google-SpreadSheets
JSON objects
Facebook Page integration
Challenges I ran into
As it is still in the development stage, we had to test it by approaching people directly for feedback. Those who used it were very impressed and willing to use it right away in a local market. They asked for more features, like choosing the product.
Accomplishments that I'm proud of
Right now our system is able to make a connection between those who need service and those who are willing to provide it. In this situation where many are losing their jobs, we hope service providers can earn through our system. On the other hand, those who have tested this system highly appreciated it and wanted to use it right away. At that moment we were very proud that we are capable of at least doing something for our community in the current scenario.
What I learned
How a traditional e-commerce site works.
The main intuition behind it.
How to reach specific customers.
How to understand user needs.
How to bridge customers with service providers.
How to make this system dynamic by integrating a Google Spreadsheet with the Messenger chatbot.
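One way the last point can work with a ChatFuel bot is through ChatFuel's JSON API block calling a small webhook that reads the sheet and answers in ChatFuel's message format. The TypeScript sketch below is only an assumption of how such a bridge could look; the sheet ID, range, column layout, and endpoint name are placeholders, not the project's actual setup.

```typescript
import express from "express";
import { google } from "googleapis";

const app = express();
const SPREADSHEET_ID = "YOUR_SHEET_ID"; // placeholder

// ChatFuel's JSON API block can call this endpoint; the reply must follow
// ChatFuel's { messages: [...] } response format.
app.get("/chatfuel/requests", async (_req, res) => {
  const auth = await google.auth.getClient({
    scopes: ["https://www.googleapis.com/auth/spreadsheets.readonly"],
  });
  const sheets = google.sheets({ version: "v4", auth });

  const { data } = await sheets.spreadsheets.values.get({
    spreadsheetId: SPREADSHEET_ID,
    range: "Requests!A2:C", // columns assumed: name, item, area
  });

  const rows = data.values ?? [];
  res.json({
    messages: rows.slice(0, 5).map(([name, item, area]) => ({
      text: `${name} needs "${item}" delivered in ${area}`,
    })),
  });
});

app.listen(3000);
```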
What's next for AI-CommerceBD
Right now users can request service, and those who are outside and wish to provide service can receive customer requests.
In the future, we intend to give users options in Messenger for what they can buy from the local market.
We are currently handling payment offline and with third-party online transaction APIs,
but in the near future, we intend to implement this transaction through
Facebook's
digital currency
Libra
.
Built With
chatfuel
google-spreadsheets
json
Try it out
m.me |
10,009 | https://devpost.com/software/dms-and-dragons | 💡 Inspiration
When coming up with our idea, we really wanted to make a chatbot that pushed the boundaries and leveraged some unique features of the Messenger platform in a different, interesting way to create an experience that would be engaging and conversational to all of its users.
Call us weird, but our minds quickly went to those 'creepypasta' horror stories that are written as text messages between its characters. There are full-blown mobile apps that take this idea further with interactivity and branching storylines, such as "Sara Is Missing", or "Simulacra", but we liked the idea of such stories playing out in a messaging app that is already familiar to the reader - amongst
real
conversations they are currently engaged in - to make it a more immersive reading experience.
We then saw this as an opportunity to also spark some creativity and enable Facebook users to use features of the Messenger platform to craft their own interactive story. Users can even expand beyond the horror genre typical to text message stories we see today. Will it be a "Black Mirror"-style dystopian sci-fi, or a fluffy romantic comedy between two strangers who met online? Thus, DMs and Dragons was born!
🤖 What it does
The DMs and Dragons Messenger bot has two main functionalities: it allows users to browse and read interactive stories, in addition to creating their own and sharing it with other users.
Once a user selects a story to read, the bot will take on the Persona of one of the story's characters, and begin the story by sending messages to the reader. There may be multiple Personas, and they may also send attachments. The reader interacts with the characters by responding with messages or Quick Reply options. Different replies can lead to different outcomes in the story, making it a truly interactive and engaging experience.
These stories are also created within the DMs and Dragons bot. This is done through a Webview, where users can create, edit and delete their own stories. Users can add different Personas for their stories, write messages or upload attachments to send to the reader, control typing times, determine how the reader can respond with Quick Replies, and create branches in the storyline.
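To give a feel for what such a branching story looks like as data, here is an illustrative TypeScript sketch of the kinds of structures involved; these type names are assumptions, not the project's real Firestore schema.

```typescript
// Illustrative types only – not DMs and Dragons' actual data model.
interface Persona {
  id: string;                    // Messenger Persona ID once created
  name: string;
  profilePictureUrl: string;
}

interface StoryMessage {
  personaId: string;             // which character "sends" this
  text?: string;
  attachmentUrl?: string;        // optional image/audio attachment
  typingMs?: number;             // simulated typing delay before sending
}

interface StoryNode {
  id: string;
  messages: StoryMessage[];      // sent in order when the node is reached
  replies: Array<{
    title: string;               // shown to the reader as a Quick Reply
    nextNodeId: string;          // branch taken when this reply is chosen
  }>;
}

interface Story {
  title: string;
  personas: Persona[];
  startNodeId: string;
  nodes: Record<string, StoryNode>;
}
```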
👩💻 How we built it
Our bot backend is written entirely with Node.js and Typescript, and is hosted on Google Cloud Platform. In it, we make use of the Send API, Webhook Events, Quick Replies, Templates, Messenger Profile API, and Personas API.
The Webview is written in React and Typescript, and is hosted on Google App Engine. It utilises the Messenger Extensions SDK to get the context from the current conversation. The Webview also interacts with the backend to communicate what story the reader has selected.
We use Firestore as our database and Firebase Cloud Storage for keeping images for Personas and attachments. This is accessed by both the bot backend and the Webview.
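As a rough illustration of the Personas API usage mentioned above (a sketch, not the project's actual code), creating a character and sending a message "as" that character could look like this:

```typescript
const PAGE_ACCESS_TOKEN = process.env.PAGE_ACCESS_TOKEN!;
const GRAPH = "https://graph.facebook.com/v12.0";

// Create a persona for a story character; the returned id is reused for all
// messages that character sends.
async function createPersona(name: string, profilePictureUrl: string): Promise<string> {
  const res = await fetch(`${GRAPH}/me/personas?access_token=${PAGE_ACCESS_TOKEN}`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ name, profile_picture_url: profilePictureUrl }),
  });
  const body = (await res.json()) as { id: string };
  return body.id;
}

// Send a story message to the reader, attributed to the persona.
async function sendAsPersona(psid: string, personaId: string, text: string): Promise<void> {
  await fetch(`${GRAPH}/me/messages?access_token=${PAGE_ACCESS_TOKEN}`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      recipient: { id: psid },
      persona_id: personaId,
      message: { text },
    }),
  });
}
```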
🤔 Challenges we ran into
We came across a bug where
typing_on
and
typing_off
Sender Actions would not display as the Persona. As we couldn't do much about it, we chose to ignore the issue and carry on, but until it's fixed it does take away from the immersion a little bit.
We initially used a library in our backend for abstracting the Messenger platform APIs, but ran into issues when implementing Personas, amongst other features, because it was outdated. We ended up needing to implement our own wrapper of the standard API.
We struggled to find an efficient way of developing our webhook and Webview. Initially, we were deploying both in order to test and debug, which increased our development times. We then came across ngrok and used it to tunnel our development servers on localhost.
We were initially quite ambitious with the features we wanted to implement, and realised it wasn't reasonable given the timeframe. We decided on a certain set of features to include for now that we felt were most important.
🌈 Accomplishments that we're proud of
Figuring out how we were going to structure the data for the stories was one of the most challenging parts of brainstorming, but we are proud of finding a solution that worked with our vision for this project and also allows us to expand on it with more features.
Learning something completely new - how to work with the Messenger platform and many of its APIs - and being able to create something with it.
We are proud of the concept and what it enables users to do with their creativity and the Messenger platform.
🎓 What we learned
Like we said above, we both learned the Messenger platform from scratch for this hackathon and are excited to use our knowledge for more projects in the future.
Being open to pivoting when things aren't working out is essential to keep up the drive to finish a hackathon project. Most of the time, there is a sufficient alternative, and some of the time, it is just as good.
Just because you're developing a project for a hackathon does not mean it stops at the submission. We really want to build on it in the future.
Faisha has officially become a Typescript convert.
🔮 What's next for DMs and Dragons
Allow storytellers to use NLP to process replies from the reader.
An expansion of the 'branching' system for stories: e.g. more complicated branching, or adding variables that are tracked throughout the story.
Support sending more attachments: videos, recordings etc.
A more comprehensive search engine for browsing stories.
An easier way to share stories between friends, e.g. a referral 'link' that sends the user to the bot and starts the story immediately.
Built With
facebook-messenger
firebase
firestore
google-app-engine
google-cloud
node.js
react
typescript
Try it out
github.com
github.com |
10,009 | https://devpost.com/software/medeli | Homepage
Inspiration
In the wake of the recent corona pandemic, businesses, especially retail, were severely hit. Further, due to lockdowns and social distancing restrictions imposed all over the world, the chains of operations of retailers, from small to large, require some drastic and quick revisions in order to effectively meet consumer demand. There is also a shift in consumer behavior: consumers now wish to avoid going to stores themselves and prefer ordering online. This impacts retailers who have no means to receive online orders and depended solely on consumers ordering in person at the store. Further, a smooth supply of medicines and other pharmaceutical items is essential in such times.
We were inspired by these new challenges to build an online retail platform, which can be used by any pharmacy store operative to extend their services online in a simple and user-friendly way.
What it does
Medelivery is an online medicine retail app: customers can simply go to our webpage or our Facebook page, upload their prescriptions, and place their order. We leverage the Messenger app's
quick-replies
to handle user conversations,
handover protocol
to connect with our human agents and
one-time notification
to update the user when medicine is in stock, if they tried to order out-of-stock medicine.
How we built it
The server side code is written in node.js and express.js.
For maintaining the database we have used MongoDB Atlas and mongoose.
For front-end of our webpage, we have used react.
For image-to-text extraction from prescriptions, we have used Tesseract's API.
The client-user interface is realised using the Messenger platform, and it is deployed on Heroku.
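For the image-to-text step, a minimal sketch using tesseract.js, one of the common Node bindings for Tesseract (whether the team used this exact package and post-processing is an assumption):

```typescript
import Tesseract from "tesseract.js";

// Extract raw text from an uploaded prescription image, then pull out
// candidate medicine names with a naive word filter (illustrative only).
async function readPrescription(imagePathOrUrl: string): Promise<string[]> {
  const { data } = await Tesseract.recognize(imagePathOrUrl, "eng");
  return data.text
    .split(/\s+/)
    .map((w) => w.replace(/[^a-zA-Z]/g, ""))
    .filter((w) => w.length > 3); // crude candidate list to match against the catalogue
}
```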
Challenges we ran into
We had to come up with a method to extract text from prescription images, and implementing it was the major challenge for us.
Accomplishments that we're proud of
We are happy to have been able to build something relevant that can be immediately deployed for use. We are proud of our smooth, minimal, and user-friendly interface.
More than all we enjoyed coding :)
What we learned
We are proud to have learnt so much working on this project. We learnt how to integrate the Messenger chat plugin into our website, query and update a database in real time, and integrate an image-to-text extraction API.
What's next for Medelivery
Although designed for pharmacy retailers, this app, with only some minor changes, can be used by any retail business, from grocery stores to electrical appliance stores. In this sense, Medeli is more of an online e-commerce platform designed to help retailers, especially small ones. We are enthusiastic about applying this platform to empower small to medium retailers to carry out smooth operations in such drastic and uncertain times.
Built With
css3
express.js
facebook-messenger
html5
javascript
mogodb
mongodb-atlas
mongoose
node.js
react
tesseract
Try it out
github.com |
10,009 | https://devpost.com/software/roaster | Roaster
Get roasted for upcoming deadlines and your chronic procrastination.
Try it out:
messenger bot
Inspiration
Missing a deadline, procrastination, or just no motivation whatsoever - we have all dealt with such treacherous paths during our time as students. How do we remedy this, you ask? Well, we all love hearing that "ding" on our phones as we get a message - what if you got a bunch of those roasting you? Let us explain - what if they roasted you about your chronic procrastination and laziness, and just guilt-tripped you into making sure you don't fail that one 8 am class?
What it does
The bot first takes a calendar ID. Events are then fetched from the Google Calendar API using that ID, and the user is reminded of them via Messenger. These events are assumed to be deadlines for assignments, meeting times, etc. The reminders simply roast you!
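The bot itself is written in Python, but as a hedged sketch of the equivalent Google Calendar call (kept in TypeScript for consistency with the other examples in this collection, and assuming the calendar has been shared with the service account behind `auth`):

```typescript
import { google } from "googleapis";

// List the next few events for a calendar the bot has access to
// (e.g. one shared with the service account picked up by `auth`).
async function upcomingEvents(calendarId: string) {
  const auth = await google.auth.getClient({
    scopes: ["https://www.googleapis.com/auth/calendar.readonly"],
  });
  const calendar = google.calendar({ version: "v3", auth });

  const { data } = await calendar.events.list({
    calendarId,
    timeMin: new Date().toISOString(),
    maxResults: 10,
    singleEvents: true,
    orderBy: "startTime",
  });

  // Each event becomes a "roast" reminder sent over Messenger.
  return (data.items ?? []).map((e) => ({
    title: e.summary ?? "(untitled)",
    due: e.start?.dateTime ?? e.start?.date,
  }));
}
```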
How to use it
Reach out to the messenger bot.
Text it your calendar id (google calendar) - it is your email associated with the calendar.
Get ready to get roasted and reminded!
That's it!
How we built it
We failed a lot, but in the end, we stuck through with: Python, Flask, Heroku, MongoDB, Dialogflow, Google Calendar API, and the Facebook Messenger API.
Challenges we ran into
It turns out that scheduling code to run at specific times for specific users is not as easy as it sounds. We tried out a plethora of technologies; the Redis scheduler was the most notable one and the closest we got, but sadly we were unable to make it play well with Heroku.
We spent a while dealing with some outdated and random changes in the google calendar API. The documentation definitely had a slight learning curve before you could completely understand and take advantage of it.
Accomplishments that we're proud of
Built consumer-centric robots that might not be friendly but can definitely be helpful.
Learned a lot about the stack we used!
Made a new friend!
What we learned
Learned a lot about the technologies in our stack!
Heroku, Flask, Python, Google Calendar API, Dialogflow, Google Cloud Platform, and Facebook Messenger API
What's next for Roaster
Adding more fire and fuel!
Built With
dialogflow
facebook-messenger
flask
google-calendar
heroku
mongodb
python
Try it out
m.me
github.com |
10,009 | https://devpost.com/software/buudu-https-devpost-com-software-budu-nsouh9 | Inspiration
Buudu is a project inspired by the creative member of the team, Mr. Sorgho Lamoussa, who needed a way to connect with people while traveling by locating nearby meet-ups within the airport's proximity to keep himself engaged while waiting for check-ins. He said "...during my travel to California for F8 I noticed a lot of lonely people on transit who needed some form of social interaction and a healthy way of finding one". I (Anitche Chisom) understood this need because, as a former member of DevCircle Lagos, it was difficult for me to connect with DevCircle Ouagadougou when all along it was literally happening in my backyard. So, a few conversations and a Hackathon (an opportunity to secure funding) later, we began working on Buudu.
Along the way, we realized the scope of the project was limited, so we decided to make it more diverse so that it covers more gatherings or meet-ups, like sensitization campaigns, political rallies, and entertainment concerts, in a way that lets us categorize them under one term... Events. However, the unprecedented Covid-19 epidemic opened our eyes to another opportunity we had missed: including virtual conferencing on the agenda. This particular feature sets Buudu apart from other pre-existing projects that may share similarities with it... trust me, we did the research.
What it does
There are two kinds of users on Buudu, the Organizers and the Guests.
Guest Users
To a guest, the application offers an easy way to book a seat/get a ticket to an event and track events, using their coordinates as a filter to prioritize the order in which these events are displayed on their devices. And because each location is strictly authored by the application's admin and not randomly inputted by the public, every detail from physical address to coordinates is provided to enable the most efficient way of getting to the venue.
Organizers
For users in this category, Buudu offers a very easy way to:
track guests,
issue tickets,
check in tickets,
and keep guests updated on any need-to-know information concerning the event.
How I built it
Challenges I ran into
The most challenging aspect was putting together a team of people who are experts in their fields and are willing to contribute their skills and resources to the project with nothing to look forward to but hope in the success of the application.
The second was the Covid-19 outbreak, as fun as it may be to stay connected via Zoom, messenger or Whatsapp, the truth is nothing can replace the feeling of physically coming together in a room and spend all day brainstorming. This also made it difficult to find more people who may be able to help us with the project because of the mandatory lockdown by the government.
Accomplishments that I'm proud of
Considering the lack of skilled experts and the social distancing, we managed to accomplish a lot in a very short period of time. I am very proud of the level of determination and effort that was put into the project by the team.
What I learned
Working with Mr. Sorgho Lamoussa opened me up to a world of bots that I've never really explored. I now see how this could be implemented to improve visibility and also tap into the vast community of Facebook to draw more inspiration.
What's next for Buudu connecting people
Buudu still has a long way to go before RC, and a lot is still to be done to improve it, such as UI enhancement and testing. Hopefully, before the next DevCircle Ouagadougou meetup, we can seize the opportunity for a beta test. By then we should have a native application for iOS and Android phones/tablets to further extend the functionality of the application with barcode scanning, push notifications, and more precise guest tracking to enable smart communication between guests and organizers. And when we get to the release-candidate phase, we will publish a public API for developers to integrate a quick "Attend Event" button into their applications (only for free-ticket events).
Suggestions for Facebook
I think it would be a good idea to have "addons" or "extensions" that would allow Page admins to subscribe to services they already use to modify their bot menu. With this, we could create plugins that enable other pages to quickly integrate the "Attend Event" link into their page's Messenger bot menu; after all, most pages are owned either by artists or by organizations constantly organizing events.
Built With
express.js
laravel
messenger
node.js
php
request
Try it out
github.com
github.com
buudu.herokuapp.com
www.facebook.com |
10,009 | https://devpost.com/software/sheru | Inspiration
Whenever I go to the gym or work out at home, I always end up wasting time deciding which exercise to do next. I wasn't like this when I was in high school. But then I remembered that in high school our gym teacher would give us a list of workouts, and we would just follow that list every day without wasting any time thinking about what to work on. So I created this Messenger bot, which I think will help people become more efficient.
What it does
Initial goal: Depending on the level, the bot will create a workout for you based on which part of your body you want to work on. Then, you can just log your workouts when you are done. If you don't like a certain exercise you can always request another exercise for that muscle group. For example, I can say something like: Sheru (my bot's name), give me a workout for chest and back. Or: give me a workout for shoulders, biceps, and triceps using bodyweight only. Or: give me a workout for legs and back using weights. Or: give me another workout for triceps. To show your progress you can say something like: show me what I did yesterday. Or: show me last week's workout. To log, you can say something like: Sheru, log my today's workout: bench press, push ups, sit ups, and crunches.
Currently finished: I can process everything through wit.ai, but I'm having trouble with my database, so I can't really output anything afterwards.
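To show the part that already works, here is a hedged TypeScript sketch of querying Wit.ai's /message endpoint and pulling out muscle-group entities. The intent and entity names are assumptions about how the app might be trained, not its actual configuration.

```typescript
const WIT_TOKEN = process.env.WIT_SERVER_TOKEN!;

interface WitResponse {
  intents: Array<{ name: string; confidence: number }>;
  entities: Record<string, Array<{ value: string }>>;
}

// Ask Wit.ai what the user wants, e.g. "give me a workout for chest and back".
async function parseWorkoutRequest(text: string) {
  const url = `https://api.wit.ai/message?v=20200612&q=${encodeURIComponent(text)}`;
  const res = await fetch(url, { headers: { Authorization: `Bearer ${WIT_TOKEN}` } });
  const body = (await res.json()) as WitResponse;

  const intent = body.intents[0]?.name;                         // e.g. "get_workout" (assumed)
  const muscles = (body.entities["muscle_group:muscle_group"] ?? []).map((e) => e.value);
  return { intent, muscles };                                   // e.g. ["chest", "back"]
}
```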
What I learned
I learned how to use node.js, wit.ai, heroku, and postgresql
What's next for Sheru
I want to finish my initial goal and make it look nicer by using templates rather than just sending text back to the user.
App name on wit.ai is "sheru-messaging-hackathon".
Built With
express.js
messangerapi
node.js
postgresql
wit.ai |
10,009 | https://devpost.com/software/easymedfill | Consent verification
Info checker
Greetings!
Inspiration
My teammate and I have faced this problem first-hand while visiting clinics or individual doctors. The medical industry remains largely traditional when it comes to collecting and storing patient data, with most clinics still relying on paper forms and transcribers. With an easy conversational way to fill out basic forms, patients save time in the waiting room, and the clinic is saved the cost of printing, transcribing, and handling paper forms.
Using the bot, a patient can fill out a form in less than a minute and receive e-copies of the information they've provided. If they choose to keep the autofill, it takes them less than 30 seconds to provide consent and fill out a form.
What it does
It collects your personal information to fill out medical forms prior to your visit to a clinic. It stores this data for autofill on your future visits to the same clinic. An electronic record of your form is now available to you, your doctor, and your clinic admin.
How I built it
I used Messenger's bot framework, leveraging quick replies to keep a standard format for user responses and make the app user-friendly. I used Messenger Extensions to embed HTML in the chat so users can select more than one race!
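As an illustration of the quick-reply pattern used to keep answers in a standard format (a sketch, not EasyMedFill's actual code), a form question might be sent like this; Messenger's built-in email/phone quick replies are shown at the end:

```typescript
const PAGE_ACCESS_TOKEN = process.env.PAGE_ACCESS_TOKEN!;
const SEND_API = `https://graph.facebook.com/v12.0/me/messages?access_token=${PAGE_ACCESS_TOKEN}`;

// Ask a form question with a fixed set of answers so responses stay in a
// standard, machine-readable format (payload), not free text.
async function askMaritalStatus(psid: string): Promise<void> {
  await fetch(SEND_API, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      recipient: { id: psid },
      message: {
        text: "What is your marital status?",
        quick_replies: [
          { content_type: "text", title: "Single", payload: "MARITAL_SINGLE" },
          { content_type: "text", title: "Married", payload: "MARITAL_MARRIED" },
          { content_type: "text", title: "Other", payload: "MARITAL_OTHER" },
        ],
      },
    }),
  });
}

// Messenger also offers built-in quick replies that prefill the user's
// email or phone number from their profile:
const contactQuickReplies = [{ content_type: "user_email" }, { content_type: "user_phone_number" }];
```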
Challenges I ran into
A patient can have more than one race, so that information had to be handled very carefully. Patients had to be given the option of deleting their records. Date formats and name formats had to be handled correctly.
Accomplishments that I'm proud of
Managed to pull together an MVP in a very short time while making the big move from the east to the west coast!
What I learned
I learnt how to build bots, how to embed HTML in Messenger, and how to curate bot flows for user-friendliness.
What's next for EasyMedFill
Adding specializations for dermatology practitioners, and general practitioners that will enable doctors to review the patient's pain points prior to the visit, and enable better communication between them during the actual visit!
Built With
bot-framework
html5
javascript
messenger
messenger-extensions
Try it out
m.me |
10,009 | https://devpost.com/software/well-beings | WQS self assessment test layout
Media kit
Mind map
Messenger screen #1
Messenger screen #2
Inspiration
Mental disorders affect
one in four people
. Treatments are available, but nearly
two-thirds of people
with a known mental disorder never seek help from a health professional. The stigma around mental health is a big reason why people don’t get help. This needs to change. By changing the attitude towards mental health in a community setup, we believe we can create a domino effect of more people opening up as a result of increased social and sympathetic views on mental health.
Our Solution - Wellbeings: A Community
Wellbeings is a Mental Health Community. Unlike most mental health communities, Wellbeings is inclusive even of people who are unaware of mental health problems. This community is called Wellbeings because we want to de-stigmatize mental health.
Our solution to the problem is to provide access to vital information so that people can educate themselves on types of mental health problems, identify any warning signs by a quick self-assessment, information, and resources including helplines, advice on helping someone else, tips on wellbeing, etc.
We want this done in the most interactive way possible, which we believe we can achieve by creating a chatbot and a community that is synonymous with peer support groups. We want to focus on the idea that people with mental illnesses are not abnormal or some isolated group of people, but as many as 1 in 4 people in the world will be affected by mental disorders at some point in their lives. By creating a community, we want to reach out to the victims as well as the general public because they are likely to know someone who suffers from mental illness.
Collectively in a community setup, we harness a "me too" feeling and help members become advocates of mental health.
To sum up, we aim to
advocate the importance of mental wellbeing,
make information accessible and available,
tackle stigma,
empower community,
support people by aiding recovery through early identification & intervention.
Who are we?
We are a team of 4 people - which consists of a developer, a designer, and 2 doctors. All of us share a common vision to improve the intricate health system with the use of revolutionary technologies. Mental health is one of the issues we feel strongly about.
How we built it
Our Messenger bot is powered by wit.ai to handle all the NLP tasks. The webhook managing all the backend logic and scoring is built with Flask. For the bot flow, we have used Chatfuel. And for the self-diagnosis of disorders, we have used the WQS standardized test.
Challenges we ran into
Most people don't even know that they are suffering from some kind of mental distress, so they usually don't engage with apps and bots marketed as self-diagnosis/self-help apps. To reach even that naive user, we have taken a community approach: engaging them by providing a comfortable community which the user will come to see as the answer to their unrecognized problems. Once engaged, we can help them use our bot to take the assessment and learn about their mental wellbeing.
Most of the self-diagnostic tests available are lengthy or too monotonous, so implementing them in a bot is not a good experience and the user drop-out ratio becomes high. So our team of health professionals selected the WQS from various standardized tests and modified it to be more interactive and less negative, to increase the conversion ratio. Also, the questionnaire has around 50 questions, but we have made it dynamic so that users are not given disorder-specific questions if their response to the screening question is negative. So for a typical user, the effective number of questions is around 15-20, which improves the number of people who complete the test.
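A hedged sketch of how that screening-based skip logic could be modelled (TypeScript; the structure and module layout are illustrative, not the real WQS implementation):

```typescript
// Illustrative structure only – not the actual WQS data.
interface DisorderModule {
  name: string;                 // e.g. "Depression"
  screeningQuestion: string;    // asked to everyone
  followUpQuestions: string[];  // asked only if the screening answer is positive
}

// Build the effective question list for one user, given their screening answers.
// This is how ~50 questions collapse to ~15–20 for most users.
function questionsForUser(
  modules: DisorderModule[],
  screeningAnswers: Map<string, boolean> // module name -> answered "yes"?
): string[] {
  const questions: string[] = [];
  for (const m of modules) {
    questions.push(m.screeningQuestion);
    if (screeningAnswers.get(m.name)) {
      questions.push(...m.followUpQuestions); // only dig deeper on a positive screen
    }
  }
  return questions;
}
```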
What's next for Well Beings
We don't stop here. We aspire to engage as many people as we can and bring this to every person who is unknowingly possessed by this demon. We also want to educate the community about mental well being so that they can understand its importance and observe the silent cues of people in distress.
We plan on scaling this solution in the following ways
Incorporate health care professionals to help members with an accurate diagnosis
Add a database of country-wise helplines
Work on suicide prevention
Improve the self-help questionnaire
Make our bot even smarter (Thanks to wit.ai)
Incorporate CBT (Cognitive Behavioural Therapy) to assess and help people with mild symptoms here in our community only.
Built With
chatfuel
flask
glitch
messenger
wit.ai
Try it out
www.facebook.com
glitch.com
m.me
mm.tt
www.mindmeister.com
www.figma.com |
10,009 | https://devpost.com/software/a-41ucf5 | Booxly is and intelligent bookmark system that assists you in reaching your reading goals!
One book, two book... a hundred book?! Not a problem for us!
Upload your books cover page
Monitor your reading habits and improve from day to day
You don't need to download the app, just launch Booxly in Messenger!
Inspiration
In our company, we have a competition called BookQuest. Every week we read professional books, and after every read page, individuals receive points - we compete to win the event by reading, learning, and improving. It is a friendly competition, but it motivates us - and every Monday the winner is celebrated. This was our inspiration, and we created Booxly exclusively for this hackathon. ⭐
What it does
Booxly is an automated bookmark system. It summarizes the pages read, monitors reading habits, and helps users reach their reading goals! After every completed book, Booxly generates a shareable image, so we can share it on Facebook.
We aimed to create the most significant impact on our user's life with the least energy investment on their end.
So the app is extremely user-friendly and easy to understand, while through its functions users can improve themselves every day.
How we built it
• We used our own Node.js chatbot framework for the chatbot intelligence
• We used our custom C# image generator system for the generic templates
• We have done a lot of UX testing and pretotyping
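To show how the server-generated images end up in the conversation, here is a hedged TypeScript sketch of sending a generic template whose image_url points at the rendered share graphic; the URLs and function name are placeholders, not Booxly's actual code:

```typescript
const PAGE_ACCESS_TOKEN = process.env.PAGE_ACCESS_TOKEN!;
const SEND_API = `https://graph.facebook.com/v12.0/me/messages?access_token=${PAGE_ACCESS_TOKEN}`;

// Send a "book finished" card: the image_url points at the image our
// generator rendered server-side for this book.
async function sendFinishedCard(psid: string, bookTitle: string, shareImageUrl: string) {
  await fetch(SEND_API, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      recipient: { id: psid },
      message: {
        attachment: {
          type: "template",
          payload: {
            template_type: "generic",
            elements: [
              {
                title: `You finished "${bookTitle}"! 🎉`,
                image_url: shareImageUrl,
                subtitle: "Share your achievement with your friends.",
                buttons: [{ type: "web_url", url: shareImageUrl, title: "Open image" }],
              },
            ],
          },
        },
      },
    }),
  });
}
```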
Challenges we ran into
• We wanted to use a book database because of the covers, but we couldn't find any online that contained vintage and local books as well. So we allow the users to take a photo of the covers of their books, and upload it.
• After that, we wanted to receive the cover photo inside the conversation - we planned to automatically identify the book cover in the picture, cut it out, and undo the distortion to remove the perspective. We didn't get the result we hoped for, so we moved forward with a manual cover cropper that can be used inside the webview.
• We had to generate default covers for the books - because we need a cover image we can use for the generic templates
• We had to prepare different user flows for Bookmark saving, based on the bookmark page number and the number of active books - it needed a lot of UX testing. We created a smooth experience for the users.
Accomplishments that we're proud of
• The user flow is smooth and so easy to understand to use! We spiced things up with the server-side generated generic template images.
• The conversation looks astonishing with the rich graphical communication (in our opinion).
• The book cover cropper was a great idea; it is much easier to use and improves the UX.
• We created an app that we will use every day in our company - we are proud of our team's effort.
What we learned
• We researched the essential user needs we need to fulfill - luckily, we already had experience based on our BookQuest, but things are a little bit different in a chatbot.
• To see the book's cover is essential; this way, users can connect the digital books with the physical one. We use the conversation interface as a controller and logger; users can review their behaviors.
• After we finished the development of the conversation design, we identified a new need. Sometimes, it is easier to handle and oversee our statistics in a GUI, so we created UI for webviews, where users can use every function.
What's next for Booxly
We will make Booxly
multilanguage
so that it will become an international application; we will start with
🇭🇺 Hungarian.
We will include a
notification system
- users can set up notifications to reach X pages every week (we planned to do it in this Hackathon, but we ran out of time 🕒).
We will include a
gamification
system - we will reward users with points and badges, and they receive quests every two weeks:
Badge
examples
• Book worm - Read 300 pages a day
• Read Sprint - Finish a book in 3 days
• Page Chain - Read five days in a row
...
Quest
examples
• Read %500% page in 2 weeks
• Finish a %Fantasy% book in 2 weeks
• ...
We want to recommend books after a user finished one book - based on the category, writer and title
Our team
• Csézi
• KZ
• Tom
• Yusuf
Built With
c
javascript
mongodb
mysql
node.js
php
react
rest-api
sass
symfony
Try it out
m.me |
10,009 | https://devpost.com/software/botzi | Inspiration
The Tal Institute in Jerusalem inspired us to create a solution for a problem we're having during this crisis:
the problem of Volunteers and NPOs who are looking for one another but, in the end, don't find each other.
Within a moment, we saw the opportunity and understood that there's a way to make this whole system faster and smarter, and to match people in a second with smart matching algorithms! And here we are! :)
And to make it even a little better, why not create something that does all that in different languages???
What it does
Botzi is a Messenger bot that asks you a few questions and decides whether you're an NPO or a Volunteer and what you need. With that, it creates a user on our Botzi site where you can see your profile as a Volunteer or NPO, the campaigns you can sign up for or have already signed up for, and so on, so that you can also do everything manually.
Other than that, Botzi will notify you on FB Messenger when there's an available match.
And how do we market all of that and make it cool? Spark AR! We describe the NPO's activity and main field with cool filters that create an amazing, "gamey" and fun experience :D
How I built it
Python with Flask to communicate with Messenger, hosted on AWS EC2, writing and reading data to Firebase Firestore/Auth. A site based on Angular with sign-in/sign-up methods helps you navigate all of that visually if needed.
Spark-AR Studio for the filters.
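On the Messenger side, the plumbing behind this is the standard webhook verification handshake plus event handling. A minimal sketch is shown below in TypeScript/Express for consistency with the other examples in this collection, even though Botzi's backend is Python/Flask on EC2:

```typescript
import express from "express";

const app = express();
app.use(express.json());
const VERIFY_TOKEN = process.env.VERIFY_TOKEN!; // the token configured in the app dashboard

// Messenger calls this once with a challenge to verify the webhook URL.
app.get("/webhook", (req, res) => {
  const mode = req.query["hub.mode"];
  const token = req.query["hub.verify_token"];
  const challenge = req.query["hub.challenge"];
  if (mode === "subscribe" && token === VERIFY_TOKEN) {
    res.status(200).send(challenge);
  } else {
    res.sendStatus(403);
  }
});

// All incoming messages arrive here as POSTed webhook events.
app.post("/webhook", (req, res) => {
  for (const entry of req.body.entry ?? []) {
    for (const event of entry.messaging ?? []) {
      console.log("from", event.sender.id, "text:", event.message?.text);
    }
  }
  res.sendStatus(200); // acknowledge quickly so Messenger doesn't retry
});

app.listen(3000);
```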
Challenges I ran into
Connecting AWS with Python, and AWS with auth, made us switch some of it to Firebase; other challenges were getting/posting messages to Messenger, fetching the history, and saving the current user's conversation to know what to do next.
Creating a site with Angular wasn't the easiest thing to do either.
Also, most of the product was built by 2nd-3rd year BSc students at the Tal Institute who came from their Hackathon to make this dream happen, with a vision of it being a good product; but it took an effort to get 20 (! at first) people to work together and create it, when most don't have much experience.
Accomplishments that I'm proud of
The bot talks! We even got it live on Facebook! Our site is soon to be finished, and this whole crazy system is coming to life!
Putting all the pieces together, creating a team, teaching, studying, writing, learning, trying - it has been a huge challenge for me and my team, and we're super proud of everything, and we know it won't be for nothing! You'll see this as a product! (We already talked with Jerusalem's Municipality and they are interested in using such a system!)
What I learned
Management and product management in big teams. How to connect to messenger API and use it correctly with Python and JS.
A lot about AWS also, I can't even write down how hard connecting everything there was...
What's next for Botzi
BOTZI MUST MAKE MATCH!
Botzi's going to be an awesome matcher and will help NPOs create a better world. More than that, our vision is that Botzi is going to make volunteering as easy as dating on Tinder, and deciding which campaign to take as easy as browsing Netflix.
Even if we won't take the first 500,
You'll hear from Botzi, Remember us!
Built With
amazon-web-services
angular.js
css
firebase
git
github
html
javascript
particle
python
sparkar
Try it out
www.facebook.com
botzi.org |
10,009 | https://devpost.com/software/flatten-that-curve | Welcome screen to the bot, Dott!
Search for various charities
Get supporters via sharing the facebook message or whatsapp
Supporter seeing what the Hero is trying to achieve
Pledge in $ or ZAR, currencies easily added
Pay your pledge via paypal
Why
Charities are stuck in their old ways and often do not have the resources and innovation to use technology to their advantage. In this digital age, many of them are lost and are of the older generation. They need as much assistance as possible to transform the way they do things, especially fundraising. At present there are 208 000 charities registered in South Africa, many of them on the brink of closing down. CONECKT aims to connect these organizations to each other in order to facilitate the sharing of ideas, information, donations and resources. In addition, CONECKT sources valuable information from various platforms and shares these opportunities through technology and networking. Social workers, managers and skills centres are in crisis, and they are all asking for money from people who are no longer able to give.
Corona has put a stop to many of the fundraising initiatives that used to bring in an income, such as a golf day or a sale. It has also created a massive divide between the haves and have-nots: the people who put on weight during lockdown vs the millions who are literally starving.
Inspiration
There are 3 main Issues that we identified:
1
As Covid-19 hit our shore, it soon became clear that
comorbidities
severely impact survival rates
link
.
Hypertension / Obesity is a preventable disease, which can be countered with good diet and exercise
2.
Lockdown has had many detrimental effects on our
mental health
, such as loss of socialization, purpose and personal connection
link
. In preparation for the second wave of infections, we need to set a foundation for people to find purpose and belonging in a virtual society
3.
Due to certain companies being unable to trade during lockdown, their contributions to
charity
have lessened. Furthermore, massive job losses mean that families that usually would contribute to charity now have to manage with added debt burden and less disposable income
link
. The effect is that charities traditional income stream has taken a knock, which could lead to their closure
Our solution
Due to my video editing software deciding it has had enough, unfortunately you have to open the 3 videos below in parallel to see it in action
We try to tackle all 3 problems using a proven fundraising model, where we motivate members of society (Heroes) to lose weight / get fit for their charity. Their friends and family pledge money towards them reaching their goal. All the while, the cause itself encourages and motivates the Hero to achieve their goal by sending videos and pictures that uplift and energize the Hero, making them feel like they are making a difference and showing the supporters where their funds are going.
One of our previous successes
This model has 3 main types of users:
The Charity
The Organisation that would like to benefit from this challenge. Their benefits include:
Marketing as a beneficiary on the challenge
Financial donations for the heroes that achieve their goal
Marketing of their services through their messages to their Hero
Zero fuss signup
Hero
The person wanting to "Flatten their curves" by getting fit / eating healthy / setting a wellness goal
They volunteer to take up the challenge and identify their goal (by consulting with a fitness consultant)
They set a goal that can be measured weekly. This ensures that they are on track and see progress
They gain supporters by sharing a unique link that tracks their progress and pledges
Supporter
The Hero's friends and family that would like to support the Hero's goal and pledge money to the chosen charity when the goal is achieved
They see their chosen hero transform through the challenge, be it losing a few pounds or gaining some stamina
What it does
In a previous iteration, keeping track of each Hero, their progress and their pledges was done using an Excel spreadsheet, which soon became unmanageable
For this hackathon, we decided to convert the challenge into a completely digital experience, and to automate the process to remove human administration and error. This is done by creating a virtual persona that would fill the role of facilitator of the challenge, allowing registrations, updates in progress, sharing of info and keeping in touch, all on the messenger platform
How I built it
Our team has been tirelessly working on this entry using the following technologies:
Using our experience with the Microsoft Bot Framework, I was able to quickly leverage the benefits that Facebook Messenger provides over a traditional web-based sign-up form. The stack made it easy to save conversation state and implement decisions based on previous input, while the language generation module helped give the bot some personality.
Facebook's messenger is definitely the more mature chat and social platform, with easy UI elements that allow a rich interaction, combining the simplicity of chat and power of UI Templates and galleries. This makes the conversation natural and powerful
Azure services and visual studio, easily the leading partner when it comes to bot creation, coding and hosting. No other stack compares to the ease of use and maturity
Challenges I ran into
Charity Details
Many charities do not have an up-to-date website or contact information, so Naomi had to manually find and engage with them to onboard
Converting traditional web based signup
Traditional signup forms tend to ask for all the information on one page, with context often being lost between pages. With Messenger as the primary platform, context is easily kept, as the user can just scroll up in the conversation. The challenge was making sure the bot keeps all the context entered and makes use of it instead of asking anew.
Accomplishments that I'm proud of
Implementing the full experience using a bot
.
From signup to achieving your goal, no human interaction is required.
Streamlining the signup form
Traditional signup forms ask for details that users are tired of entering (i.e., name, surname, email address, confirm email, etc.). By leveraging Facebook as a trusted platform, I am able to read basic user info with minimal effort and make it easy for the user to enter known values (see the email and phone number quick replies).
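Reading that basic info goes through the Messenger User Profile API. A hedged sketch (in TypeScript rather than the bot's actual C# code):

```typescript
const PAGE_ACCESS_TOKEN = process.env.PAGE_ACCESS_TOKEN!;

interface Profile {
  first_name: string;
  last_name: string;
  profile_pic?: string;
}

// Fetch the signed-up Hero's name and picture from their PSID so the
// form never has to ask for them.
async function getProfile(psid: string): Promise<Profile> {
  const url =
    `https://graph.facebook.com/${psid}` +
    `?fields=first_name,last_name,profile_pic&access_token=${PAGE_ACCESS_TOKEN}`;
  const res = await fetch(url);
  return (await res.json()) as Profile;
}
```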
Increasing social cohesion through the bot
As soon as one of the Hero's supporters signs up, I'm able to notify the Hero and tell them. This creates an interesting dynamic where the bot seems like a personal assistant, notifying appropriately and playing "messenger".
Using quick replies to improve interaction quality
When a Hero achieves their weekly goal, the supporter is notified via Messenger. This creates an opportunity for the supporter to encourage and uplift the Hero, who has spent a week working towards their goal. We found that offering quick replies with positive and uplifting messages made supporters choose them more often than writing their own, thus increasing the value of the interaction by setting a high standard.
Keeping it human
Most charities are charged with the protection of those within their care. For instance, animal charities like the SPCA (the South African ASPCA) have lots of dogs that now become cheerleaders and supporters. The motivation behind you achieving the fitness goal really lies in you knowing that people are counting on you to help them.
One of the organisations (Kadihma) specializes in care for those with mental disability. This organisation was able to send videos to encourage the Hero, directly from the beneficiary.
Here Noah was cheering on Trevor, giving him the indication that his efforts really make a difference
link
What I learned
People interacted with my bot as if it were human, despite us making it clear that it is not. It shows that if you implement a proper logical, guided flow, the user knows how to interact, as opposed to creating a bot from the NLP and AI side first, which leaves the user wondering what to do or what the capabilities are. Being upfront with options and guides, and always prompting with a hint, lets the user know what to do without them feeling lost or uncertain.
What's next for Flatten That Curve
We would like to find partners and opportunity in the international market. The solution is scalable and generic enough to be applied to any country and organization.
Streamline the process further with more customization and enhancements
Built With
azure
bot
botframework
c#
messenger
wordpress
Try it out
coneckt.org |
10,009 | https://devpost.com/software/sign-language-dictionary-bot-9q3cht | Get started and type 'hello'. Bot sends you a video to show sign language for hello
Type 'Sorry' and bot sends a message to ask if user want to get notified when a new word is added to our dictionary.
Inspiration
Being Deaf, I have experienced first-hand the challenges of accessing sign language dictionaries in African countries. I want to make a change for Deaf communities, especially students who want to understand sign language and the definitions of words.
What it does
Using Messenger,
Sign Language Dictionary
bot enables Deaf and Hearing youths to ask for or search words/phrases to get their sign language or definition. They can also chat with the live DEP team by switching from the bot to live chat. Beyond looking up a sign or a definition, you can also ask it to fingerspell a word or your name. Sign Language Dictionary Bot is available on
Facebook Messenger
.
How I built it
I developed it using some
Facebook tools
and
Google cloud tools
. It is developed using Messenger API and Oxford dictionaries API.
Challenges I ran into
The main challenges faced by the team developing Sign Language Dictionary have been the lack of resources in sign languages from different countries.
Accomplishments that I'm proud of
Sign Language Dictionary bot works great. I am looking forward to adding new features and to scale it.
What I learned
I learned that Facebook Messenger will be very useful for Deaf and Hearing students in the future. Also, I learned that it is possible for users to upload a video via Messenger, but I am still working on it.
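Receiving an uploaded sign-language video comes down to handling the attachment in the webhook event. A hedged TypeScript sketch (the event shape is the standard Messenger webhook payload; the review/approval step is an assumption):

```typescript
// Shape of the relevant part of a Messenger webhook messaging event.
interface MessagingEvent {
  sender: { id: string };
  message?: {
    text?: string;
    attachments?: Array<{ type: string; payload: { url: string } }>;
  };
}

// Pull any video attachments out of an incoming event so they can be
// reviewed and, once approved, added to the dictionary database.
function extractVideoUploads(event: MessagingEvent): Array<{ from: string; videoUrl: string }> {
  const attachments = event.message?.attachments ?? [];
  return attachments
    .filter((a) => a.type === "video")
    .map((a) => ({ from: event.sender.id, videoUrl: a.payload.url }));
}
```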
What's next for Sign Language Dictionary Bot
Sign Language Dictionary Bot is the first Facebook Messenger bot that enables you to ask for a word or phrase in sign language or for the definition of the word. It can also show you how to fingerspell a word or your name. One of the main activities we will need to do next:
Allowing users to upload a video (sign language) and to add a new word to the Sign Language Dictionary database via Messenger.
Built With
facebook-messenger
firebase
firestore
google-analytics
google-cloud
javascript
node.js
oxford-english-dictionary
visual-studio
Try it out
m.me
www.deafelimuplus.co.ke
www.facebook.com |