Hacker News | radhakrsna's comments

Yup, do try it out. It's Writesonic's lyrics generator.


For sure :)


Glad you liked it!


We used artificial intelligence to generate an entire music video in the style of Eminem.

First, we used Writesonic, an AI-powered writing assistant, to generate the lyrics for the song. We used the phrase "I'm gonna change the world" as the seed text and set the music style to Rap. Writesonic's song lyrics generator produced multiple variations of lyrics from just this one line of input, and we picked the one we liked best.

Secondly, we used an AI model trained on Eminem's vocals to generate the voiceover for the lyrics.

Then, we used another AI model to generate background music. We synced up the vocals and the music.

Finally, we used OpenAI's Dall-E 2 AI to generate images for each lyric in the song and combined all of them into a video.

- Lyrics: Writesonic's AI Lyrics Generator

- Vocals: AI trained on Eminem's music

- Beats: Another AI Model

- Images: OpenAI's Dall-E
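The final assembly step (one image per lyric, plus the vocal track) can be sketched as a small helper that builds an ffmpeg slideshow command. Filenames, durations, and the use of ffmpeg are assumptions for illustration, not our actual pipeline:

```python
# Hypothetical sketch: stitch per-lyric images and an audio track into a
# video by generating ffmpeg arguments for a slideshow-with-audio.

def slideshow_command(image_files, audio_file, seconds_per_image, output):
    """Build an ffmpeg argv that shows each image for a fixed duration
    and muxes in the audio track."""
    args = ["ffmpeg"]
    for img in image_files:
        # -loop 1 turns a still image into a video stream of the given length
        args += ["-loop", "1", "-t", str(seconds_per_image), "-i", img]
    args += ["-i", audio_file]
    n = len(image_files)
    # Concatenate the n image streams into a single video stream
    filtergraph = "".join(f"[{i}:v]" for i in range(n)) + f"concat=n={n}:v=1:a=0[v]"
    args += ["-filter_complex", filtergraph,
             "-map", "[v]", "-map", f"{n}:a", "-shortest", output]
    return args

cmd = slideshow_command(["line1.png", "line2.png"], "vocals.mp3", 4, "video.mp4")
print(" ".join(cmd))
```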

Here are the lyrics:

I am gonna change the world

With my music, my words

I'll make them listen

I'll make them hear

The message that I bring

It's time for a change

And I'm gonna be the one

To lead the way

I'll be the voice for the voiceless

The one who's not afraid to stand up

And be counted

I'll make a difference

In this world

With my music, my words

I'll make them listen

I'll make them hear

The message that I bring

It's time for a change

And I'm gonna be the one

To lead the way

Do you want to join me?

Do you want to make a difference too?

If you feel like it's time for a change

And you're ready to stand up and be counted

Then come with me

And together we'll make a difference

In this world

With our music, our words

We'll make them listen

We'll make them hear

The message that we bring

It's time for a change

And we're gonna be the ones

To lead the way

Try out Writesonic's AI Lyrics Generator: https://writesonic.com/


Hey everyone!

According to a McKinsey analysis, the average professional spends 28% of the workday reading and answering email. Imagine if you could get that time back!

That’s why I built Magic Email.

Magic Email is an AI Email assistant that uses GPT-3 to let you:

1. summarize long emails into short, readable summaries

2. generate professional emails from one-line descriptions

3. generate contextual replies from one-liners or key points

4. auto-correct grammatical errors in your email
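To give a feel for feature 1, here is a toy sketch of how an email summarizer might frame a GPT-3 completion prompt. The instruction wording and the truncation limit are my assumptions for illustration, not Magic Email's actual implementation:

```python
# Illustrative only: wrap a (possibly long) email in a summarization
# instruction before sending it to a text-completion model.

def build_summary_prompt(email_body: str, max_chars: int = 4000) -> str:
    """Return a completion prompt asking for a short email summary."""
    body = email_body[:max_chars]  # crude guard against exceeding the context window
    return (
        "Summarize the following email in two or three short sentences, "
        "keeping any dates, names, and action items:\n\n"
        f"{body}\n\nSummary:"
    )

prompt = build_summary_prompt("Hi team, the Q3 review has moved to Friday 10am...")
```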

Join the beta here: https://magicemail.io

We’d love to get your feedback and look forward to answering any questions!


Hey!

Touch-less technology is the future (especially after the Covid-19 pandemic).

So, I made Zesture, a Mac/Windows app that uses your laptop's camera to give you touch-free control over your media, entertainment and presentation applications (without any extra hardware).

You can watch a demo video here - https://www.youtube.com/watch?v=-_swm09Xmtg&feature=emb_titl...

Supported Apps and Websites:

- Music: Spotify, Apple Music, Amazon Music, Spotify (Web Player), YouTube Music, Amazon Music (Web Player) and Deezer.

- Video: VLC Media Player, YouTube and Netflix.

- Presentation: Microsoft PowerPoint and Keynote.

Supported Actions: Play/Pause, Next/Previous Track, Enter/Exit Full-screen, Forward/Rewind, Mute/Unmute, Volume Up/Down.

Privacy: Gesture recognition is done locally on your computer, so we never record, save, or send any images or videos. Your camera is turned off automatically after a period of inactivity (configurable via the app).
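The dispatch layer of an app like this can be sketched as a simple mapping from a recognized gesture label to a media action. The gesture names and actions below are assumptions for illustration; the actual camera and computer-vision recognition is out of scope here:

```python
# Minimal sketch: map a recognized gesture label to a media action.
from typing import Optional

GESTURE_ACTIONS = {
    "open_palm": "play_pause",
    "swipe_left": "previous_track",
    "swipe_right": "next_track",
    "thumb_up": "volume_up",
    "thumb_down": "volume_down",
    "fist": "mute_toggle",
}

def dispatch(gesture: str) -> Optional[str]:
    """Return the media action for a recognized gesture, or None to ignore it."""
    return GESTURE_ACTIONS.get(gesture)
```

Keeping the mapping in a plain dict makes it easy to expose as user-configurable settings.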

We’d love to get your feedback and look forward to answering any questions!


> Touch-less technology is the future

What makes you think that's the case? What's the reasoning for adopting touch-less technologies, and why would they be the future? IMHO the lack of physical feedback makes it a non-starter, though that's just my personal preference.

Edit: I forgot to add that it looks like a great product, and the landing page is quite good! I just don't see the arguments in favor of touch-less tech.


Some use cases I have personally worked on:

To allow surgeons to interact with my software within an operating room without the need for an assistant (to remain sterile).

Interactive retail displays outside the store. Users can interact with augmented reality displays and visualize themselves wearing the store products and/or to play a game to win prizes, etc.

Problems encountered:

Hardware adequate for long experiences is hard to find, e.g. the Microsoft LifeCam freezes after a few hours, and finding a device that can run 24/7 is a problem. Once you do find a good device, you need to understand the risk of it being pulled from the market, e.g. PrimeSense, Kinect, Intel RealSense (pulled and replaced by a new product and SDK, etc.).

If a depth camera is used, certain types of light bulbs and sunlight can interfere with tracking accuracy. If RGB alone is used, I am curious to see how well it works with various skin tones in different lighting conditions and against complicated backgrounds.

The "heavy arm/hand" problem: try lifting your hand for 5 minutes without putting it down. Users fatigue very quickly with a gesture-based UX, and most products are not designed for this interaction.

In terms of Zesture:

The website is clean, to the point, great starting point. However I would like to:

- See an Enterprise license for long-term support

- Know how well it benchmarks against other SDKs/hardware solutions that achieve the same effect

- Patents: does this infringe on other proprietary innovations? (Do you have patent troll insurance?)

- Guidelines for the best experience, e.g. distance from the camera when using gestures to control a presentation

- Roadmap: where are you going next? (FYI, I am looking for a new way of doing hand-based gestures that can be deployed via WebRTC and WebAssembly for interactive web-based experiences :) )

Keep up the good work, looks promising!


Thanks, that's a really good comment, I appreciate the details :)


Hi, thank you for appreciating our product and landing page.

There are quite a few articles saying that touchless tech will grow much faster than before due to the COVID-19 outbreak.

Some links: https://www.securitymagazine.com/articles/92823-covid-19-and... https://builtin.com/design-ux/future-touchless


True, there are quite a few similar services, but not many seem to work well. Our service provides better summarization (at least for the articles I tested), has additional features like extracting the author name, publish date, and important keywords, and also comes with browser extensions so you can summarize pages at the click of a button.

The method you described is part of our algorithm, but more steps are needed to produce meaningful results and to make sure it works on different kinds of articles.
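For readers following along, a classic extractive baseline in this vein scores sentences by word frequency and keeps the top-scoring ones. This is a toy illustration of that family of methods, not the service's actual algorithm (which, per the above, adds further steps):

```python
# Toy extractive summarizer: rank sentences by the summed frequency of
# their words across the whole document, keep the top n in original order.
import re
from collections import Counter

def summarize(text: str, n_sentences: int = 2) -> str:
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    words = re.findall(r"[a-z']+", text.lower())
    freq = Counter(words)

    def score(sentence: str) -> int:
        return sum(freq[w] for w in re.findall(r"[a-z']+", sentence.lower()))

    # Keep the n highest-scoring sentences, preserving original order
    top = sorted(sentences, key=score, reverse=True)[:n_sentences]
    return " ".join(s for s in sentences if s in top)
```

Real systems add steps like stopword filtering, length normalization, and deduplication on top of a baseline like this.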


Yes, you are right. There is still room for improvement, and we will keep trying to make it better. The reason I added paid plans was to test whether people would be willing to pay for such a service, so that I can spend more time adding features and improving it.


Fair enough, decent business strategy!


Glad that you liked it. Have you tested it out with the advanced summarizer? Thank you very much for your encouragement. I will try to keep improving it.


You could try the advanced summarizer. It gives better results.


Basic

* It's intended in part as a commemoration on the 50-year anniversary of the sketch, but also to draw attention to the need for a more streamlined peer review process for grants in the health sciences.

* "So, put together a Monty Python fan with a creative scientific mind and an expert in gait analysis, and this paper is what you get," Butler told Ars. Or, as they wrote in their paper, "It really is the silliness of the sketch that resonates with us, and extreme silliness seems more relevant now than ever before in this increasingly Pythonesque world."

* First aired on September 15, 1970, on BBC One, the sketch opens with Cleese's character buying a newspaper on his way to work—which takes him a bit longer than usual since his walk "has become rather sillier recently." Waiting for him in his office is a gentleman named Mr. Putey (Michael Palin) seeking a grant from the Ministry to develop his own silly walk.

* (Note: the name is spelled "Pudey" in the paper but we're going with the Wiki spelling.) Mr. Putey demonstrates his silly walk-in-progress, but the Minister isn't immediately impressed.

Advanced

* One of the best-known sketches from Monty Python's Flying Circus features John Cleese as a bowler-hatted bureaucrat with the fictional Ministry of Silly Walks.

* Waiting for him in his office is a gentleman named Mr. Putey (Michael Palin) seeking a grant from the Ministry to develop his own silly walk.

* For their own gait analysis, Butler and Dominy studied both Mr. Putey's and the Minister's gait cycles in the video of the original 1970 televised sketch, as well as the Minister's gaits from a 1980 live stage performance in Los Angeles.

* Butler and Dominy found that the Minister's silly walk is much more variable than a normal human walk—6.7 times as much—while Mr. Putey's walk-in-progress is only 3.3 times more variable.

* The sketch might be satirizing bureaucratic inefficiency, but Cleese's Minister is essentially engaging in a hyper-streamlined version of the peer review process in his meeting with Mr. Putey that (the authors concluded) resulted in a fair assessment.

