Hey everyone! Recently, I did a quick comparison of two popular AI agents: Devin AI and Manus AI. You might know Devin as a specialized AI software engineer, while Manus is more of a general-purpose AI assistant. But how do they stack up against each other in practice?

As one of Devin's early private beta users, I've been testing it for almost a year now, so I've had a bit of a head start exploring its features. Manus is newer and only available through a few selected demos on their website, so keep in mind these are probably cherry-picked examples. Still, I think this quick comparison can help us understand what each tool does well and where each could improve.

Alright, let's jump into the demos!

P.S.: There is a bonus at the end. I've noticed Devin gets a bit grumpy when it sees Manus :D


1. Education: Interactive Transformer Webpage 🌐

Prompt:

"Design an interactive webpage explaining the Transformer architecture, complete with visuals and interactive demonstrations."

Manus's Interactive Transformers Webpage
Devin's Interactive Transformers Webpage

My Take: MANUS 1 - DEVIN 0

Devin did a meh job; Manus's demo was much better. The website Manus generated looked more polished, and while it might not be super practical for actually learning the topic, it was impressive from a development perspective. I also really liked how easy it was to access both the frontend and the chat logs: it made exploring the output feel seamless and well-structured.


2. Research: Analyzing DeepSeek's Open-Source Projects 📚

Prompt:

"Help me research and summarize five recent open-source projects by DeepSeek, complete with system architecture diagrams."

Manus's Analysis of DeepSeek's Open-Source Projects
Devin's Analysis of DeepSeek's Open-Source Projects

My Take: MANUS 2 - DEVIN 0.5

This task was interesting. Both managed to search the repositories well. But for the diagrams, Devin tried drawing them in ASCII art 😅 which, honestly, is not the best choice of format. Manus went with SVG diagrams, which were clearer and more professional (there's a rough sketch of the format difference right below). Neither was anywhere near great (both had their share of mistakes), but if I had to pick, Manus did the better job overall.
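Just to make that format point concrete, here is a tiny, purely hypothetical Python sketch that writes a two-box pipeline diagram as an SVG file. The labels, layout, and file name are mine, not anything either agent actually produced; it only shows why SVG boxes and arrows render more cleanly (and scale better) than ASCII art.

```python
# Minimal, illustrative sketch: emit a two-box pipeline diagram as an SVG file.
# All labels and coordinates are made up for this example.

boxes = [("Input Embeddings", 20), ("Transformer Block", 240)]  # (label, x position)

parts = ['<svg xmlns="http://www.w3.org/2000/svg" width="440" height="120">']
parts.append(
    '<defs><marker id="arrow" markerWidth="8" markerHeight="8" '
    'refX="6" refY="3" orient="auto"><path d="M0,0 L6,3 L0,6 z"/></marker></defs>'
)
for label, x in boxes:
    # One rectangle plus a centered label per box.
    parts.append(f'<rect x="{x}" y="30" width="180" height="60" fill="none" stroke="black"/>')
    parts.append(f'<text x="{x + 90}" y="65" text-anchor="middle" font-size="14">{label}</text>')
# Arrow from the right edge of the first box toward the second box.
parts.append('<line x1="200" y1="60" x2="234" y2="60" stroke="black" marker-end="url(#arrow)"/>')
parts.append("</svg>")

with open("diagram.svg", "w", encoding="utf-8") as f:
    f.write("\n".join(parts))
```

Open the resulting diagram.svg in a browser and it stays crisp at any zoom, which is exactly where ASCII boxes fall apart.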


3. Data Analysis: Sentiment Tracking for Claude 3.7 📈

Prompt:

"Analyze initial public sentiment on social media toward Claude 3.7 post-launch."

Manus's Work on Claude 3.7 Sentiment Analysis
Devin's Work on Claude 3.7 Sentiment Analysis

My Take: MANUS 3 - DEVIN 1.5

Here, Devin really stepped up. Its website was fairly informative, clearly pulling relevant quotes and organizing the data well. Manus was good too. I felt Devin had a slight edge, but the comparison isn't entirely fair, since we didn't get to see how Manus would have built a website from its reports.


4. WTF Case: Trump vs. Zelensky Simulation 🎭

Prompt:

"Create an interactive simulation where users can role-play as President Zelensky during recent political debates."

Manus's Interactive Simulation
Devin's Interactive Simulation (?)

My Take: MANUS 4 - DEVIN 1.5

Honestly, this was the craziest demo of all, and Manus totally crushed it. Devin's simulation was not good; Manus, on the other hand, produced something actually engaging and (almost) usable. Manus really shines when creativity and interaction come into play.


Bonus Fun: Devin's "Overhype" Analysis 😅

Just for fun, I asked Devin to analyze whether Manus AI is overhyped. Well, Devin got a little heated. Check out the hilarious report, it's a must-read 😂:

Manus is Overhyped, Devin says

My Take: MANUS 4 - DEVIN 1.5 + 🐦

Looks like Devin has some strong opinions about Manus! 😂 It's amusing to see these AI agents "interacting" this way. It makes testing them feel more personal and fun.


Final Thoughts: Who Wins?

Hereā€™s my overall impression:

  • Devin's workflow feels more structured and transparent. You see every step: planning, coding, testing. It's designed for detailed software engineering tasks, especially integrating into existing repos and opening pull requests. But sometimes this detailed approach gets in the way of everything else, and I'm not sure it delivers what they promise. Check this Primeagen video 🐦: https://www.youtube.com/watch?v=QOJSWrSF51o

  • Manus, meanwhile, feels broader, yet it consistently delivered more polished final results in general scenarios (like research, design, or even fun simulations).

Overall, Manus feels really good, at least based on the shared, general-purpose demos. It really impressed me in areas where creativity, interactivity, and quick, usable results matter most. Devin remains a pick for detailed engineering and structured workflows (it still needs improvement), but from my initial, very limited exploration, Manus clearly seems to have the edge in broader, general use cases.

I had a ton of fun putting this comparison together, and I hope you found it helpful! If you have any comments, additional resources, or if you've tried these agents yourself, please share below. I'm planning to explore and compare even more AI agents soon, so let's keep this conversation alive. We're also planning to share a new agent paper and benchmark, so stay tuned 🖖