MyShell dropped their weekly showcase (Apr 10-17), highlighting three transformation pipelines:

1️⃣ Ghibli Gram Filter

Style transfer model that converts portrait photos into Ghibli-style illustrations. Uses aesthetic embeddings trained on Studio Ghibli's visual language — soft gradients, watercolor textures, and characteristic lighting. Inference is fast enough for social media use cases, making it viable for real-time filters or batch processing user uploads.

Technical angle: Likely built on top of ControlNet or a similar diffusion-based architecture, with LoRA fine-tuning on Ghibli frames. The challenge here is maintaining facial identity while applying heavy stylization — probably via face-preservation techniques like IP-Adapter or reference-only control.
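To make the LoRA piece concrete, here's a minimal sketch of the math such a fine-tune relies on. This is an assumption about the approach, not MyShell's actual code: a LoRA adapter learns low-rank factors `A` and `B` for a frozen weight `W`, merged at inference as `W' = W + (alpha / r) * B @ A`. All names (`merge_lora`, the dimensions) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

d_out, d_in, r, alpha = 64, 64, 4, 8   # rank r << d keeps the adapter tiny

W = rng.standard_normal((d_out, d_in))      # frozen base weight
A = rng.standard_normal((r, d_in)) * 0.01   # trainable down-projection
B = np.zeros((d_out, r))                    # up-projection, zero-initialized

def merge_lora(W, A, B, alpha, r):
    """Fold the low-rank adapter into the base weight for inference."""
    return W + (alpha / r) * (B @ A)

W_merged = merge_lora(W, A, B, alpha, r)

# With B zero-initialized, the merged weight equals the base weight,
# so fine-tuning starts from the original model's behavior.
assert np.allclose(W_merged, W)

# The adapter adds only r * (d_in + d_out) parameters per layer,
# which is why a style LoRA ships as a few-MB file.
print(A.size + B.size, "adapter params vs", W.size, "base params")
```

The zero-init on `B` is the standard trick: the adapter is a no-op at step zero, and training only has to learn the *delta* toward the Ghibli style rather than a whole new weight matrix.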

Practical use: High shareability factor for content creators. Could be integrated into camera apps or social platforms as a one-tap filter. Performance benchmarks would be interesting — wondering if they hit sub-2s inference on consumer GPUs.
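Before claiming "sub-2s on consumer GPUs" you'd want a proper latency harness rather than a single timed call. A minimal sketch, with `run_filter` as a hypothetical stand-in for the real stylization pipeline: warm up first (model load, kernel compilation), then report a tail percentile instead of the mean.

```python
import time

def run_filter(image):
    # placeholder workload; the real call would invoke the diffusion pipeline
    time.sleep(0.01)
    return image

def p95_latency(fn, inputs, warmup=3):
    """Measure p95 wall-clock latency of fn over inputs, after warmup calls."""
    for x in inputs[:warmup]:
        fn(x)  # exclude one-time costs from the measurement
    times = []
    for x in inputs:
        t0 = time.perf_counter()
        fn(x)
        times.append(time.perf_counter() - t0)
    times.sort()
    return times[int(0.95 * (len(times) - 1))]

lat = p95_latency(run_filter, ["img"] * 20)
print(f"p95 latency: {lat:.3f}s")
```

Reporting p95 matters for a filter product: a mean of 1.8s can hide a tail that blows past the 2s budget on every tenth upload.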

The other two items in the showcase weren't fully detailed in the announcement, but the theme is clear: specialized transformation models for different visual domains (temporal restoration and cinematic motion were mentioned). MyShell seems to be building a suite of shareable AI tools rather than general-purpose models — smart positioning for viral adoption.