I Tried “Asymmetric Gained Deep Image Compression with Continuous Rate Adaptation” — Here’s How It Felt

You know what? I love simple tools that do smart things. This one sounds heavy. The name is a mouthful. But I used it for a week on my own photos. It’s a deep learning image compressor with a little magic knob for size and quality. And yes, I have thoughts.
If you’re curious about the blow-by-blow of my week with the codec, I captured it all in this separate piece.

What this thing is (in plain talk)

It shrinks pictures with a neural net. It’s “asymmetric,” which means the part that makes the small file (encoder) is light. The part that opens it (decoder) is big and strong. So taking a photo and sending it can be fast, even on a weak device. The person who views it, or the server, does more of the hard work.
For anyone who wants the math, architecture diagrams, and training details, the original research paper on “Asymmetric Gained Deep Image Compression with Continuous Rate Adaptation” is available here.

“Continuous rate adaptation” is the cool bit. There’s a simple control. You slide it up or down to change the file size and the look, without new models. No retrain. No presets. Just a knob. These were my go-to settings most days:

  • rate 0.10 for tiny files
  • rate 0.20 for safe detail
  • rate 0.35 when I cared a lot about hair and grass
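As I understand the paper, that smooth knob comes from scaling the latent with learned gain vectors and interpolating between adjacent ones, rather than training one model per rate. Here's a toy sketch of that interpolation; the function and the example vectors are mine, not the repo's:

```python
def interpolate_gain(gain_lo, gain_hi, l):
    """Blend two adjacent learned gain vectors with exponential
    interpolation: g = g_lo**l * g_hi**(1 - l), for 0 <= l <= 1.
    Sliding l is what hits every rate between two trained rate
    points without retraining."""
    return [a ** l * b ** (1.0 - l) for a, b in zip(gain_lo, gain_hi)]

# Toy gain vectors for two trained rate points (made up for illustration)
g_low_rate  = [0.5, 0.4, 0.6]   # gains at the smaller-file rate point
g_high_rate = [1.2, 1.1, 1.3]   # gains at the larger-file rate point

# l = 1.0 lands exactly on the low-rate point; l = 0.0 on the high-rate one
mid = interpolate_gain(g_low_rate, g_high_rate, 0.5)
```

Everything in between those two trained points is reachable, which is why the knob feels continuous instead of stepped.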

Let me explain “rate.” Think of it as bits per pixel (bpp). Lower means smaller files. Higher means sharper, but bigger.
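The arithmetic behind that is simple: bits per pixel times pixel count, divided by eight bits per byte, gives the rough compressed size. A quick back-of-the-envelope helper:

```python
def estimated_size_kb(bpp, megapixels):
    """Rough compressed size in KB: bpp * pixels / 8 bits-per-byte / 1000."""
    return bpp * megapixels * 1_000_000 / 8 / 1000

# A 12 MP photo at 0.18 bpp lands around 270 KB before container overhead
print(round(estimated_size_kb(0.18, 12)))  # → 270
```

That lines up with the file sizes I report below, give or take header overhead.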
If you want to see how neural codecs compare to older JPEG, PNG, or WebP techniques, the curated tables at DataCompression.info make a great quick reference. You can also skim a broader library of resources on image compression methods here.

My setup

I ran the public PyTorch code on:

  • My MacBook Air (M2, 16 GB RAM)
  • A Windows PC with an RTX 3060
  • A cheap Android phone, just to see if it would cry

The model had two parts. The encoder felt small. The decoder was chunky. On my GPU, it purred. On my phone, it walked.

I did hit one install snag. Torch versions argued. A quick re-install fixed it. Not fun, but not a deal breaker.

Real tests I ran

I used my own stuff. Real life. Messy light. Weird textures. Kids running.

  • Family photo at night (warm kitchen, mixed light)

    • JPEG at medium: 410 KB, soft faces, grain in the wall
    • This model at 0.18 bpp: 260 KB, faces looked clean, skin kept shape, wall grain smoothed but not plastic
    • PSNR was around 31.8 dB. MS-SSIM was 0.963. I know, numbers are dry. But it matched my eyes.
  • Fall leaves by the curb (tons of tiny edges)

    • At 0.15 bpp: ~320 KB from a 12 MP shot
    • Leaf veins held up better than JPEG and WebP at the same size
    • Less blockiness in deep greens; the noise looked “fine,” not speckled
  • A comic page with black line art and text

    • At 0.08 bpp: text got halos; some letters fuzzed
    • I bumped rate to 0.12 bpp: halos gone; lines crisp again
    • File jumped from 95 KB to 140 KB. Worth it.
  • A 4K drone shot with a clean blue sky

    • At 0.10 bpp: very light banding in the sky, but less than what I saw in JPEG
    • At 0.20 bpp: banding was gone; horizon looked smooth
  • A noisy phone pic of my cat on the couch (dim lamp, grain city)

    • At 0.14 bpp: 180 KB, fur kept shape, noise turned soft and kind of film-like
    • WebP at a matched size gave oily patches. This did not.
  • A game UI screenshot (lots of flat colors, sharp edges)

    • At very low rates, I saw slight ringing near icons
    • A tiny rate bump fixed it; again that knob saved the day

I sent a batch of soccer photos to my sister from a bus with bad Wi-Fi. I set the rate to 0.12 bpp and just let it run. Files dropped to about a third of their JPEG size. She said, “These look fine,” which is the real goal, right?

How fast did it go?

On the RTX 3060:

  • Encode: around 35 MP/s at fp16
  • Decode: about 28 MP/s
  • A 12 MP shot took a blink
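Throughput converts straight into wall-clock time: pixels divided by megapixels per second. A sanity check on that 12 MP "blink," using the numbers from my runs above:

```python
def seconds_for(megapixels, throughput_mp_s):
    """Wall-clock estimate: image size over sustained throughput."""
    return megapixels / throughput_mp_s

encode = seconds_for(12, 35)   # roughly a third of a second to encode
decode = seconds_for(12, 28)   # a bit under half a second to decode
print(f"encode {encode:.2f}s, decode {decode:.2f}s")
```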

On the M2 laptop:

  • CPU only, it was slower. A 12 MP image took a few seconds to decode.
  • With Metal acceleration on, it felt 2x to 3x faster

On my Android phone (no GPU tricks):

  • Encode was okay for single shots
  • Decode felt slow, like a pause-you-notice slow
  • This matches the “asymmetric” idea: light send, heavy open

The rate knob felt neat

There’s a single control for rate/quality. In the repo I tried, it was called q (you can also set a target bpp). I liked:

  • 0.08 to 0.12 bpp for quick shares
  • 0.15 to 0.25 bpp for photos I care about
  • 0.30+ bpp for prints or hair-and-grass heavy scenes

It didn’t jump between presets. It slid smoothly. No reloads. That saved time. I could tune per image. A busy street needed more. A sky could go low.
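You could even automate that per-image tuning with a crude busyness heuristic: measure how much neighboring pixels differ and nudge the rate up for busier frames. The thresholds below are my own guesses, not anything from the repo:

```python
def pick_bpp(pixels, width):
    """Choose a target bpp from the mean absolute horizontal gradient.
    `pixels` is a flat row-major list of grayscale values (0-255)."""
    diffs = [
        abs(pixels[i] - pixels[i - 1])
        for i in range(1, len(pixels))
        if i % width != 0          # skip the wrap-around between rows
    ]
    busyness = sum(diffs) / len(diffs)
    if busyness < 4:               # flat sky, soft gradients
        return 0.10
    if busyness < 12:              # ordinary scenes
        return 0.18
    return 0.30                    # hair, grass, leaf piles

flat_sky = [120] * 16              # a 4x4 block with zero detail
assert pick_bpp(flat_sky, 4) == 0.10
```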

Stuff I liked

  • The look at small sizes felt calm. Less block junk. Less “oil paint.”
  • The encoder was light. My laptop didn’t spin up like a jet.
  • That continuous rate control was simple and real. I used it all the time.
  • Skin tones stayed natural at common rates. Reds did not clip hard.
  • Banding was reduced in skies, which is rare at low sizes.

Stuff that bugged me

  • The decoder was heavy. On weak gear, viewing felt slow.
  • Warm-up time on the first run was long. After that, fine.
  • Reds sometimes leaned warm at very low rates. I saw a tiny shift on a red jacket. Not huge, but I noticed.
  • At tiny sizes with sharp UI edges, I saw faint halos. A small rate bump fixed it, but still.
  • Model files were big. Not phone-friendly without work.
  • No native viewer. I had to script my own batch tool.

Who should try it

  • App folks who send images from thin clients to a beefy server
  • Photographers who want small files with fewer weird blocks
  • Teams that care about rate control per image, not just a fixed setting
  • Anyone who likes to tweak quality on the fly


Who might pass

  • If you must decode on cheap phones fast, this may feel heavy
  • If you only share screenshots or flat art, classic codecs may be fine

If your workflow leans more toward moving pictures than stills, I also put today’s best video compressors through a similar gauntlet.

Little tips from my week

  • If text looks fuzzy, nudge the rate up just a hair. It fixes halos fast.
  • For skin, stay near 0.18 to 0.25 bpp if you can. It looks kind.
  • On GPU, use mixed precision. It gave me a speed bump with no loss I could see.
  • Batch your images. The first one is slow. Then it speeds up.
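That last tip is worth measuring rather than guessing. A tiny harness like this (the names are mine, and the stand-in job below fakes the encode call) times a batch but throws out the first run, which pays the warm-up cost:

```python
import time

def time_batch(process, items):
    """Time each item; report the steady-state average, dropping the
    first run, which pays warm-up costs (model load, caches)."""
    per_item = []
    for item in items:
        start = time.perf_counter()
        process(item)
        per_item.append(time.perf_counter() - start)
    steady = per_item[1:] or per_item   # drop the warm-up run if we can
    return sum(steady) / len(steady)

# Stand-in for a real encode call; the sleep mimics per-image work
avg = time_batch(lambda _: time.sleep(0.001), range(5))
```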

My bottom line

This thing made small files that still looked like my photos. The rate knob felt like a real tool, not a toy. The heavy decoder is the trade-off. For me, that’s fine on a laptop or server. On a budget phone, not so much.

Would I keep it in my kit? Yes. For travel photos, for family shots in bad light, for big leaf piles in fall with all that fussy detail, this kept the feel without the junk. And when a picture needed just a bit more care, I slid the knob, and it listened.