Rate-distortion optimization for transformer inference
Split computing for language models, extending the theory of usable information.
Isolate the common information between two dependent computer vision tasks.
Theoretical considerations and evaluation of split and distillation points.
Task reconstruction loss acts as a regularizer, improving rate-distortion performance in coding for humans and machines.
Improving the shared channel in coding for machines (CfM).
A comparison between conditional and residual entropy codecs for a two-channel system of tasks with nested information.