Mochiai.blog

Tag: LLM inference library

Tencent Hunyuan Releases HPC-Ops: A High Performance LLM Inference Operator Library
Categories: Artificial intelligence


  • By Michal Sutter
  • January 28, 2026

Tencent Hunyuan has open-sourced HPC-Ops, a production-grade operator library for large language model inference. HPC-Ops focuses on low-level CUDA…

Read More



