Server Update 2025-01-11
(This message is synchronized from Discord.)
The MJAPI server has recently received further software architecture updates, and model inference efficiency has improved significantly. According to benchmark results, the server can now reliably support at least 1000 concurrent users on the standard model without a GPU. The medium and mini models can support more, roughly 2000 to 3000 concurrent users. The user limit may be raised later for further testing.