Merge remote-tracking branch 'upstream/main' into develop-0.1.2

chenxl 2024-08-12 12:31:49 +00:00
commit 650c368c18
2 changed files with 11 additions and 5 deletions


@@ -82,11 +82,12 @@ Some preparation:
<h3>Installation</h3>
1. Use a Docker image, see [documentation for Docker](./doc/en/docker.md)
2. You can install using Pypi:
2. You can install using PyPI (for Linux):
```
pip install ktransformers --no-build-isolation
```
For Windows, we provide a pre-compiled whl package: [ktransformers-0.1.1+cu125torch24avx2-cp311-cp311-win_amd64.whl](https://github.com/kvcache-ai/ktransformers/releases/download/v0.1.1/ktransformers-0.1.1+cu125torch24avx2-cp311-cp311-win_amd64.whl), which requires CUDA 12.5, Torch 2.4, and Python 3.11; more pre-compiled packages are being produced. An example install command for the wheel is shown after these installation steps.
3. Or you can download the source code and compile it:
- init source code
@@ -97,11 +98,16 @@ Some preparation:
git submodule update
```
- [Optional] If you want to run with the website, please [compile the website](./doc/en/api/server/website.md) before executing ```bash install.sh```
- Compile and install
- Compile and install (for Linux)
```
bash install.sh
```
- Compile and install (for Windows)
```
install.bat
```
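
For Windows users who downloaded the pre-compiled wheel linked in step 2, installation is a standard pip install of the local file. The command below is an illustration, assuming the wheel sits in the current directory; adjust the filename to the release you actually downloaded:
```
pip install ktransformers-0.1.1+cu125torch24avx2-cp311-cp311-win_amd64.whl
```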
<h3>Local Chat</h3>
We provide a simple command-line local chat Python script that you can run for testing.
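The exact script name and flags are not part of this diff; as an illustration only (assuming a `local_chat.py` entry point that takes the model and GGUF weight paths), a test run might look like:
```
python ./ktransformers/local_chat.py --model_path <path_to_hf_model> --gguf_path <path_to_gguf_weights>
```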


@@ -4,7 +4,7 @@
* @Date : 2024-07-16 10:43:18
* @Version : 1.0.0
* @LastEditors : chenxl
* @LastEditTime : 2024-08-08 04:23:51
* @LastEditTime : 2024-08-12 12:28:25
* @Copyright (c) 2024 by KVCache.AI, All Rights Reserved.
**/
#ifndef CPUINFER_TASKQUEUE_H
@@ -51,7 +51,7 @@ public:
#ifdef _WIN32
ReleaseMutex(global_mutex);
#else
global_mutex.lock();
global_mutex.unlock();
#endif
}
};
@@ -74,4 +74,4 @@ class TaskQueue {
std::atomic<bool> sync_flag;
std::atomic<bool> exit_flag;
};
#endif
#endif
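
The taskqueue.h change above fixes the non-Windows release path, which previously called `lock()` where it should have called `unlock()`, so a thread releasing the global mutex would instead block on it. Below is a minimal standalone sketch of the paired acquire/release this restores, using only `std::mutex` rather than the project's actual wrapper; the Win32 branch with `ReleaseMutex` is omitted:
```cpp
#include <mutex>

std::mutex global_mutex;  // stand-in for the header's global mutex

void acquire_global() {
    global_mutex.lock();    // take ownership before touching shared state
}

void release_global() {
    // Must be unlock(): calling lock() here (as the old code did) tries to
    // re-acquire an already-held mutex and blocks the releasing thread.
    global_mutex.unlock();
}

int main() {
    acquire_global();
    // ... critical section ...
    release_global();
    return 0;
}
```
In practice, a `std::lock_guard` or `std::unique_lock` avoids this class of bug by tying the release to scope exit.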