The shared_cache package provides an in-memory storage solution that can be used with FastAPI or any other Python application requiring efficient inter-process caching. Internally, it uses the built-in multiprocessing.shared_memory module together with msgpack for serialization.
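Under the hood, the idea is to serialize values with msgpack and place them in a named shared-memory block that other processes can attach to. The following minimal sketch illustrates that mechanism only; it is not the package's actual implementation, and the block name, size, and length-prefix framing are assumptions made for the example.

# sketch.py -- illustrative only, not the shared_cache internals
from multiprocessing import shared_memory
import msgpack

# Create a named shared-memory block that any process on the host can attach to.
shm = shared_memory.SharedMemory(name="demo_cache", create=True, size=4096)

# Serialize a value with msgpack and copy it into the shared buffer,
# prefixing it with a 4-byte length so readers know how many bytes to decode.
payload = msgpack.packb({"greeting": "hello"})
shm.buf[:4] = len(payload).to_bytes(4, "little")
shm.buf[4:4 + len(payload)] = payload

# Another process can attach by name and decode the same bytes.
reader = shared_memory.SharedMemory(name="demo_cache")
length = int.from_bytes(reader.buf[:4], "little")
value = msgpack.unpackb(bytes(reader.buf[4:4 + length]))
print(value)  # {'greeting': 'hello'}

reader.close()
shm.close()
shm.unlink()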
pip install shared_cache
# main.py
from typing import Any

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

from shared_cache import Cache


class Item(BaseModel):
    key: str
    value: Any


app = FastAPI()
cache = Cache(maxsize=1_280_000)  # maximum cache size in bytes
@app.on_event("startup")
async def startup_event():
await cache.clear() # Ensure cache is clear at startup
@app.post("/set")
async def set_item(item: Item):
await cache.set(item.key, item.value)
return {"message": "Item set successfully"}
@app.get("/get/{key}")
async def get_item(key: string):
value = await cache.get(key)
if value is None:
raise HTTPException(status_code=404, detail="Item not found")
return {"key": key, "value": value}
@app.delete("/delete/{key}")
async def delete_item(key: Hashable):
await cache.delete(key)
return {"message": "Item deleted successfully"}
# In both cases below, all 4 workers will have access to the same cache
uvicorn main:app --workers 4
# or
gunicorn --workers 4 -k uvicorn.workers.UvicornWorker main:app
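With the server running, you can exercise the endpoints from a separate process. The short client below is illustrative; it uses the third-party requests library and assumes the app listens on the default 127.0.0.1:8000.

# client_demo.py -- illustrative client; assumes the app is running on 127.0.0.1:8000
import requests

BASE = "http://127.0.0.1:8000"

# Store a value; because the cache lives in shared memory, any worker can serve later reads.
requests.post(f"{BASE}/set", json={"key": "answer", "value": 42}).raise_for_status()

# Read it back (possibly handled by a different worker process).
print(requests.get(f"{BASE}/get/answer").json())        # {'key': 'answer', 'value': 42}

# Delete it, then confirm the 404.
requests.delete(f"{BASE}/delete/answer")
print(requests.get(f"{BASE}/get/answer").status_code)   # 404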
To test the shared_cache
package, you can use the provided test suite. The tests include inter-process scenarios to verify that the cache behaves correctly across multiple worker processes.
- Run the Tests:
pytest tests/test_between_workers.py
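A test along these lines might look like the minimal sketch below. It is not the actual contents of tests/test_between_workers.py; it assumes that a Cache created with the same configuration in another process attaches to the same shared-memory region, which is the package's premise, and the helper names are illustrative.

# test_between_workers_sketch.py -- illustrative sketch, not the real test suite
import asyncio
import multiprocessing

from shared_cache import Cache


def _write_from_child(key, value):
    # Runs in a child process and writes through its own Cache handle.
    cache = Cache(maxsize=1_280_000)
    asyncio.run(cache.set(key, value))


def test_value_visible_across_processes():
    # Write the value from a separate process ...
    proc = multiprocessing.Process(target=_write_from_child, args=("shared-key", 123))
    proc.start()
    proc.join()

    # ... then read it back from this process.
    cache = Cache(maxsize=1_280_000)
    assert asyncio.run(cache.get("shared-key")) == 123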
Contributions are welcome! Please open an issue or submit a pull request on the GitHub repository.
This project is licensed under the MIT License.
For any questions or inquiries, please contact [email protected].