Processing memory with status, capturing to, get list and more. #930
Conversation
Walkthrough
This comprehensive update introduces new functionality for processing memories, including status tracking and time capturing. It enhances the backend with new endpoints and models, while the frontend receives UI updates to reflect memory processing status. The changes also improve state management and event handling, providing a more robust and user-friendly experience.
Files selected (5)
- backend/database/processing_memories.py (2)
- backend/models/processing_memory.py (2)
- backend/routers/processing_memories.py (2)
- backend/routers/transcribe.py (4)
- backend/utils/processing_memories.py (3)
Review comments generated (16)
- Review: 11
- LGTM: 5
Tips
Chat with Entelligence.AI Bot (@Entelligence.AI)
- Reply on review comments left by this bot to ask follow-up questions. A review comment is a comment on a diff or a file.
- Invite the bot into a review comment chain by tagging @Entelligence.AI in a reply.
Code suggestions
- The bot may make code suggestions, but please review them carefully before committing since the line number ranges may be misaligned.
- You can edit the comment made by the bot and manually tweak the suggestion if it is slightly off.
Pausing incremental reviews
- Add @Entelligence.AI: ignore anywhere in the PR description to pause further reviews from the bot.
backend/utils/processing_memories.py
Outdated
def get_processing_memory(uid: str, id: str, ) -> BasicProcessingMemory:
    processing_memory = processing_memories_db.get_processing_memory_by_id(uid, id)
    if not processing_memory:
        print("processing memory is not found")
        return
    processing_memory = BasicProcessingMemory(**processing_memory)

    return processing_memory
There is a lack of error handling here. If the processing memory is not found, it simply prints a message and returns None. This could lead to unexpected behavior in the calling function if it expects a BasicProcessingMemory object. It would be better to raise an exception when the processing memory is not found.
- print("processing memory is not found")
- return
+ raise ValueError("Processing memory not found for given uid and id.")
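As a concrete illustration of that suggestion, here is a minimal sketch of the utility with the exception-raising behavior; it assumes the same processing_memories_db helper and BasicProcessingMemory model shown in the snippet above, so treat it as illustrative rather than a drop-in patch.

def get_processing_memory(uid: str, id: str) -> BasicProcessingMemory:
    processing_memory = processing_memories_db.get_processing_memory_by_id(uid, id)
    if not processing_memory:
        # Fail loudly instead of returning None so callers cannot silently misuse the result.
        raise ValueError(f"Processing memory {id} not found for user {uid}")
    return BasicProcessingMemory(**processing_memory)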
backend/models/processing_memory.py
Outdated
    capturing_to: Optional[datetime] = None
    status: Optional[ProcessingMemoryStatus] = None
    geolocation: Optional[Geolocation] = None
    emotional_feedback: Optional[bool] = False

    memory_id: Optional[str] = None


class UpdateProcessingMemory(BaseModel):
    id: Optional[str] = None
    capturing_to: Optional[datetime] = None
    geolocation: Optional[Geolocation] = None
    emotional_feedback: Optional[bool] = False


class UpdateProcessingMemoryResponse(BaseModel):
    result: BasicProcessingMemory


class BasicProcessingMemoryResponse(BaseModel):
    result: BasicProcessingMemory


class BasicProcessingMemoriesResponse(BaseModel):
    result: [BasicProcessingMemory]
The addition of the capturing_to and status fields in the UpdateProcessingMemory model is consistent with the changes made to the ProcessingMemory model. The new response models UpdateProcessingMemoryResponse, BasicProcessingMemoryResponse, and BasicProcessingMemoriesResponse have been added correctly. However, there seems to be a typo in the result field of the BasicProcessingMemoriesResponse model. It should be a list of BasicProcessingMemory objects, not a single object.
- result: [BasicProcessingMemory]
+ result: List[BasicProcessingMemory]
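For reference, a corrected model along those lines might look like the sketch below; the typing.List import is the only addition beyond what the diff shows, and BasicProcessingMemory is assumed to be the Pydantic model defined earlier in this file.

from typing import List

from pydantic import BaseModel


class BasicProcessingMemoriesResponse(BaseModel):
    # The endpoint returns several memories, so the field is a typed list
    # (List[...] from typing, or list[...] on Python 3.9+) rather than a bare-list annotation.
    result: List[BasicProcessingMemory] = []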
@router.get("/v1/processing-memories/{processing_memory_id}", response_model=BasicProcessingMemoryResponse,
            tags=['processing_memories'])
def get_processing_memory(
        processing_memory_id: str,
        uid: str = Depends(auth.get_current_user_uid)
):
    """
    Get ProcessingMemory endpoint.
    :param processing_memory_id:
    :param uid: user id.
    :return: The processing_memory.
    """

    update_processing_memory.id = processing_memory_id
    processing_memory = processing_memory_utils.get_processing_memory(uid, processing_memory_id)
    if not processing_memory:
        raise HTTPException(status_code=404, detail="Processing memory not found")

    return BasicProcessingMemoryResponse(result=processing_memory)


@router.get("/v1/processing-memories", response_model=BasicProcessingMemoriesResponse,
            tags=['processing_memories'])
def list_processing_memory(uid: str = Depends(auth.get_current_user_uid),):
    """
    List ProcessingMemory endpoint.
    :param uid: user id.
    :return: The list of processing_memories.
    """
    processing_memories = processing_memory_utils.get_processing_memories(uid)
    if not processing_memories or len(processing_memories) == 0:
        return BasicProcessingMemoriesResponse(result=[])

    return BasicProcessingMemoriesResponse(result=list(processing_memories))
In the get_processing_memory function, there seems to be a typo on line 47 where you're trying to set the id on update_processing_memory instead of a new variable. This will likely cause an error as update_processing_memory is not defined in this context. You probably meant to set the id on a new instance of BasicProcessingMemory or similar.
- 47: update_processing_memory.id = processing_memory_id
+ 47: basic_processing_memory.id = processing_memory_id
processing_memory = processing_memory_utils.get_processing_memory(uid, processing_memory_id)
Also, similar to the update_processing_memory function, the get_processing_memory function doesn't check if the user has the right to access the processing memory with the given ID. You should add a check to ensure that the user is the owner of the processing memory before returning it.
processing_memory = processing_memory_utils.get_processing_memory(uid, processing_memory_id)
+ if processing_memory.uid != uid:
+ raise HTTPException(status_code=403, detail="User does not have permission to access this processing memory")
if not processing_memory:
raise HTTPException(status_code=404, detail="Processing memory not found")
In the list_processing_memory function, you're returning an empty list when there are no processing memories. This is fine, but it might be more informative to return a message indicating that there are no processing memories instead of just an empty list.
return BasicProcessingMemoriesResponse(result=[])
+ return BasicProcessingMemoriesResponse(message="No processing memories found", result=[])
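Putting the first two fixes together, the endpoint could look roughly like the sketch below: the stray update_processing_memory assignment is dropped and the lookup stays scoped to the authenticated user. The router, auth dependency, utility module, and response model are the ones quoted above, so this is a sketch of intent rather than the PR's final code.

from fastapi import APIRouter, Depends, HTTPException

router = APIRouter()


@router.get("/v1/processing-memories/{processing_memory_id}", response_model=BasicProcessingMemoryResponse,
            tags=['processing_memories'])
def get_processing_memory(processing_memory_id: str, uid: str = Depends(auth.get_current_user_uid)):
    # The utility queries inside the user's own Firestore subcollection,
    # so uid scoping doubles as the ownership check the review asks for.
    processing_memory = processing_memory_utils.get_processing_memory(uid, processing_memory_id)
    if not processing_memory:
        raise HTTPException(status_code=404, detail="Processing memory not found")
    return BasicProcessingMemoryResponse(result=processing_memory)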
def get_processing_memories(uid: str, statuses: [str], limit: int = 5):
    processing_memories_ref = (
        db.collection('users').document(uid).collection('processing_memories')
    )
    if len(statuses) > 0:
        processing_memories_ref = processing_memories_ref.where(filter=FieldFilter('status', 'in', statuses))
    processing_memories_ref = processing_memories_ref.order_by('created_at', direction=firestore.Query.DESCENDING)
    processing_memories_ref = processing_memories_ref.limit(limit)
    return [doc.to_dict() for doc in processing_memories_ref.stream()]
The get_processing_memories function is missing error handling for the Firestore operations. If there's an issue with the Firestore operation (e.g., network issues, permission issues), the function will throw an exception and potentially crash the application. It would be better to wrap these operations in a try/except block and handle any exceptions that may occur.

def get_processing_memories(uid: str, statuses: [str], limit: int = 5):
    processing_memories_ref = (
        db.collection('users').document(uid).collection('processing_memories')
    )
    if len(statuses) > 0:
        processing_memories_ref = processing_memories_ref.where(filter=FieldFilter('status', 'in', statuses))
    processing_memories_ref = processing_memories_ref.order_by('created_at', direction=firestore.Query.DESCENDING)
    processing_memories_ref = processing_memories_ref.limit(limit)
-   return [doc.to_dict() for doc in processing_memories_ref.stream()]
+   try:
+       return [doc.to_dict() for doc in processing_memories_ref.stream()]
+   except Exception as e:
+       # Log the error and re-raise or handle it appropriately.
+       raise
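If the helper re-raises, the caller still has to decide how the failure surfaces; a hedged sketch of what that could look like one layer up is shown below. The HTTPException pattern mirrors the handlers quoted elsewhere in this PR, and the wrapper name list_processing_memories_for_user is hypothetical.

from fastapi import HTTPException


def list_processing_memories_for_user(uid: str):
    try:
        # Same db helper signature as in the snippet above: (uid, statuses, limit).
        memories = get_processing_memories(uid, statuses=['processing'], limit=5)
    except Exception:
        # Translate storage failures into a clean API error instead of a raw 500 traceback.
        raise HTTPException(status_code=500, detail="Could not load processing memories")
    return memories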
Files selected (6)
- app/lib/backend/http/api/processing_memories.dart (1)
- app/lib/backend/schema/memory.dart (4)
- app/lib/pages/memories/widgets/capture.dart (1)
- app/lib/pages/memories/widgets/processing_capture.dart (4)
- app/lib/providers/capture_provider.dart (5)
- backend/models/processing_memory.py (2)
Files skipped from review due to trivial changes (1)
- app/lib/pages/memories/widgets/capture.dart
Review comments generated (19)
- Review: 15
- LGTM: 4
// Processing memory
var provider = context.read<CaptureProvider>();
bool havingCapturingMemory = provider.capturingProcessingMemory != null;
from enum import Enum

from pydantic import BaseModel

from models.memory import Geolocation
from models.transcript_segment import TranscriptSegment


class ProcessingMemoryStatus(str, Enum):
    Capturing = 'capturing'
    Processing = 'processing'
    Done = 'done'
The introduction of the ProcessingMemoryStatus enum is a good practice as it provides a clear and consistent way to represent the different statuses of processing memories. However, it would be beneficial to add some comments explaining what each status represents for better maintainability.

+ # Enum to represent the different statuses of processing memories
class ProcessingMemoryStatus(str, Enum):
    Capturing = 'capturing'  # Memory is currently being captured
    Processing = 'processing'  # Memory capture is complete and is now being processed
    Done = 'done'  # Processing of memory is complete
class ProcessingMemory(BaseModel):
    id: str
    session_id: Optional[str] = None
    session_ids: List[str] = []
    audio_url: Optional[str] = None
    created_at: datetime
    capturing_to: Optional[datetime] = None
    status: Optional[ProcessingMemoryStatus] = None
    capturing_to: Optional[datetime] = None
    status: Optional[ProcessingMemoryStatus] = None
    id: Optional[str] = None
    capturing_to: Optional[datetime] = None
app/lib/backend/schema/memory.dart
Outdated
class ServerProcessingMemory {
  final String id;
  final DateTime createdAt;
  final DateTime? startedAt;
  final DateTime? capturingTo;
  final ServerProcessingMemoryStatus? status;

  ServerProcessingMemory({
    required this.id,
    required this.createdAt,
    this.startedAt,
    this.capturingTo,
    this.status,
  });

  factory ServerProcessingMemory.fromJson(Map<String, dynamic> json) {
    return ServerProcessingMemory(
      id: json['id'],
      createdAt: DateTime.parse(json['created_at']).toLocal(),
      startedAt: json['started_at'] != null ? DateTime.parse(json['started_at']).toLocal() : null,
      capturingTo: json['capturing_to'] != null ? DateTime.parse(json['capturing_to']).toLocal() : null,
      status: json['status'] != null ? ServerProcessingMemoryStatus.valuesFromString(json['status']) : null,
      'capturing_to': capturingTo?.toUtc().toIso8601String(),
      'status': status.toString(),
    // Fetch and watch capturing status
    ProcessingMemoryResponse? result = await fetchProcessingMemoryServer(
      id: processingMemoryId,
    );
    if (result?.result == null) {
      debugPrint("Can not fetch processing memory, result null");
    }
    _setCapturingProcessingMemory(result?.result);

    // Update processing memory
    _updateProcessingMemory();
  }

  void _setCapturingProcessingMemory(ServerProcessingMemory? pm) {
    debugPrint("${pm?.toJson()}");
    if (pm != null &&
        pm.status == ServerProcessingMemoryStatus.capturing &&
        pm.capturingTo != null &&
        pm.capturingTo!.isAfter(DateTime.now())) {
      capturingProcessingMemory = pm;

      // TODO: thinh, might set pre-segments
    } else {
      capturingProcessingMemory = null;
    }
    notifyListeners();

    // End
    if (capturingProcessingMemory == null) {
      return;
    }

    // Or watch
    var id = capturingProcessingMemory!.id;
    var delayMs = capturingProcessingMemory?.capturingTo != null
        ? capturingProcessingMemory!.capturingTo!.millisecondsSinceEpoch - DateTime.now().millisecondsSinceEpoch
        : 2 * 60 * 1000; // 2m
    if (delayMs > 0) {
      _processingMemoryWatchTimer?.cancel();
      _processingMemoryWatchTimer = Timer(Duration(milliseconds: delayMs), () async {
        ProcessingMemoryResponse? result = await fetchProcessingMemoryServer(id: id);
        if (result?.result == null) {
          debugPrint("Can not fetch processing memory, result null");
          return;
        }
        _setCapturingProcessingMemory(result?.result);
      });
    }
  }
The _onProcessingMemoryCreated method has been significantly expanded to include fetching and watching the capturing status of processing memory. However, there's a potential issue with the timer setup for watching status updates. The timer is set based on the capturingTo field of the processing memory, but it doesn't account for possible delays in network communication or server processing. If the capturingTo time is very close to the current time, the timer might fire after the capturing has already ended. To mitigate this, consider adding a buffer time to the delay.
- var delayMs = capturingProcessingMemory?.capturingTo != null
- ? capturingProcessingMemory!.capturingTo!.millisecondsSinceEpoch - DateTime.now().millisecondsSinceEpoch
- : 2 * 60 * 1000; // 2m
+ var delayMs = capturingProcessingMemory?.capturingTo != null
+ ? max(0, capturingProcessingMemory!.capturingTo!.millisecondsSinceEpoch - DateTime.now().millisecondsSinceEpoch - BUFFER_TIME_MS)
+ : 2 * 60 * 1000; // 2m
Where BUFFER_TIME_MS is a constant representing the buffer time in milliseconds.
  Future<ProcessingMemoryResponse?> fetchProcessingMemoryServer({
    required String id,
  }) async {
    var response = await makeApiCall(
      url: '${Env.apiBaseUrl}v1/processing-memories/$id',
      headers: {},
      method: 'GET',
      body: "",
    );
    if (response == null) return null;
    debugPrint('updateProcessingMemoryServer: ${response.body}');
    if (response.statusCode == 200) {
      return ProcessingMemoryResponse.fromJson(jsonDecode(response.body));
    } else {
      // TODO: Server returns 304 doesn't recover
      CrashReporting.reportHandledCrash(
        Exception('Failed to create memory'),
        StackTrace.current,
        level: NonFatalExceptionLevel.info,
        userAttributes: {
          'response': response.body,
        },
      );
    }
    return null;
  }
The fetchProcessingMemoryServer function is making an API call without any authentication headers. This could lead to unauthorized access to sensitive data. Please ensure that the necessary authorization headers are included in the API call.

Also, the error handling here seems to be incomplete. The code only handles the case where the status code is 200 and doesn't handle other potential HTTP status codes like 400 (Bad Request), 401 (Unauthorized), 403 (Forbidden), 404 (Not Found), etc. It's important to handle these cases to provide more specific error messages and potentially recover from errors when possible.

Future<ProcessingMemoryResponse?> fetchProcessingMemoryServer({
  required String id,
}) async {
  var response = await makeApiCall(
    url: '${Env.apiBaseUrl}v1/processing-memories/$id',
-   headers: {},
+   headers: {
+     'Authorization': 'Bearer ${your_token_here}',
+   },
    method: 'GET',
    body: "",
  );
  if (response == null) return null;
  debugPrint('updateProcessingMemoryServer: ${response.body}');
  if (response.statusCode == 200) {
    return ProcessingMemoryResponse.fromJson(jsonDecode(response.body));
  } else if (response.statusCode == 400) {
    // Handle bad request
  } else if (response.statusCode == 401) {
    // Handle unauthorized
  } else if (response.statusCode == 403) {
    // Handle forbidden
  } else if (response.statusCode == 404) {
    // Handle not found
  } else {
    // TODO: Server returns 304 doesn't recover
    CrashReporting.reportHandledCrash(
      Exception('Failed to create memory'),
      StackTrace.current,
      level: NonFatalExceptionLevel.info,
      userAttributes: {
        'response': response.body,
      },
    );
  }
  return null;
}

Please replace 'Bearer ${your_token_here}' with the actual token.
Files selected (5)
- app/lib/backend/schema/memory.dart (4)
- app/lib/providers/capture_provider.dart (8)
- backend/models/processing_memory.py (2)
- backend/routers/processing_memories.py (2)
- backend/utils/processing_memories.py (3)
Review comments generated (21)
- Review: 15
- LGTM: 6
backend/utils/processing_memories.py
Outdated
def get_processing_memory(uid: str, id: str, ) -> DetailProcessingMemory:
    processing_memory = processing_memories_db.get_processing_memory_by_id(uid, id)
    if not processing_memory:
        print("processing memory is not found")
        return
    processing_memory = DetailProcessingMemory(**processing_memory)

    return processing_memory


def get_processing_memories(uid: str) -> [BasicProcessingMemory]:
    processing_memories = processing_memories_db.get_processing_memories(uid, statuses=[ProcessingMemoryStatus.Processing], limit=5)
    if not processing_memories or len(processing_memories) == 0:
        return []

    return [BasicProcessingMemory(**processing_memory) for processing_memory in processing_memories]
The function get_processing_memory does not handle the case when a processing memory is not found. It simply prints a message and returns None. This could lead to unexpected behavior in the calling code if it expects a DetailProcessingMemory object. Consider raising an exception instead.
- print("processing memory is not found")
- return
+ raise ValueError("Processing memory not found")
@router.get("/v1/processing-memories/{processing_memory_id}", response_model=DetailProcessingMemoryResponse,
            tags=['processing_memories'])
def get_processing_memory(
        processing_memory_id: str,
        uid: str = Depends(auth.get_current_user_uid)
):
    """
    Get ProcessingMemory endpoint.
    :param processing_memory_id:
    :param uid: user id.
    :return: The processing_memory.
    """

    update_processing_memory.id = processing_memory_id
    processing_memory = processing_memory_utils.get_processing_memory(uid, processing_memory_id)
    if not processing_memory:
        raise HTTPException(status_code=404, detail="Processing memory not found")

    return DetailProcessingMemoryResponse(result=processing_memory)


@router.get("/v1/processing-memories", response_model=BasicProcessingMemoriesResponse,
            tags=['processing_memories'])
def list_processing_memory(uid: str = Depends(auth.get_current_user_uid),):
    """
    List ProcessingMemory endpoint.
    :param uid: user id.
    :return: The list of processing_memories.
    """
    processing_memories = processing_memory_utils.get_processing_memories(uid)
    if not processing_memories or len(processing_memories) == 0:
        return BasicProcessingMemoriesResponse(result=[])

    return BasicProcessingMemoriesResponse(result=list(processing_memories))
In the function get_processing_memory, there seems to be a copy-paste error. The line update_processing_memory.id = processing_memory_id should not be present in this function as it is not relevant here. Please remove this line.
- 47: update_processing_memory.id = processing_memory_id
Also, in the function list_processing_memory, you are checking if processing_memories is None or empty. If get_processing_memories always returns a list (even an empty one), you can simplify the check by removing the not processing_memories part.
- 63: if not processing_memories or len(processing_memories) == 0:
+ 63: if len(processing_memories) == 0:
app/lib/backend/schema/memory.dart
Outdated
class ServerProcessingMemory {
  final String id;
  final DateTime createdAt;
  final DateTime? startedAt;
  final DateTime? capturingTo;
  final ServerProcessingMemoryStatus? status;
  final List<TranscriptSegment> transcriptSegments;

  ServerProcessingMemory({
    required this.id,
    required this.createdAt,
    this.startedAt,
    this.capturingTo,
    this.status,
    this.transcriptSegments = const [],
  });

  factory ServerProcessingMemory.fromJson(Map<String, dynamic> json) {
    return ServerProcessingMemory(
      id: json['id'],
      createdAt: DateTime.parse(json['created_at']).toLocal(),
      startedAt: json['started_at'] != null ? DateTime.parse(json['started_at']).toLocal() : null,
      capturingTo: json['capturing_to'] != null ? DateTime.parse(json['capturing_to']).toLocal() : null,
      status: json['status'] != null ? ServerProcessingMemoryStatus.valuesFromString(json['status']) : null,
      transcriptSegments: ((json['transcript_segments'] ?? []) as List<dynamic>)
          .map((segment) => TranscriptSegment.fromJson(segment))
          .toList(),
The ServerProcessingMemory class has been updated with new fields capturingTo and status. The fromJson factory method has also been updated to parse these new fields from JSON. This is a good change as it allows the application to handle more detailed information about processing memories.

However, there's a potential issue with null safety. If the status field in the JSON is null or contains an unrecognized value, ServerProcessingMemoryStatus.valuesFromString(json['status']) will return null or unknown, which might not be the intended behavior. Consider throwing an error when the status is null or unrecognized to avoid silent failures.
- status: json['status'] != null ? ServerProcessingMemoryStatus.valuesFromString(json['status']) : null,
+ status: json['status'] != null ? ServerProcessingMemoryStatus.valuesFromString(json['status']) : throw SomeException("Invalid or missing status"),
      'capturing_to': capturingTo?.toUtc().toIso8601String(),
      'status': status.toString(),
      'transcript_segments': transcriptSegments.map((segment) => segment.toJson()).toList(),
backend/models/processing_memory.py
Outdated
    capturing_to: Optional[datetime] = None
    status: Optional[ProcessingMemoryStatus] = None
    geolocation: Optional[Geolocation] = None
    emotional_feedback: Optional[bool] = False

    memory_id: Optional[str] = None


class DetailProcessingMemory(BaseModel):
    id: str
    timer_start: float
    created_at: datetime
    capturing_to: Optional[datetime] = None
    status: Optional[ProcessingMemoryStatus] = None
    geolocation: Optional[Geolocation] = None
    emotional_feedback: Optional[bool] = False
    transcript_segments: List[TranscriptSegment] = []
    memory_id: Optional[str] = None


class UpdateProcessingMemory(BaseModel):
    id: Optional[str] = None
    capturing_to: Optional[datetime] = None
    geolocation: Optional[Geolocation] = None
    emotional_feedback: Optional[bool] = False


class UpdateProcessingMemoryResponse(BaseModel):
    result: BasicProcessingMemory


class DetailProcessingMemoryResponse(BaseModel):
    result: DetailProcessingMemory


class BasicProcessingMemoryResponse(BaseModel):
    result: BasicProcessingMemory


class BasicProcessingMemoriesResponse(BaseModel):
    result: List[BasicProcessingMemory]
The addition of capturing_to and status fields in other models (DetailProcessingMemory, UpdateProcessingMemory) and response models is consistent with the changes made to the ProcessingMemory model. This will allow the frontend to receive and send data related to the capturing time and status of processing memories. However, make sure to update all relevant API endpoints and database queries to handle these new fields correctly.

class DetailProcessingMemory(BaseModel):
    id: str
    timer_start: float
    created_at: datetime
+   capturing_to: Optional[datetime] = None
+   status: Optional[ProcessingMemoryStatus] = None
    geolocation: Optional[Geolocation] = None
    emotional_feedback: Optional[bool] = False
    transcript_segments: List[TranscriptSegment] = []
    memory_id: Optional[str] = None

class UpdateProcessingMemory(BaseModel):
    id: Optional[str] = None
+   capturing_to: Optional[datetime] = None
    geolocation: Optional[Geolocation] = None
    emotional_feedback: Optional[bool] = False

class UpdateProcessingMemoryResponse(BaseModel):
+   result: BasicProcessingMemory

class DetailProcessingMemoryResponse(BaseModel):
+   result: DetailProcessingMemory

class BasicProcessingMemoryResponse(BaseModel):
+   result: BasicProcessingMemory

class BasicProcessingMemoriesResponse(BaseModel):
+   result: List[BasicProcessingMemory]
    // Fetch and watch capturing status
    ProcessingMemoryResponse? result = await fetchProcessingMemoryServer(
      id: processingMemoryId,
    );
    if (result?.result == null) {
      debugPrint("Can not fetch processing memory, result null");
    }
    _setCapturingProcessingMemory(result?.result);

    // Set pre-segements
    if (capturingProcessingMemory != null && (capturingProcessingMemory?.transcriptSegments ?? []).isNotEmpty) {
      List<TranscriptSegment> transcriptSegments = capturingProcessingMemory!.transcriptSegments;
      if (segments.isNotEmpty) {
        int idx = transcriptSegments.indexWhere((e) => e.end > segments.first.start) - 1;
        if (idx >= 0) {
          transcriptSegments = transcriptSegments.sublist(0, idx);
        }
        transcriptSegments.addAll(segments);
      }
      segments = transcriptSegments;
      setHasTranscripts(segments.isNotEmpty);
    }

    // Update processing memory
    _updateProcessingMemory();
  }

  void _setCapturingProcessingMemory(ServerProcessingMemory? pm) {
    var now = DateTime.now();
    debugPrint("${pm?.toJson()}");
    if (pm != null &&
        pm.status == ServerProcessingMemoryStatus.capturing &&
        pm.capturingTo != null &&
        pm.capturingTo!.isAfter(now)) {
      capturingProcessingMemory = pm;
    } else {
      capturingProcessingMemory = null;
    }
    notifyListeners();

    // End
    if (capturingProcessingMemory == null) {
      return;
    }

    // Or watch
    var id = capturingProcessingMemory!.id;
    var delayMs = capturingProcessingMemory?.capturingTo != null
        ? capturingProcessingMemory!.capturingTo!.millisecondsSinceEpoch - now.millisecondsSinceEpoch
        : 2 * 60 * 1000; // 2m
    if (delayMs > 0) {
      _processingMemoryWatchTimer?.cancel();
      _processingMemoryWatchTimer = Timer(Duration(milliseconds: delayMs), () async {
        ProcessingMemoryResponse? result = await fetchProcessingMemoryServer(id: id);
        if (result?.result == null) {
          debugPrint("Can not fetch processing memory, result null");
          return;
        }
        _setCapturingProcessingMemory(result?.result);
      });
    }
  }
There are a few potential issues in this section:

- There is no error handling for the fetchProcessingMemoryServer function call. If it fails due to network issues or server errors, the application might crash or behave unexpectedly.
- The timer _processingMemoryWatchTimer is not cancelled before being reassigned, which could lead to memory leaks if it's already running.
- The delay calculation does not account for possible negative values. If capturingProcessingMemory!.capturingTo! is earlier than now, delayMs will be negative and the Timer will execute immediately.
Here are some suggestions to address these issues:
ProcessingMemoryResponse? result;
try {
  result = await fetchProcessingMemoryServer(
    id: processingMemoryId,
  );
} catch (e) {
  debugPrint("Error fetching processing memory: $e");
  return;
}
if (result?.result == null) {
  debugPrint("Can not fetch processing memory, result null");
  return;
}
_setCapturingProcessingMemory(result?.result);

_processingMemoryWatchTimer?.cancel();
var delayMs = max(0, capturingProcessingMemory?.capturingTo != null
    ? capturingProcessingMemory!.capturingTo!.millisecondsSinceEpoch - now.millisecondsSinceEpoch
    : 2 * 60 * 1000); // 2m
  }
  _socket?.subscribe(this, this);
  _transcriptServiceReady = true;

  if (segments.isNotEmpty) {
    // means that it was a reconnection, so we need to reset
    streamStartedAtSecond = null;
    secondsMissedOnReconnect = (DateTime.now().difference(firstStreamReceivedAt!).inSeconds);
  }
}

Future streamAudioToWs(String id, BleAudioCodec codec) async {
  debugPrint('streamAudioToWs in capture_provider');
  audioStorage = WavBytesUtil(codec: codec);
  if (_bleBytesStream != null) {
    _bleBytesStream?.cancel();
  }
  _bleBytesStream = await _getBleAudioBytesListener(
    id,
    onAudioBytesReceived: (List<int> value) {
      if (value.isEmpty) return;
      // audioStorage!.storeFramePacket(value);
      // print('audioStorage: ${audioStorage!.frames.length} ${audioStorage!.rawPackets.length}');

      final trimmedValue = value.sublist(3);
    // Wait reconnect
    if (capturingProcessingMemory == null) {
      _clean();
      setMemoryCreating(false);
      setHasTranscripts(false);
      notifyListeners();
    }
Files selected (12)
- app/lib/backend/http/api/memories.dart (1)
- app/lib/backend/schema/memory.dart (4)
- app/lib/pages/memories/page.dart (1)
- app/lib/pages/memories/widgets/processing_capture.dart (6)
- app/lib/pages/processing_memories/page.dart (1)
- app/lib/providers/capture_provider.dart (8)
- app/lib/providers/memory_provider.dart (4)
- backend/database/processing_memories.py (2)
- backend/models/processing_memory.py (2)
- backend/routers/processing_memories.py (2)
- backend/routers/transcribe.py (5)
- backend/utils/processing_memories.py (3)
Review comments generated (52)
- Review: 37
- LGTM: 15
        ),
      );
    }

// PROCESSING MEMORY

class ProcessingMemoryWidget extends StatefulWidget {
  final ServerProcessingMemory memory;

  const ProcessingMemoryWidget({
    super.key,
    required this.memory,
  });
  @override
  State<ProcessingMemoryWidget> createState() => _ProcessingMemoryWidgetState();
}

class _ProcessingMemoryWidgetState extends State<ProcessingMemoryWidget> {
  @override
  Widget build(BuildContext context) {
    return Consumer3<CaptureProvider, DeviceProvider, ConnectivityProvider>(
        builder: (context, provider, deviceProvider, connectivityProvider, child) {
      return GestureDetector(
          onTap: () async {
            if (widget.memory.transcriptSegments.isEmpty) return;
            routeToPage(
                context,
                ProcessingMemoryPage(
                  memory: widget.memory,
                ));
          },
          child: Container(
            margin: const EdgeInsets.symmetric(horizontal: 16),
            width: double.maxFinite,
            decoration: BoxDecoration(
              color: Colors.grey.shade900,
              borderRadius: BorderRadius.circular(16.0),
            ),
            child: Padding(
              padding: const EdgeInsets.symmetric(horizontal: 16, vertical: 12),
              child: Column(
                mainAxisSize: MainAxisSize.min,
                crossAxisAlignment: CrossAxisAlignment.start,
                children: [
                  _getMemoryHeader(context),
                  widget.memory.transcriptSegments.isNotEmpty
                      ? Column(
                          children: [
                            const SizedBox(height: 8),
                            getLiteTranscriptWidget(
                              widget.memory.transcriptSegments,
                              [],
                              null,
                            ),
                            const SizedBox(height: 8),
                          ],
                        )
                      : const SizedBox.shrink(),
                ],
              ),
            ),
          ));
    });
  }

  _getMemoryHeader(BuildContext context) {
    return Padding(
      padding: const EdgeInsets.only(left: 0, right: 12),
      child: Row(
        mainAxisAlignment: MainAxisAlignment.spaceBetween,
        children: [
          Row(
            children: [
              const SizedBox(
                width: 22,
                height: 22,
                child: CircularProgressIndicator(
                  valueColor: AlwaysStoppedAnimation<Color>(Colors.white),
                ),
              ),
              const SizedBox(width: 12),
              Container(
                decoration: BoxDecoration(
                  color: Colors.grey.shade800,
                  borderRadius: BorderRadius.circular(16),
                ),
                padding: const EdgeInsets.symmetric(horizontal: 12, vertical: 6),
                child: Text(
                  'Processing...',
                  style: Theme.of(context).textTheme.bodyMedium!.copyWith(color: Colors.white),
                  maxLines: 1,
                ),
              ),
            ],
          )
        ],
      ),
    );
  }
}
This new widget ProcessingMemoryWidget is introduced to handle the display of processing memories. The implementation looks fine at first glance, but it's crucial to ensure that it handles all possible states of a processing memory correctly. Also, consider adding error handling within this widget to gracefully handle any unexpected situations.
app/lib/backend/schema/memory.dart
Outdated
class ServerProcessingMemory {
  final String id;
  final DateTime createdAt;
  final DateTime? startedAt;
  final DateTime? capturingTo;
  final ServerProcessingMemoryStatus? status;
  final List<TranscriptSegment> transcriptSegments;
  final String? memoryId;

  ServerProcessingMemory({
    required this.id,
    required this.createdAt,
    this.startedAt,
    this.capturingTo,
    this.status,
    this.transcriptSegments = const [],
    this.memoryId,
  });

  factory ServerProcessingMemory.fromJson(Map<String, dynamic> json) {
    return ServerProcessingMemory(
      id: json['id'],
      createdAt: DateTime.parse(json['created_at']).toLocal(),
      startedAt: json['started_at'] != null ? DateTime.parse(json['started_at']).toLocal() : null,
      capturingTo: json['capturing_to'] != null ? DateTime.parse(json['capturing_to']).toLocal() : null,
      status: json['status'] != null ? ServerProcessingMemoryStatus.valuesFromString(json['status']) : null,
      transcriptSegments: ((json['transcript_segments'] ?? []) as List<dynamic>)
          .map((segment) => TranscriptSegment.fromJson(segment))
          .toList(),
      memoryId: json['memoryId'],
The ServerProcessingMemory class has been updated with new fields. It's important to ensure that these new fields are properly initialized and handled throughout the application to avoid null reference errors or unexpected behavior. Also, when parsing dates from JSON, consider error handling for cases where the date string might not be in the expected format.
    // Fetch and watch capturing status
    ProcessingMemoryResponse? result = await fetchProcessingMemoryServer(
      id: processingMemoryId,
    );
    if (result?.result == null) {
      debugPrint("Can not fetch processing memory, result null");
    }
    _setCapturingProcessingMemory(result?.result);

    // Set pre-segements
    if (capturingProcessingMemory != null && (capturingProcessingMemory?.transcriptSegments ?? []).isNotEmpty) {
      segments = capturingProcessingMemory!.transcriptSegments;
      setHasTranscripts(segments.isNotEmpty);
    }

    // Update processing memory
    _updateProcessingMemory();
  }

  void _setCapturingProcessingMemory(ServerProcessingMemory? pm) {
    var now = DateTime.now();
    debugPrint("${pm?.toJson()}");
    if (pm != null &&
        pm.status == ServerProcessingMemoryStatus.capturing &&
        pm.capturingTo != null &&
        pm.capturingTo!.isAfter(now)) {
      capturingProcessingMemory = pm;
    } else {
      capturingProcessingMemory = null;
    }
    notifyListeners();

    // End
    if (capturingProcessingMemory == null) {
      return;
    }

    // Or watch
    var id = capturingProcessingMemory!.id;
    var delayMs = capturingProcessingMemory?.capturingTo != null
        ? capturingProcessingMemory!.capturingTo!.millisecondsSinceEpoch - now.millisecondsSinceEpoch
        : 2 * 60 * 1000; // 2m
    if (delayMs > 0) {
      _processingMemoryWatchTimer?.cancel();
      _processingMemoryWatchTimer = Timer(Duration(milliseconds: delayMs), () async {
        ProcessingMemoryResponse? result = await fetchProcessingMemoryServer(id: id);
        if (result?.result == null) {
          debugPrint("Can not fetch processing memory, result null");
          return;
        }
        _setCapturingProcessingMemory(result?.result);
      });
    }
  }
There seems to be a lack of error handling in this section. If the fetchProcessingMemoryServer function fails for any reason (network issues, server errors, etc.), it will throw an exception that is not currently being caught. This could lead to unexpected behavior or crashes. Consider adding a try-catch block around the network request and handle the potential exceptions appropriately.

+ try {
    ProcessingMemoryResponse? result = await fetchProcessingMemoryServer(
      id: processingMemoryId,
    );
+ } catch (e) {
+   debugPrint("Error fetching processing memory: $e");
+   // Handle the error appropriately here
+ }
Future<List<ServerProcessingMemory>> getProcessingMemories(
    {List<String> filterIds = const [], int limit = 5, int offset = 0}) async {
  var response = await makeApiCall(
      url: '${Env.apiBaseUrl}v1/processing-memories?filter_ids=${filterIds.join(",")}limit=$limit&offset=$offset',
      headers: {},
      method: 'GET',
      body: '');
  if (response == null) return [];
  if (response.statusCode == 200) {
    // decode body bytes to utf8 string and then parse json so as to avoid utf8 char issues
    var body = utf8.decode(response.bodyBytes);
    var memories =
        (jsonDecode(body)["result"] as List<dynamic>).map((memory) => ServerProcessingMemory.fromJson(memory)).toList();
    debugPrint('getProcessingMemories length: ${memories.length}');
    return memories;
  }
  return [];
}
The getProcessingMemories function is missing error handling for non-200 HTTP status codes. If the server returns an error, this function will simply return an empty list, which could be misleading to the caller. It would be better to throw an exception with a meaningful message in case of an error response from the server.

  if (response.statusCode == 200) {
    // decode body bytes to utf8 string and then parse json so as to avoid utf8 char issues
    var body = utf8.decode(response.bodyBytes);
    var memories =
        (jsonDecode(body)["result"] as List<dynamic>).map((memory) => ServerProcessingMemory.fromJson(memory)).toList();
    debugPrint('getProcessingMemories length: ${memories.length}');
    return memories;
+ } else {
+   throw Exception('Failed to load processing memories: ${response.statusCode} ${response.reasonPhrase}');
  }
  return [];
Future<List<ServerProcessingMemory>> getProcessingMemories(
    {List<String> filterIds = const [], int limit = 5, int offset = 0}) async {
  var response = await makeApiCall(
      url: '${Env.apiBaseUrl}v1/processing-memories?filter_ids=${filterIds.join(",")}limit=$limit&offset=$offset',
There seems to be a missing '&' before 'limit' in the URL parameters. This could lead to incorrect API calls.
- url: '${Env.apiBaseUrl}v1/processing-memories?filter_ids=${filterIds.join(",")}limit=$limit&offset=$offset',
+ url: '${Env.apiBaseUrl}v1/processing-memories?filter_ids=${filterIds.join(",")}&limit=$limit&offset=$offset',
class UpdateProcessingMemoryResponse(BaseModel):
    result: BasicProcessingMemory


class DetailProcessingMemoryResponse(BaseModel):
    result: DetailProcessingMemory


class DetailProcessingMemoriesResponse(BaseModel):
    result: List[DetailProcessingMemory]


class BasicProcessingMemoryResponse(BaseModel):
    result: BasicProcessingMemory


class BasicProcessingMemoriesResponse(BaseModel):
    result: List[BasicProcessingMemory]
The new response models UpdateProcessingMemoryResponse, DetailProcessingMemoryResponse, DetailProcessingMemoriesResponse, BasicProcessingMemoryResponse, and BasicProcessingMemoriesResponse are well structured and should provide a consistent interface for the frontend to interact with. Good job on this!
WidgetsBinding.instance.addPostFrameCallback((_) async {
  await Navigator.of(context).pushReplacement(MaterialPageRoute(
    builder: (c) => MemoryDetailPage(
      memory: memory,
    ),
  ));
});
The use of WidgetsBinding.instance.addPostFrameCallback to navigate to a new page might not be the best approach. This could lead to unexpected behavior if the widget is removed from the tree before the callback is executed. Consider using Future.microtask instead, which will ensure that the navigation happens in the next microtask after the current stack of tasks has completed.
- WidgetsBinding.instance.addPostFrameCallback((_) async {
- await Navigator.of(context).pushReplacement(MaterialPageRoute(
- builder: (c) => MemoryDetailPage(
- memory: memory,
- ),
- ));
- });
+ Future.microtask(() async {
+ await Navigator.of(context).pushReplacement(
+ MaterialPageRoute(
+ builder: (c) => MemoryDetailPage(
+ memory: memory,
+ ),
+ ),
+ );
+ });
var memorySource = MemorySource.friend;
var captureProvider = context.read<CaptureProvider>();
if (captureProvider.isGlasses) {
  memorySource = MemorySource.openglass;
}
The memorySource variable is being set based on the value of captureProvider.isGlasses. However, it doesn't account for the case where isGlasses is false. In such a case, memorySource will always be MemorySource.friend, which might not be accurate. Consider adding an else clause to handle this scenario.
- var memorySource = MemorySource.friend;
- var captureProvider = context.read<CaptureProvider>();
- if (captureProvider.isGlasses) {
- memorySource = MemorySource.openglass;
- }
+ var captureProvider = context.read<CaptureProvider>();
+ var memorySource = captureProvider.isGlasses ? MemorySource.openglass : MemorySource.friend;
widget.memory.transcriptSegments.isEmpty
    ? Column(
        children: [
          const SizedBox(height: 80),
          Center(
              child: Text(memorySource == MemorySource.friend ? "No transcript" : "Empty")),
        ],
      )
    : getTranscriptWidget(false, widget.memory.transcriptSegments, [], null)
This code block checks if widget.memory.transcriptSegments is empty and displays a "No transcript" or "Empty" message accordingly. However, it doesn't handle the case where getTranscriptWidget returns null. This could potentially lead to runtime errors. You should add a null check before using the result of getTranscriptWidget.
- widget.memory.transcriptSegments.isEmpty
- ? Column(
- children: [
- const SizedBox(height: 80),
- Center(
- child: Text(memorySource == MemorySource.friend ? "No transcript" : "Empty")),
- ],
- )
- : getTranscriptWidget(false, widget.memory.transcriptSegments, [], null)
+ widget.memory.transcriptSegments.isEmpty
+ ? Column(
+ children: [
+ const SizedBox(height: 80),
+ Center(
+ child: Text(memorySource == MemorySource.friend ? "No transcript" : "Empty"),
+ ),
+ ],
+ )
+ : getTranscriptWidget(false, widget.memory.transcriptSegments, [], null) ?? Container()
            widget.memory.transcriptSegments.isEmpty ? "No summary" : "Processing",
            textAlign: TextAlign.center,
            style: const TextStyle(fontSize: 16),
The ternary operator is used to display either "No summary" or "Processing". However, it doesn't account for the case where widget.memory.transcriptSegments is not empty but the processing is still not done. Consider adding a condition to handle this scenario.
- child: Text(
- widget.memory.transcriptSegments.isEmpty ? "No summary" : "Processing",
- textAlign: TextAlign.center,
- style: const TextStyle(fontSize: 16),
- ),
+ child: Text(
+ widget.memory.transcriptSegments.isEmpty
+ ? "No summary"
+ : widget.memory.status == ServerProcessingMemoryStatus.done
+ ? "Done"
+ : "Processing",
+ textAlign: TextAlign.center,
+ style: const TextStyle(fontSize: 16),
+ ),
f4b15bc to 8f761c8 (Compare)
Files selected (20)
- app/lib/backend/http/api/memories.dart (1)
- app/lib/backend/http/api/processing_memories.dart (2)
- app/lib/backend/schema/memory.dart (5)
- app/lib/backend/schema/message_event.dart (3)
- app/lib/main.dart (1)
- app/lib/pages/memories/page.dart (1)
- app/lib/pages/memories/widgets/capture.dart (1)
- app/lib/pages/memories/widgets/memory_list_item.dart (1)
- app/lib/pages/memories/widgets/processing_capture.dart (6)
- app/lib/pages/processing_memories/page.dart (1)
- app/lib/providers/capture_provider.dart (10)
- app/lib/providers/memory_provider.dart (5)
- app/lib/services/sockets/pure_socket.dart (2)
- app/lib/services/sockets/transcription_connection.dart (2)
- backend/database/processing_memories.py (2)
- backend/models/message_event.py (1)
- backend/models/processing_memory.py (2)
- backend/routers/processing_memories.py (2)
- backend/routers/transcribe.py (5)
- backend/utils/processing_memories.py (3)
Files skipped from review due to trivial changes (3)
- app/lib/main.dart
- app/lib/pages/memories/widgets/capture.dart
- app/lib/services/sockets/transcription_connection.dart
Review comments generated (63)
- Review: 36
- LGTM: 27
@@ -49,6 +49,7 @@ class _MemoriesPageState extends State<MemoriesPage> with AutomaticKeepAliveClie
          const SliverToBoxAdapter(child: SizedBox(height: 32)),
          const SliverToBoxAdapter(child: SpeechProfileCardWidget()),
          SliverToBoxAdapter(child: getMemoryCaptureWidget()),
+         getProcessingMemoriesWidget(memoryProvider.processingMemories),
The getProcessingMemoriesWidget function is being called with an argument, but it's not clear if this function can handle null values. If the processingMemories list in memoryProvider is null, this could potentially lead to a runtime error. Consider adding null safety checks either within the getProcessingMemoriesWidget function or before calling it.

- getProcessingMemoriesWidget(memoryProvider.processingMemories),
+ getProcessingMemoriesWidget(memoryProvider.processingMemories ?? []),

This will ensure that an empty list is passed to the function if processingMemories is null, preventing potential null dereference errors.
var response = await makeApiCall(
  url: '${Env.apiBaseUrl}v1/processing-memories/$id',
  headers: {},
  method: 'PATCH',
  body: jsonEncode(bodyData),
);
if (response == null) return null;
debugPrint('updateProcessingMemoryServer: ${response.body}');
if (response.statusCode == 200) {
  return UpdateProcessingMemoryResponse.fromJson(jsonDecode(response.body));
} else {
The makeApiCall function is called with an empty headers object. If this API requires authentication, you should include the necessary authorization headers. Also, error handling seems to be missing for non-200 status codes. It would be beneficial to add some form of error logging or reporting in case the API call fails.
- headers: {},
+ headers: {
+ 'Authorization': 'Bearer $token',
+ },
- } else {
+ } else {
+ // Log or report error
+ print('Error updating processing memory: ${response.statusCode}');
+ }
var response = await makeApiCall(
  url: '${Env.apiBaseUrl}v1/processing-memories/$id',
  headers: {},
  method: 'GET',
  body: "",
);
if (response == null) return null;
if (response.statusCode == 200) {
  return ProcessingMemoryResponse.fromJson(jsonDecode(response.body));
} else {
  // TODO: Server returns 304 doesn't recover
  CrashReporting.reportHandledCrash(
    Exception('Failed to create memory'),
    StackTrace.current,
    level: NonFatalExceptionLevel.info,
    userAttributes: {
      'response': response.body,
    },
  );
}
return null;
Similar to the previous comment, the makeApiCall function is called with an empty headers object. If this API requires authentication, you should include the necessary authorization headers. Additionally, the body parameter for a GET request is typically not used and can be omitted. Lastly, the exception message at line 62 says 'Failed to create memory', which seems incorrect as this function is fetching a memory, not creating one.
- headers: {},
+ headers: {
+ 'Authorization': 'Bearer $token',
+ },
- body: "",
- Exception('Failed to create memory'),
+ Exception('Failed to fetch memory'),
deviceProvider.connectedDevice == null &&
!deviceProvider.isConnecting &&
!isUsingPhoneMic &&
havingCapturingMemory
? Row(
children: [
const Text(
'🎙️',
style: TextStyle(color: Colors.white, fontSize: 22, fontWeight: FontWeight.w600),
),
const SizedBox(width: 12),
Container(
decoration: BoxDecoration(
color: Colors.grey.shade800,
borderRadius: BorderRadius.circular(16),
),
padding: const EdgeInsets.symmetric(horizontal: 12, vertical: 6),
child: Text(
'Waiting for reconnect...',
style: Theme.of(context).textTheme.bodyMedium!.copyWith(color: Colors.white),
maxLines: 1,
),
),
],
)
: const SizedBox.shrink(),
// mic
// TODO: improve phone recording UI
deviceProvider.connectedDevice == null && !deviceProvider.isConnecting
deviceProvider.connectedDevice == null && !deviceProvider.isConnecting && !havingCapturingMemory
The logic here seems to be a bit complex and hard to understand. It might be beneficial to break it down into smaller, more manageable functions or use well-named variables to make the code more readable.
- deviceProvider.connectedDevice == null &&
- !deviceProvider.isConnecting &&
- !isUsingPhoneMic &&
- havingCapturingMemory
+ bool isDeviceDisconnected = deviceProvider.connectedDevice == null;
+ bool isNotConnecting = !deviceProvider.isConnecting;
+ bool isNotUsingPhoneMic = !isUsingPhoneMic;
+
+ if (isDeviceDisconnected && isNotConnecting && isNotUsingPhoneMic && havingCapturingMemory) {
),
);
}

// PROCESSING MEMORY

class ProcessingMemoryWidget extends StatefulWidget {
final ServerProcessingMemory memory;

const ProcessingMemoryWidget({
super.key,
required this.memory,
});
@override
State<ProcessingMemoryWidget> createState() => _ProcessingMemoryWidgetState();
}

class _ProcessingMemoryWidgetState extends State<ProcessingMemoryWidget> {
@override
Widget build(BuildContext context) {
return Consumer3<CaptureProvider, DeviceProvider, ConnectivityProvider>(
builder: (context, provider, deviceProvider, connectivityProvider, child) {
return GestureDetector(
onTap: () async {
if (widget.memory.transcriptSegments.isEmpty) return;
routeToPage(
context,
ProcessingMemoryPage(
memory: widget.memory,
));
},
child: Container(
margin: const EdgeInsets.symmetric(horizontal: 16),
width: double.maxFinite,
decoration: BoxDecoration(
color: Colors.grey.shade900,
borderRadius: BorderRadius.circular(16.0),
),
child: Padding(
padding: const EdgeInsets.symmetric(horizontal: 16, vertical: 12),
child: Column(
mainAxisSize: MainAxisSize.min,
crossAxisAlignment: CrossAxisAlignment.start,
children: [
_getMemoryHeader(context),
widget.memory.transcriptSegments.isNotEmpty
? Column(
children: [
const SizedBox(height: 8),
getLiteTranscriptWidget(
widget.memory.transcriptSegments,
[],
null,
),
const SizedBox(height: 8),
],
)
: const SizedBox.shrink(),
],
),
),
));
});
}

_getMemoryHeader(BuildContext context) {
return Padding(
padding: const EdgeInsets.only(left: 0, right: 12),
child: Row(
mainAxisAlignment: MainAxisAlignment.spaceBetween,
children: [
Row(
children: [
const SizedBox(
width: 22,
height: 22,
child: CircularProgressIndicator(
valueColor: AlwaysStoppedAnimation<Color>(Colors.white),
),
),
const SizedBox(width: 20),
Container(
decoration: BoxDecoration(
color: Colors.grey.shade800,
borderRadius: BorderRadius.circular(16),
),
padding: const EdgeInsets.symmetric(horizontal: 12, vertical: 6),
child: Text(
'Processing',
style: Theme.of(context).textTheme.bodyMedium!.copyWith(color: Colors.white),
maxLines: 1,
),
),
],
)
],
),
);
}
}
This new widget ProcessingMemoryWidget looks good overall. However, there's a potential issue with the getProcessingMemoriesWidget function. If the status of a processing memory changes from 'processing' to 'done', the widget for that memory will not be removed from the list. This could lead to confusion for users as they may see memories in the list that have already been processed. Consider adding a mechanism to remove or update widgets when their corresponding memories' statuses change.
- if (pm.status == ServerProcessingMemoryStatus.processing) {
- return ProcessingMemoryWidget(memory: pm);
- }
- if (pm.status == ServerProcessingMemoryStatus.done) {
- return const SizedBox.shrink();
- }
+ return pm.status == ServerProcessingMemoryStatus.processing
+ ? ProcessingMemoryWidget(memory: pm)
+ : const SizedBox.shrink();
enum ServerProcessingMemoryStatus {
capturing('capturing'),
processing('processing'),
done('done'),
unknown('unknown'),
;

final String value;
const ServerProcessingMemoryStatus(this.value);

static ServerProcessingMemoryStatus valuesFromString(String value) {
return ServerProcessingMemoryStatus.values.firstWhereOrNull((e) => e.value == value) ??
ServerProcessingMemoryStatus.unknown;
}
}
The ServerProcessingMemoryStatus enum has been introduced with a method valuesFromString. This method returns unknown if the provided string doesn't match any of the enum values. This could potentially hide errors if an unexpected status value is received from the server. It might be better to throw an error or at least log a warning in such cases.
static ServerProcessingMemoryStatus valuesFromString(String value) {
return ServerProcessingMemoryStatus.values.firstWhereOrNull((e) => e.value == value) ??
- 73: ServerProcessingMemoryStatus.unknown;
+ 73: (throw ArgumentError('Unknown ServerProcessingMemoryStatus: $value'));
}
class ServerProcessingMemory {
final String id;
final DateTime createdAt;
final DateTime? startedAt;
final DateTime? capturingTo;
final ServerProcessingMemoryStatus? status;
final List<TranscriptSegment> transcriptSegments;
final String? memoryId;

ServerProcessingMemory({
required this.id,
required this.createdAt,
this.startedAt,
this.capturingTo,
this.status,
this.transcriptSegments = const [],
this.memoryId,
});

factory ServerProcessingMemory.fromJson(Map<String, dynamic> json) {
return ServerProcessingMemory(
id: json['id'],
createdAt: DateTime.parse(json['created_at']).toLocal(),
startedAt: json['started_at'] != null ? DateTime.parse(json['started_at']).toLocal() : null,
capturingTo: json['capturing_to'] != null ? DateTime.parse(json['capturing_to']).toLocal() : null,
status: json['status'] != null ? ServerProcessingMemoryStatus.valuesFromString(json['status']) : null,
transcriptSegments: ((json['transcript_segments'] ?? []) as List<dynamic>)
.map((segment) => TranscriptSegment.fromJson(segment))
.toList(),
memoryId: json['memory_id'],
The ServerProcessingMemory class has been updated with new fields capturingTo, status, and memoryId. The fromJson factory method has also been updated accordingly. However, there is no null safety check for json['id'] and json['created_at']. If these fields are not present in the JSON object, the application will crash. Consider adding null checks for these fields.
factory ServerProcessingMemory.fromJson(Map<String, dynamic> json) {
- 97: return ServerProcessingMemory(
- 98: id: json['id'],
- 99: createdAt: DateTime.parse(json['created_at']).toLocal(),
+ 97: if (json['id'] != null && json['created_at'] != null) {
+ 98: return ServerProcessingMemory(
+ 99: id: json['id'],
+ 100: createdAt: DateTime.parse(json['created_at']).toLocal(),
startedAt: json['started_at'] != null ? DateTime.parse(json['started_at']).toLocal() : null,
capturingTo: json['capturing_to'] != null ? DateTime.parse(json['capturing_to']).toLocal() : null,
status: json['status'] != null ? ServerProcessingMemoryStatus.valuesFromString(json['status']) : null,
transcriptSegments: ((json['transcript_segments'] ?? []) as List<dynamic>)
.map((segment) => TranscriptSegment.fromJson(segment))
.toList(),
memoryId: json['memory_id'],
+ 107: );
+ 108: } else {
+ 109: throw ArgumentError('Missing required fields in JSON');
+ 110: }
111:}
'capturing_to': capturingTo?.toUtc().toIso8601String(),
'status': status.toString(),
'transcript_segments': transcriptSegments.map((segment) => segment.toJson()).toList(),
'memory_id': memoryId,
The toJson method has been updated to include the new fields. However, for the status field, it's directly calling toString() which will return the full name of the enum including the class name (e.g., ServerProcessingMemoryStatus.capturing). Consider using split('.').last to get only the value part of the enum.
- 'status': status.toString(),
+ 'status': status.toString().split('.').last,
class ProcessingMemoryResponse {
final ServerProcessingMemory? result;

ProcessingMemoryResponse({required this.result});

factory ProcessingMemoryResponse.fromJson(Map<String, dynamic> json) {
return ProcessingMemoryResponse(
result: json['result'] != null ? ServerProcessingMemory.fromJson(json['result']) : null,
);
}
}
A new class ProcessingMemoryResponse has been introduced with a factory method fromJson. This method assumes that if json['result'] is not null, it can be parsed into a ServerProcessingMemory object. If the structure of json['result'] is not as expected, this could lead to runtime errors. Consider adding error handling or validation here.
factory ProcessingMemoryResponse.fromJson(Map<String, dynamic> json) {
- 141: return ProcessingMemoryResponse(
- 142: result: json['result'] != null ? ServerProcessingMemory.fromJson(json['result']) : null,
+ 141: if (json['result'] != null) {
+ 142: try {
+ 143: return ProcessingMemoryResponse(
+ 144: result: ServerProcessingMemory.fromJson(json['result']),
+ 145: );
+ 146: } catch (e) {
+ 147: throw FormatException('Invalid format for ServerProcessingMemory');
+ 148: }
+ 149: } else {
+ 150: return ProcessingMemoryResponse(result: null);
+ 151: }
}
// local label
bool isNew = false;
…mory, performance issue on memory provider
Files selected (6)
- app/lib/pages/memories/widgets/memory_list_item.dart (2)
- app/lib/pages/memories/widgets/processing_capture.dart (6)
- app/lib/providers/capture_provider.dart (11)
- app/lib/providers/memory_provider.dart (6)
- backend/models/message_event.py (1)
- backend/routers/transcribe.py (6)
Review comments generated (41)
- Review: 18
- LGTM: 23
@@ -34,6 +34,7 @@ class _MemoryListItemState extends State<MemoryListItem> {
onTap: () async {
MixpanelManager().memoryListItemClicked(widget.memory, widget.memoryIdx);
context.read<MemoryDetailProvider>().updateMemory(widget.memoryIdx, widget.date);
Provider.of<MemoryProvider>(context, listen: false).onMemoryTap(widget.memoryIdx);
The Provider.of<MemoryProvider>(context, listen: false).onMemoryTap(widget.memoryIdx); line is added to handle the memory tap event. However, it's not clear if this method has any side effects or if it returns a Future. If it does, you should consider awaiting it to ensure that any potential exceptions are properly handled and that the operation completes before moving on.
- Provider.of<MemoryProvider>(context, listen: false).onMemoryTap(widget.memoryIdx);
+ await Provider.of<MemoryProvider>(context, listen: false).onMemoryTap(widget.memoryIdx);
decoration: widget.memory.isNew
? BoxDecoration(
color: Colors.grey.shade900,
borderRadius: BorderRadius.circular(16.0),
border: Border.all(
color: Colors.lightBlue,
width: 1,
))
: BoxDecoration(
color: Colors.grey.shade900,
borderRadius: BorderRadius.circular(16.0),
),
This change introduces a conditional decoration for the widget based on whether the memory is new or not. It seems fine from a logic perspective, but make sure that the UI changes have been thoroughly tested across different screen sizes and resolutions to ensure that they look as expected.
memory_id: Optional[str] = None

def to_json(self):
j = self.model_dump(mode="json")
j["type"] = self.event_type
return j

class ProcessingMemoryStatusChanged(MessageEvent):
processing_memory_id: Optional[str] = None
processing_memory_status: Optional[str] = None
memory_id: Optional[str] = None
The new class ProcessingMemoryStatusChanged inherits from MessageEvent and introduces three optional attributes: processing_memory_id, processing_memory_status, and memory_id. However, the to_json method is not properly indented under this class. This could lead to unexpected behavior or errors as it's currently defined in the global scope instead of being a method of the class. Here's how you can fix it:
class ProcessingMemoryStatusChanged(MessageEvent):
processing_memory_id: Optional[str] = None
processing_memory_status: Optional[str] = None
memory_id: Optional[str] = None
-
+ def to_json(self):
- def to_json(self):
j = self.model_dump(mode="json")
This change ensures that the to_json method is correctly defined within the ProcessingMemoryStatusChanged class.
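If to_json is indeed meant to be an instance method of this class, the corrected layout could look roughly like the sketch below. This is only an illustration of the reviewer's suggestion: it assumes MessageEvent, its event_type field, and the Optional import already exist in backend/models/message_event.py, as the diff above implies.
from typing import Optional

class ProcessingMemoryStatusChanged(MessageEvent):
    processing_memory_id: Optional[str] = None
    processing_memory_status: Optional[str] = None
    memory_id: Optional[str] = None

    def to_json(self):
        # Serialize the pydantic model and tag it with the event type,
        # mirroring the to_json shown in the diff above.
        j = self.model_dump(mode="json")
        j["type"] = self.event_type
        return j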
bool havingCapturingMemory = provider.capturingProcessingMemory != null;

return (showPhoneMic || isConnected)
return (showPhoneMic || isConnected || havingCapturingMemory)
The new condition havingCapturingMemory has been added to the return statement. This seems to be a logical addition, but it's important to ensure that this doesn't introduce any unexpected behavior in the UI. If the capturing memory is not null, the GestureDetector widget will be returned even if the device is not connected and the phone mic is not being used. Make sure this is the intended behavior.
deviceProvider.connectedDevice == null &&
!isUsingPhoneMic &&
havingCapturingMemory
? Row(
children: [
const Text(
'🎙️',
style: TextStyle(color: Colors.white, fontSize: 22, fontWeight: FontWeight.w600),
),
const SizedBox(width: 12),
Container(
decoration: BoxDecoration(
color: Colors.grey.shade800,
borderRadius: BorderRadius.circular(16),
),
padding: const EdgeInsets.symmetric(horizontal: 12, vertical: 6),
child: Text(
'Waiting for reconnect...',
style: Theme.of(context).textTheme.bodyMedium!.copyWith(color: Colors.white),
maxLines: 1,
),
),
],
)
: const SizedBox.shrink(),
// mic
// TODO: improve phone recording UI
deviceProvider.connectedDevice == null && !deviceProvider.isConnecting
deviceProvider.connectedDevice == null && !deviceProvider.isConnecting && !havingCapturingMemory
This block of code introduces a new UI element when deviceProvider.connectedDevice == null && !isUsingPhoneMic && havingCapturingMemory. It's crucial to verify that this UI change aligns with the overall design and user experience of your application. Also, consider extracting this UI logic into a separate function or widget for better readability and maintainability.
Future<void> _onProcessingMemoryStatusChanged(String processingMemoryId, ServerProcessingMemoryStatus status) async {
if (capturingProcessingMemory == null || capturingProcessingMemory?.id != processingMemoryId) {
debugPrint("Warn: Didn't track processing memory yet $processingMemoryId");
}

ProcessingMemoryResponse? result = await fetchProcessingMemoryServer(id: processingMemoryId);
if (result?.result == null) {
debugPrint("Can not fetch processing memory, result null");
return;
}
var pm = result!.result!;
if (status == ServerProcessingMemoryStatus.processing) {
memoryProvider?.onNewProcessingMemory(pm);
return;
}
if (status == ServerProcessingMemoryStatus.done) {
memoryProvider?.onProcessingMemoryDone(pm);
return;
}
}
The _onProcessingMemoryStatusChanged function seems to be handling different statuses of processing memory. However, it doesn't handle all possible cases of ServerProcessingMemoryStatus. It's important to consider what should happen if the status is not processing or done. If there are other possible statuses, you might want to add error handling or logging for those cases.
Future<void> _onProcessingMemoryCreated(String processingMemoryId) async {
this.processingMemoryId = processingMemoryId;

// Fetch and watch capturing status
ProcessingMemoryResponse? result = await fetchProcessingMemoryServer(
id: processingMemoryId,
);
if (result?.result == null) {
debugPrint("Can not fetch processing memory, result null");
}
_setCapturingProcessingMemory(result?.result);

// Set pre-segements
if (capturingProcessingMemory != null && (capturingProcessingMemory?.transcriptSegments ?? []).isNotEmpty) {
segments = capturingProcessingMemory!.transcriptSegments;
setHasTranscripts(segments.isNotEmpty);
}

// Notify combining
if (capturingProcessingMemory?.memoryId != null) {
memoryProvider?.onNewCombiningMemory(capturingProcessingMemory!);
}

// Update processing memory
_updateProcessingMemory();
}
In the _onProcessingMemoryCreated function, there's no error handling for when result?.result == null at line 178. This could lead to unexpected behavior if result?.result is null and _setCapturingProcessingMemory(result?.result); is called with a null argument. Consider adding error handling or a return statement in this case.
void _trackCapturingProcessingMemory() {
if (capturingProcessingMemory == null) {
return;
}

var pm = capturingProcessingMemory!;

var delayMs = pm.capturingTo != null
? pm.capturingTo!.millisecondsSinceEpoch - DateTime.now().millisecondsSinceEpoch
: 2 * 60 * 1000; // 2m
if (delayMs > 0) {
_processingMemoryWatchTimer?.cancel();
_processingMemoryWatchTimer = Timer(Duration(milliseconds: delayMs), () async {
ProcessingMemoryResponse? result = await fetchProcessingMemoryServer(id: pm.id);
if (result?.result == null) {
debugPrint("Can not fetch processing memory, result null");
return;
}

_setCapturingProcessingMemory(result?.result);
if (capturingProcessingMemory == null) {
// Force clean
_clean();
}
});
}
}
In the _trackCapturingProcessingMemory function, there's no error handling for when result?.result == null at line 212. This could lead to unexpected behavior if result?.result is null and _setCapturingProcessingMemory(result?.result); is called with a null argument. Consider adding error handling or a return statement in this case.
@@ -204,7 +291,7 @@ class CaptureProvider extends ChangeNotifier

// use memory provider to add memory
MixpanelManager().memoryCreated(memory);
memoryProvider?.addMemory(memory);
memoryProvider?.upsertMemory(memory);
# Update processing memory status
processing_memory.status = ProcessingMemoryStatus.Processing
processing_memories_db.update_processing_memory_status(
uid, processing_memory.id,
processing_memory.status,
)
# Message: processing memory status changed
ok = await _send_message_event(ProcessingMemoryStatusChanged(
event_type="processing_memory_status_changed",
processing_memory_id=processing_memory.id,
processing_memory_status=processing_memory.status),
)
if not ok:
print("Can not send message event processing_memory_status_changed")
This block of code is updating the status of the processing memory and sending a message event about the status change. This pattern is repeated later in the code as well. To avoid repetition and improve maintainability, consider extracting this into a separate function.
- # Update processing memory status
- processing_memory.status = ProcessingMemoryStatus.Processing
- processing_memories_db.update_processing_memory_status(
- uid, processing_memory.id,
- processing_memory.status,
- )
- # Message: processing memory status changed
- ok = await _send_message_event(ProcessingMemoryStatusChanged(
- event_type="processing_memory_status_changed",
- processing_memory_id=processing_memory.id,
- processing_memory_status=processing_memory.status),
- )
- if not ok:
- print("Can not send message event processing_memory_status_changed")
+ update_processing_memory_status_and_notify(
+ uid,
+ processing_memory,
+ ProcessingMemoryStatus.Processing
+ )
In the new update_processing_memory_status_and_notify function, you can encapsulate the logic for updating the status and sending a notification.
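A minimal sketch of what such a helper might look like, purely as an illustration: the helper name comes from the suggestion above, the other names come from the diff, and since _send_message_event appears to be defined inside the websocket handler, the sender is passed in as a parameter here rather than assumed to be in module scope.
async def update_processing_memory_status_and_notify(uid, processing_memory, new_status, send_message_event):
    # Persist the new status for this processing memory.
    processing_memory.status = new_status
    processing_memories_db.update_processing_memory_status(
        uid, processing_memory.id, processing_memory.status,
    )
    # Notify the client that the status changed.
    ok = await send_message_event(ProcessingMemoryStatusChanged(
        event_type="processing_memory_status_changed",
        processing_memory_id=processing_memory.id,
        memory_id=processing_memory.memory_id,
        processing_memory_status=processing_memory.status,
    ))
    if not ok:
        print("Can not send message event processing_memory_status_changed")
Both the Processing and Done branches could then call this helper with their respective status, keeping the database update and the event emission in one place.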
Files selected (2)
- backend/database/processing_memories.py (2)
- backend/routers/transcribe.py (7)
Review comments generated (12)
- Review: 10
- LGTM: 2
backend/routers/transcribe.py
Outdated
now_ts = time.time()
if last_processing_memory_data:
last_processing_memory = ProcessingMemory(**last_processing_memory_data)
last_segment_end = 0
for segment in last_processing_memory.transcript_segments:
last_segment_end = max(last_segment_end, segment.end)
timer_segment_start = last_processing_memory.timer_segment_start if last_processing_memory.timer_segment_start else last_processing_memory.timer_start
if timer_segment_start + last_segment_end + min_seconds_limit > time.time():
processing_memory = last_processing_memory
if not last_processing_memory.capturing_to or last_processing_memory.capturing_to.timestamp() > now_ts:
last_segment_end = 0
for segment in last_processing_memory.transcript_segments:
last_segment_end = max(last_segment_end, segment.end)
timer_segment_start = last_processing_memory.timer_segment_start if last_processing_memory.timer_segment_start else last_processing_memory.timer_start
if timer_segment_start + last_segment_end + min_seconds_limit > now_ts:
processing_memory = last_processing_memory
# Status, capturing to
processing_memory.status = ProcessingMemoryStatus.Capturing
processing_memory.capturing_to = datetime.fromtimestamp(now_ts + min_seconds_limit, timezone.utc)
def get_processing_memories(uid: str, statuses: [str] = [], filter_ids: [str] = [], limit: int = 5):
processing_memories_ref = (
db.collection('users').document(uid).collection('processing_memories')
)
if len(statuses) > 0:
processing_memories_ref = processing_memories_ref.where(filter=FieldFilter('status', 'in', statuses))
if len(filter_ids) > 0:
processing_memories_ref = processing_memories_ref.where(filter=FieldFilter('id', 'in', filter_ids))
processing_memories_ref = processing_memories_ref.order_by('created_at', direction=firestore.Query.DESCENDING)
processing_memories_ref = processing_memories_ref.limit(limit)
return [doc.to_dict() for doc in processing_memories_ref.stream()]
The way you're using the where method seems incorrect. The where method in Firestore doesn't take a keyword argument named filter. Instead, it takes three positional arguments: field path, operator, and value. Here's how you can fix this:
- processing_memories_ref = processing_memories_ref.where(filter=FieldFilter('status', 'in', statuses))
+ processing_memories_ref = processing_memories_ref.where('status', 'in', statuses)
- processing_memories_ref = processing_memories_ref.where(filter=FieldFilter('id', 'in', filter_ids))
+ processing_memories_ref = processing_memories_ref.where('id', 'in', filter_ids)
def update_processing_memory_segments(uid: str, id: str, segments: List[dict], capturing_to: datetime):
user_ref = db.collection('users').document(uid)
memory_ref = user_ref.collection('processing_memories').document(id)
memory_ref.update({'transcript_segments': segments})
memory_ref.update({
'transcript_segments': segments,
'capturing_to': capturing_to,
})
def update_basic(uid: str, id: str, geolocation: dict, emotional_feedback: bool, capturing_to: datetime):
user_ref = db.collection('users').document(uid)
memory_ref = user_ref.collection('processing_memories').document(id)
memory_ref.update({
'emotional_feedback': emotional_feedback,
'geolocation':geolocation,
'capturing_to':capturing_to,
})
def update_processing_memory_status(uid: str, id: str, status: str):
user_ref = db.collection('users').document(uid)
memory_ref = user_ref.collection('processing_memories').document(id)
memory_ref.update({
'status': status,
Files selected (1)
- backend/routers/transcribe.py (7)
Review comments generated (7)
- Review: 5
- LGTM: 2
now_ts = time.time()
capturing_to = datetime.fromtimestamp(now_ts + min_seconds_limit, timezone.utc)

processing_memories_db.update_processing_memory_segments(
uid, processing_memory.id, list(map(lambda m: m.dict(), processing_memory.transcript_segments))
uid, processing_memory.id,
list(map(lambda m: m.dict(), processing_memory.transcript_segments)),
capturing_to,
now_ts = time.time()
if last_processing_memory_data:
last_processing_memory = ProcessingMemory(**last_processing_memory_data)
last_segment_end = 0
for segment in last_processing_memory.transcript_segments:
last_segment_end = max(last_segment_end, segment.end)
timer_segment_start = last_processing_memory.timer_segment_start if last_processing_memory.timer_segment_start else last_processing_memory.timer_start
if timer_segment_start + last_segment_end + min_seconds_limit > time.time():
if last_processing_memory.capturing_to and last_processing_memory.capturing_to.timestamp() > now_ts:
The logic for determining if processing_memory should be set to last_processing_memory has changed significantly. The new code checks if the capturing_to timestamp is greater than the current time. This seems reasonable, but ensure that this change doesn't break any existing functionality that relied on the old logic.
processing_memory_id=processing_memory.id,
memory_id=processing_memory.memory_id,
),
now_ts = time.time()
capturing_to = datetime.fromtimestamp(now_ts + min_seconds_limit, timezone.utc)

processing_memories_db.update_processing_memory_segments(
uid, processing_memory.id,
list(map(lambda m: m.dict(), processing_memory.transcript_segments))
list(map(lambda m: m.dict(), processing_memory.transcript_segments)),
capturing_to,
)

# Update processing memory status
processing_memory.status = ProcessingMemoryStatus.Processing
processing_memories_db.update_processing_memory_status(
uid, processing_memory.id,
processing_memory.status,
)
# Message: processing memory status changed
ok = await _send_message_event(ProcessingMemoryStatusChanged(
event_type="processing_memory_status_changed",
processing_memory_id=processing_memory.id,
processing_memory_status=processing_memory.status),
)
if not ok:
print("Can not send message event processing_memory_status_changed")
This block of code introduces several changes including updating processing memory segments with a new capturing_to parameter, updating processing memory status, and sending a new ProcessingMemoryStatusChanged event. Ensure that these operations are handled correctly in their respective functions and that all consumers of the new event are updated to handle it. Also, consider error handling for the case when these operations fail.
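A rough sketch of the suggested error handling, using only the names visible in the diff above and assuming it runs inside the existing async handler where they are in scope. The choice to log and skip the event when the Firestore update fails is illustrative, not part of the PR:
new_status = ProcessingMemoryStatus.Processing
try:
    processing_memories_db.update_processing_memory_status(
        uid, processing_memory.id, new_status,
    )
except Exception as e:
    # Firestore update failed: leave the in-memory status unchanged and log the error.
    print(f"Can not update processing memory status: {e}")
else:
    processing_memory.status = new_status
    # Only announce the change once it has actually been persisted.
    ok = await _send_message_event(ProcessingMemoryStatusChanged(
        event_type="processing_memory_status_changed",
        processing_memory_id=processing_memory.id,
        processing_memory_status=processing_memory.status,
    ))
    if not ok:
        print("Can not send message event processing_memory_status_changed")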
# Update processing memory status
processing_memory.status = ProcessingMemoryStatus.Done
processing_memories_db.update_processing_memory_status(
uid, processing_memory.id,
processing_memory.status,
)
# Message: processing memory status changed
ok = await _send_message_event(ProcessingMemoryStatusChanged(
event_type="processing_memory_status_changed",
processing_memory_id=processing_memory.id,
memory_id=processing_memory.memory_id,
processing_memory_status=processing_memory.status),
)
if not ok:
print("Can not send message event processing_memory_status_changed")
This block of code updates the processing memory status to Done and sends a ProcessingMemoryStatusChanged event. Ensure that these operations are handled correctly in their respective functions and that all consumers of the new event are updated to handle it. Also, consider error handling for the case when these operations fail.
Issue: #868
Summary by Entelligence.AI