fix: large memory consumption while parsing corrupted PE.
After #2119, `yara` consumes a large amount of memory while parsing 9bddb45c44d9c25a4f97ef800cb110de5e6a15349bac05d389c8bda37902f25a. The change stopped limiting the total number of imported functions and limited only the number of parsing attempts, and that count is reset with each import entry. This file has a large number of entries and a large number of functions per entry, so the total number of functions is very high.

It turns out that we must limit both the total number of correctly parsed functions (for cases like this one) and the total number of parsing attempts (for cases like the one #2119 was aiming to solve).
plusvic committed Nov 28, 2024
1 parent b61ce3c commit 04e8abe
Showing 1 changed file with 4 additions and 2 deletions.
--- a/libyara/modules/pe/pe.c
+++ b/libyara/modules/pe/pe.c
@@ -849,7 +849,8 @@ static IMPORT_FUNCTION* pe_parse_import_descriptor(
 
   while (struct_fits_in_pe(pe, thunks64, IMAGE_THUNK_DATA64) &&
          yr_le64toh(thunks64->u1.Ordinal) != 0 &&
-         parsed_imports < MAX_PE_IMPORTS)
+         parsed_imports < MAX_PE_IMPORTS &&
+         *num_function_imports < MAX_PE_IMPORTS)
   {
     char* name = NULL;
     uint16_t ordinal = 0;
@@ -939,7 +940,8 @@ static IMPORT_FUNCTION* pe_parse_import_descriptor(
 
   while (struct_fits_in_pe(pe, thunks32, IMAGE_THUNK_DATA32) &&
          yr_le32toh(thunks32->u1.Ordinal) != 0 &&
-         parsed_imports < MAX_PE_IMPORTS)
+         parsed_imports < MAX_PE_IMPORTS &&
+         *num_function_imports < MAX_PE_IMPORTS)
   {
     char* name = NULL;
     uint16_t ordinal = 0;
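
For context on why both bounds are needed: `parsed_imports` is reset for each import descriptor, while `*num_function_imports` accumulates across all descriptors, so a file with many descriptors can blow past any per-descriptor limit. Below is a minimal, standalone sketch of this dual-limit pattern; the cap value and the driver code are illustrative only, not taken from the commit.

    #include <stdio.h>

    #define MAX_PE_IMPORTS 16384  /* illustrative cap; the real value is defined in yara's headers */

    /* Parse one descriptor's thunk array. `parsed_imports` counts attempts
       within this descriptor only; `*num_function_imports` is the running
       total of functions recorded across all descriptors. */
    static void parse_descriptor(int thunk_count, int* num_function_imports)
    {
      int parsed_imports = 0;  /* reset for every descriptor */

      for (int i = 0; i < thunk_count; i++)
      {
        /* The per-descriptor limit alone is not enough: a corrupted PE
           with many descriptors restarts `parsed_imports` from zero each
           time, so the global total must be bounded too. */
        if (parsed_imports >= MAX_PE_IMPORTS ||
            *num_function_imports >= MAX_PE_IMPORTS)
          break;

        parsed_imports++;           /* one more parsing attempt */
        (*num_function_imports)++;  /* one more function recorded */
      }
    }

    int main(void)
    {
      int total = 0;

      /* 100000 descriptors with 1000 thunks each: without the global
         bound, 100 million functions would be recorded; with it, at
         most MAX_PE_IMPORTS. */
      for (int d = 0; d < 100000; d++)
        parse_descriptor(1000, &total);

      printf("functions recorded: %d\n", total);  /* prints 16384 */
      return 0;
    }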
