0 votes
by (130 points)

We have a problem with the latest version of the Rebex HTTPS library (R5.5 for .NET CF). It seems to leak memory slowly in long-running tests when using HttpRequest/HttpResponse (memory drops by approximately 1.1~1.4 MB/day). When we use the standard HttpWebRequest/HttpWebResponse instead, the problem does not appear.

Platform: ARM(TI:AM335x, NXP:i.MX6)
OS: Windows CE7 with .NET CF 3.5
Condition: an HTTPS POST request is sent every minute.
Current test result: memory drops by 6.3 MB over 130 hours

Rebex Http Setting:

        private void InitHttpRequest()
        {
            HttpRequestCreator creator = new HttpRequestCreator();
            creator.Settings.SslAllowedVersions = TlsVersion.TLS12;
            creator.Settings.SslAllowedSuites = TlsCipherSuite.Fast;
        }
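For completeness, this is roughly how we wire the creator into WebRequest and post each request (a sketch only; the Register() call and the HttpRequest/HttpResponse casts follow the Rebex samples, and the URL is a placeholder):

```csharp
// Assumes the creator configured above; Register() makes subsequent
// WebRequest.Create("https://...") calls return Rebex HttpRequest objects.
creator.Register();

HttpRequest request = (HttpRequest)WebRequest.Create("https://example.com/api");
request.Method = "POST";
using (HttpResponse response = (HttpResponse)request.GetResponse())
using (Stream stream = response.GetResponseStream())
{
    // Drain the body so the connection is released cleanly after each request.
    byte[] buffer = new byte[4096];
    while (stream.Read(buffer, 0, buffer.Length) > 0) { }
}
```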


I found an old question.

I would like to confirm whether this is a similar problem.
I am also trying to disable the HTTP session cache for a further test.

Applies to: Rebex HTTPS

2 Answers

0 votes
by (70.4k points)

We fixed all known memory issues in version 2019R3.7.

Can you please measure memory performance using this version?
Please let us know whether the memory leak is present in version 2019R3.7 as well.

by (130 points)

I tried the latest version, R5.5, with the HTTP session cache disabled, but it still leaks memory (memory drops 6.5 MB over 80 hours).

I also tried version 2019R3.7, but the memory leak is present in that version as well (memory drops 5 MB over 60 hours).
by (70.4k points)
Thank you for the information.

I forgot to mention that there is a known memory leak in WinCE7 itself: the Windows Crypto API repeatedly commits, uses and then leaks 65,535 bytes. This is not the only known memory leak - please see the page http://kbupdate.info/windows-embedded-compact-7-fix.php and search for the string "memory leak".

Can you please send your measuring project to support@rebex.net, so we can reproduce the issue on our side?
We can then identify the cause of the memory leak and tell you more.
by (130 points)
edited by

According to our latest test results, the memory leak problem seems to be related to cipher suites and TLS reconnection.

Our setting:
creator.Settings.SslAllowedVersions = TlsVersion.TLS12;
creator.Settings.SslAllowedSuites = RSA_xxx or DHE_RSA_xxx;
creator.Settings.HttpSessionCacheEnabled = false;

   * NG cipher suites:

   * OK cipher suites:

However, the performance of these DHE key-exchange cipher suites is too poor to be acceptable.

We will provide a simplified test program for the memory leak problem as soon as possible.

Another question: according to our tests,

creator.Settings.SslAllowedVersions = TlsVersion.TLS12
creator.Settings.SslAllowedSuites = TlsCipherSuite.Fast;

these settings do not seem to support the ECDHE_xxx cipher suite series on WinCE7. Is that expected?

What settings should we use to enable the ECDHE_xxx series on WinCE7?
by (70.4k points)
That's good news. It seems the ECDHE_xxx ciphers could resolve the problem. However, a test program reproducing the issue with the RSA_xxx suites would be very helpful, so we know what is actually causing it.

The ECDHE_xxx ciphers are only available on devices with native support for them. For other devices, you can use our ECC plugins. The plugins are based on open-source implementations with various licenses, so please check whether they are usable for you.

You can download ECC plugins from https://www.rebex.net/kb/simple-elliptic-curve-libraries/

More about ECC plugins at https://www.rebex.net/kb/elliptic-curve-plugins/
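Based on the KB articles linked above, enabling ECDHE suites on a device without native ECC support comes down to registering a plugin implementation before the first TLS connection. A sketch (the registration pattern follows the Rebex KB samples; the exact plugin class name depends on which plugin you download, so treat EllipticCurveAlgorithm here as an assumption):

```csharp
// Register an ECC plugin so the TLS layer can offer ECDHE_xxx suites
// on WinCE7, where the native Crypto API lacks ECC support.
// Call this once at startup, before any HTTPS request is made.
AsymmetricKeyAlgorithm.Register(EllipticCurveAlgorithm.Create);
```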
by (130 points)

I have sent you an e-mail with the attached test program.

Could you reproduce the memory leak issue on WinCE7?
by (70.4k points)
Thank you for the test program. I am currently working on it.
It seems that there is a 64KB memory leak, which occurs from time to time (roughly once in 100 requests) on one of our test devices.
I am still in the middle of investigation. I will keep you informed about my findings.
by (130 points)

Do you have further test results about this issue?
by (70.4k points)
Hello, I have finished the memory measurements today.
I have emailed you my findings.
by (70.4k points)
Do you have further test results using the updated test project that I sent you by email on 13th December?
Did you try to call GC.Collect() + GC.WaitForPendingFinalizers() in your application?
by (130 points)
We have tried calling GC.Collect() + GC.WaitForPendingFinalizers() in our actual application, but it still leaks memory slowly. We are running further tests and modifying our test program to match the actual application.
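For reference, the forced-collection sequence we call between requests is the standard two-pass pattern (plain .NET, nothing Rebex-specific):

```csharp
// Force a full collection, run finalizers so native handles held by dead
// objects are actually released, then collect the objects those
// finalizers have just freed.
GC.Collect();
GC.WaitForPendingFinalizers();
GC.Collect();
```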
0 votes
by (70.4k points)
edited by

I have run the test project in a couple more scenarios to ensure we did not miss anything. I measured 10,000 HTTPS requests. In all cases, the memory is stable.

I measured 3 values:
1. Allocated memory reported by Garbage Collector (informative).
2. Available virtual memory.
3. Available physical memory on device (informative).
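A sketch of how these values can be sampled on .NET CF: GC.GetTotalMemory is standard, while available virtual and physical memory on Windows CE come from the native GlobalMemoryStatus API in coredll.dll via P/Invoke (the declaration below follows the documented coredll export; treat it as an assumption, not part of our actual measuring code):

```csharp
[StructLayout(LayoutKind.Sequential)]
struct MEMORYSTATUS
{
    public uint dwLength;
    public uint dwMemoryLoad;
    public uint dwTotalPhys;
    public uint dwAvailPhys;
    public uint dwTotalPageFile;
    public uint dwAvailPageFile;
    public uint dwTotalVirtual;
    public uint dwAvailVirtual;
}

// On Windows CE this API lives in coredll.dll (kernel32.dll on desktop Windows).
[DllImport("coredll.dll")]
static extern void GlobalMemoryStatus(ref MEMORYSTATUS status);

static void SampleMemory()
{
    long managed = GC.GetTotalMemory(false);  // 1. allocated managed memory (informative)

    MEMORYSTATUS ms = new MEMORYSTATUS();
    ms.dwLength = (uint)Marshal.SizeOf(typeof(MEMORYSTATUS));
    GlobalMemoryStatus(ref ms);

    uint availVirtual = ms.dwAvailVirtual;    // 2. available virtual memory
    uint availPhys = ms.dwAvailPhys;          // 3. available physical memory (informative)
    Console.WriteLine("GC={0} virt={1} phys={2}", managed, availVirtual, availPhys);
}
```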

I measured 4 scenarios:
1. System HTTPS - for reference.
2. TLS session cache enabled, HTTP session cache disabled - a new connection is established for each request, but TLS sessions are resumed, causing minimal TLS overhead and certificate validation (measures the most typical scenario).
3. TLS session cache disabled, HTTP session cache disabled - a new connection with a complete TLS negotiation and certificate validation is established for each request (measures leaks in the whole process).
4. TLS session cache enabled, HTTP session cache enabled - connections are reused for subsequent requests (measures leaks in the HTTP communication).

The measured data with charts is available to download here.

The charts:

Notes to charts:
- The system HTTPS uses TLS 1.0 and probably uses all kinds of caching (TLS, HTTP, certificate validation). It is included not to compare efficiency, but to compare the memory profile.
- The 4th chart (all caching enabled) reflects a situation where every 100th request is forced to establish a new connection with a full TLS negotiation (the server closes the connection after reusing it for 100 HTTPS requests).
- The measured physical memory is informative only, since it is affected by other processes on the device (not only by the test application).

- We were not able to identify any further memory leaks in the current version.
- The 64KB memory leaks spotted in the early stage of the measurements are present in the system HTTPS measurements as well. The charts show that they occur "frequently" at the beginning of the measurements, but only very rarely later in the process.
- I will run the test app for more than 24h to see how often the 64KB leaks occur. Since you reported an average leak of about 1 MB/day, it could be caused by 16 leaks of 64KB each.

Just for reference, I am including the memory profile from our measurements before version 2019R3.7, which identified the (now fixed) memory leak:

Reference chart