
What is the maximum possible length of a query string?

Is it browser dependent? Also, do different web stacks have different limits on how much data they can get from the request?

You can also check this: stackoverflow.com/questions/417142/…
This applies only to GET requests! The maximum size of POST requests (with or without multipart/form-data) is not covered here!

Community

RFC 2616 (Hypertext Transfer Protocol — HTTP/1.1) states there is no limit to the length of a query string (section 3.2.1). RFC 3986 (Uniform Resource Identifier — URI) also states there is no limit, but indicates the hostname is limited to 255 characters because of DNS limitations (section 2.3.3).
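To illustrate the one hard limit the specs do imply, here is a minimal Python sketch (check_host_length is a hypothetical helper name, not a standard API):

# RFC 3986 caps the registered name (hostname) at 255 characters for DNS
# compatibility; the query string itself has no spec-level limit.
from urllib.parse import urlsplit

def check_host_length(url: str) -> None:
    host = urlsplit(url).hostname or ""
    if len(host) > 255:
        raise ValueError(f"hostname is {len(host)} chars; DNS allows at most 255")

check_host_length("https://example.com/search?q=" + "x" * 100_000)  # long query: fine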

While the specifications do not specify any maximum length, practical limits are imposed by web browser and server software. Based on research which is unfortunately no longer available on its original site (it now leads to a shady-looking loan site) but which can still be found at the Internet Archive's copy of Boutell.com (a rough probe sketch for reproducing this kind of measurement follows the list):

Microsoft Edge (Browser) The limit appears to be around 81,578 characters. See URL Length limitation of Microsoft Edge

Chrome (Browser) It stops displaying the URL after 64k characters, but can serve more than 100k characters. No further testing was done beyond that.

Firefox (Browser) After 65,536 characters, the location bar no longer displays the URL in Windows Firefox 1.5.x. However, longer URLs will work. No further testing was done after 100,000 characters.

Safari (Browser) At least 80,000 characters will work. Testing was not continued beyond that.

Opera (Browser) At least 190,000 characters will work. Stopped testing after 190,000 characters. Opera 9 for Windows continued to display a fully editable, copyable and pasteable URL in the location bar even at 190,000 characters.

Microsoft Internet Explorer (Browser) Microsoft states that the maximum length of a URL in Internet Explorer is 2,083 characters, with no more than 2,048 characters in the path portion of the URL. Attempts to use URLs longer than this produced a clear error message in Internet Explorer.

Apache (Server) Early attempts to measure the maximum URL length in web browsers bumped into a server URL length limit of approximately 4,000 characters, after which Apache produces a "413 Entity Too Large" error. The then-current Apache build found in Red Hat Enterprise Linux 4 was used. The official Apache documentation only mentions an 8,192-byte limit on an individual field in a request.

Microsoft Internet Information Server (Server) The default limit is 16,384 characters (yes, Microsoft's web server accepts longer URLs than Microsoft's web browser). This is configurable.

Perl HTTP::Daemon (Server) Up to 8,000 bytes will work. Those constructing web application servers with Perl's HTTP::Daemon module will encounter a 16,384 byte limit on the combined size of all HTTP request headers. This does not include POST-method form data, file uploads, etc., but it does include the URL. In practice this resulted in a 413 error when a URL was significantly longer than 8,000 characters. This limitation can be easily removed. Look for all occurrences of 16x1024 in Daemon.pm and replace them with a larger value. Of course, this does increase your exposure to denial of service attacks.
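
For reference, here is a rough Python sketch of how such limits can be probed (this illustration is mine, not part of the original research; it assumes a test server at http://localhost:8080, and the lengths tried are arbitrary):

# Rough probe: issue GETs with increasingly long query strings and report
# the status code, e.g. 413/414 once the server's limit is exceeded.
import urllib.error
import urllib.request

def probe(base_url: str, total_len: int) -> int:
    # Pad the query string so the full URL is total_len characters long.
    url = base_url + "?q=" + "a" * max(0, total_len - len(base_url) - 3)
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code  # the server answered, but rejected the URL

for n in (2_048, 8_192, 16_384, 65_536):
    print(n, probe("http://localhost:8080/", n))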


Why don't you also give the version number instead of just "Microsoft Internet Explorer (Browser)"?
It appears that the default IIS limit on the Query String is significantly less than 16,384 characters - quoted as 2048 here: iis.net/configreference/system.webserver/security/…
I think you made a typo and the DNS limitations are discussed in section "3.2.2. Host" of RFC3986, not 2.3.3. "URI producers should use names that conform to the DNS syntax, even when use of DNS is not immediately apparent, and should limit these names to no more than 255 characters in length."
This causes java.lang.IllegalArgumentException: Request header is too large on a Tomcat-based Spring Boot application server.
TroySteven

Recommended Security and Performance Max: 2048 CHARACTERS

Although officially there is no limit specified by RFC 2616, many security protocols and recommendations state that maxQueryString on a server should be set to a maximum of 1024 characters, while the entire URL, including the query string, should be capped at 2048 characters. This is to help prevent the Slow HTTP Request DDoS/DoS attack vulnerability on a web server. This typically shows up as a vulnerability on the Qualys Web Application Scanner and other security scanners.

Please see the example code below for Windows IIS servers using Web.config:

<system.webServer>
    <security>
        <requestFiltering>
            <requestLimits maxQueryString="1024" maxUrl="2048">
                <headerLimits>
                    <add header="Content-type" sizeLimit="100" />
                </headerLimits>
            </requestLimits>
        </requestFiltering>
    </security>
</system.webServer>

This would also work on a server level using machine.config.

This is just for Windows-based servers; I'm not sure whether there is a similar setting for Apache or other servers.

Note: Limiting query string and URL length may not completely prevent Slow HTTP Request DDoS attacks, but it is one step you can take to mitigate them.
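
For illustration, the same limits can also be enforced in application code; here is a minimal, framework-agnostic sketch as Python WSGI middleware (the 1024/2048 thresholds mirror the recommendation above, and the combined-length check is a rough approximation):

# Reject requests whose query string or overall URL exceeds the recommended caps.
MAX_QUERY = 1024
MAX_URL = 2048

class LimitUrlLength:
    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        query = environ.get("QUERY_STRING", "")
        # Approximate the URL length as path + "?" + query.
        url_len = len(environ.get("PATH_INFO", "")) + 1 + len(query)
        if len(query) > MAX_QUERY or url_len > MAX_URL:
            # 414 is the status code defined for over-long request URIs.
            start_response("414 URI Too Long", [("Content-Type", "text/plain")])
            return [b"URI too long"]
        return self.app(environ, start_response)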

Adding a reference as requested in the comments: https://www.raiseupwa.com/writing-tips/what-is-the-limit-of-query-string-in-asp-net/


And now I have a reason I can tell the backend engineers that we won't accept a list of one hundred 36-character UUIDs in the queryParams of a GET request. Thanks!
@Mordred, what is this API for - that takes in 100 UUIDs in query params? Is it a kind of Filtering UI?
@MaulikModi Yes. It was essentially a "simple" backend query of /get/records-by-id?ids=10000000001,1000000002,.... but the IDs were UUIDs of course.
@Mordred - Best solution I guess is to limit the number of UUIDs in the request. I think that putting the UUIDs in the query, while ugly, is the best practice. Some database engines such as ElasticSearch accept the UUIDs in the body of a GET request, but that is not standardized, and some web frameworks ignore the body on a GET. I also commonly see APIs use a POST request instead to send the UUIDs, which has other downsides: GET is fundamentally different from POST, so you end up breaking some of the functionality, such as caching, that was designed for GET requests.
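For what it's worth, here is a small Python sketch of the batching idea discussed above: split the ID list so that each GET URL stays under a 2,048-character budget (the base_url and the ids= parameter name are illustrative, not from the API in question):

import uuid

def batch_urls(ids, base_url="https://api.example.com/records?ids=", max_len=2048):
    # Greedily pack comma-separated IDs into URLs no longer than max_len.
    batches, batch, length = [], [], len(base_url)
    for i in ids:
        extra = len(i) + (1 if batch else 0)  # +1 for the comma separator
        if batch and length + extra > max_len:
            batches.append(batch)
            batch, length = [], len(base_url)
            extra = len(i)
        batch.append(i)
        length += extra
    if batch:
        batches.append(batch)
    return [base_url + ",".join(b) for b in batches]

urls = batch_urls([str(uuid.uuid4()) for _ in range(100)])  # 100 UUIDs fit in 2 URLs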
Is there any way to get references to some example security recommendations regarding this?
kdevine

Different web stacks support different HTTP request lengths. I know from experience that the early stacks of Safari only supported 4,000 characters and thus had difficulty handling ASP.NET pages because of the USER-STATE. This applied even to POST, so you would have to check the browser and see what the stack limit is. I think that you may reach a limit even on newer browsers. I cannot remember exactly, but one of them (IE6, I think) had a 16-bit limit: 32,768 characters or something.


Yogi Ghorecha

2048 CHARACTERS

Although officially there is no limit specified by RFC 2616, many security protocols and recommendations state that maxQueryString on a server should be set to a maximum of 1024 characters, while the entire URL, including the query string, should be capped at 2048 characters. From an SEO standpoint, a URL is sometimes considered too long once it exceeds about 100 characters: an overly long URL can cause both usability and search engine problems, since any potential benefit you may have by including keywords is diluted when they are such a small percentage of the total URL text.


Could you please share some references for these security protocol recommendations?