This is a pretty fundamental refactor of the way
CppComprehensionEngine works.
Previously, in order to answer queries such as "goto definition" or
"autocomplete", we would use ad-hoc logic: walking the AST, collecting
the available declaration nodes, computing scopes, and so on.
This commit introduces an architectural change where each Document
builds a hashmap of symbols on creation.
With these hashmaps, it's easier to iterate over all of the available
symbols, and to answer a query such as "which symbols are defined in
this scope".
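A standalone sketch of the idea (illustrative types only, not the actual
CppComprehensionEngine code): each document owns a map from a qualified
symbol name to its declaration, built once on creation and reused for
every query.

    #include <cstddef>
    #include <string>
    #include <unordered_map>
    #include <vector>

    struct Symbol {
        std::string name;        // e.g. "my_function"
        std::string scope;       // e.g. "MyNamespace::MyClass"
        size_t declaration_line { 0 };
    };

    struct DocumentSymbols {
        // Keyed by the fully qualified name ("scope::name").
        std::unordered_map<std::string, Symbol> symbols;

        // "Which symbols are defined in this scope?" becomes a simple scan.
        std::vector<Symbol> symbols_in_scope(std::string const& scope) const
        {
            std::vector<Symbol> result;
            for (auto const& [name, symbol] : symbols) {
                if (symbol.scope == scope)
                    result.push_back(symbol);
            }
            return result;
        }
    };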
We currently only support application/x-www-form-urlencoded for
form submissions, which uses a special percent encode set when
percent encoding the body/query. However, we were not using this
percent encode set.
With the new URL implementation, we can now specify the percent encode
set to be used, allowing us to use this special percent encode set.
This is one of the fixes needed to make the Google cookie consent work.
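A simplified, standalone sketch of the idea (the character sets and names
below are rough approximations for illustration, not the actual AK::URL
API): percent encoding takes the encode set as a parameter, so form
submissions can ask for the stricter application/x-www-form-urlencoded set.

    #include <cctype>
    #include <cstdio>
    #include <string>

    enum class PercentEncodeSet {
        Query,
        ApplicationXWWWFormUrlencoded,
    };

    static bool must_encode(unsigned char c, PercentEncodeSet set)
    {
        if (set == PercentEncodeSet::ApplicationXWWWFormUrlencoded) {
            // The form-urlencoded set escapes everything except ASCII
            // alphanumerics and a few safe characters.
            return !(std::isalnum(c) || c == '*' || c == '-' || c == '.' || c == '_');
        }
        // The query set is far more permissive; only a handful of characters
        // (plus controls and non-ASCII, omitted here) are escaped.
        return c == ' ' || c == '"' || c == '#' || c == '<' || c == '>';
    }

    std::string percent_encode(std::string const& input, PercentEncodeSet set)
    {
        std::string output;
        for (unsigned char c : input) {
            if (must_encode(c, set)) {
                char buffer[4];
                std::snprintf(buffer, sizeof(buffer), "%%%02X", c);
                output += buffer;
            } else {
                output += static_cast<char>(c);
            }
        }
        return output;
    }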
find_and_highlight() selected one byte too many.
'Select All' selected one byte too many, extending one byte past the end
of the buffer.
The status bar's 'Selected Bytes' count was one too low when more than
zero bytes were selected.
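A minimal sketch of the inclusive-range arithmetic behind these fixes
(hypothetical helpers, not the actual HexEditor code):

    #include <cstddef>

    // With an inclusive selection [start, end], the number of selected bytes
    // is end - start + 1 ...
    size_t selected_byte_count(size_t selection_start, size_t selection_end_inclusive)
    {
        return selection_end_inclusive - selection_start + 1;
    }

    // ... and the last selectable index is buffer_size - 1; selecting up to
    // buffer_size would be one byte too many.
    size_t last_selectable_index(size_t buffer_size)
    {
        return buffer_size - 1;
    }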
This percent-encodes/decodes the request URI when creating or parsing raw
HTTP requests. This is necessary because AK::URL now contains
percent-decoded data, meaning we have to re-encode it when creating raw
requests.
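A rough standalone sketch of the create side of this (illustrative names,
not the actual LibHTTP code): since the URL object stores its path
percent-decoded, the path has to be re-encoded before it is written into
the raw request line.

    #include <cctype>
    #include <cstdio>
    #include <string>

    // Minimal re-encoder: escape bytes that cannot appear verbatim in a
    // request-target (the exact safe set is simplified here).
    static std::string percent_encode_path(std::string const& decoded_path)
    {
        std::string encoded;
        for (unsigned char c : decoded_path) {
            bool safe = std::isalnum(c) || c == '/' || c == '-' || c == '.' || c == '_' || c == '~';
            if (safe) {
                encoded += static_cast<char>(c);
            } else {
                char buffer[4];
                std::snprintf(buffer, sizeof(buffer), "%%%02X", c);
                encoded += buffer;
            }
        }
        return encoded;
    }

    std::string build_request_line(std::string const& method, std::string const& decoded_path)
    {
        return method + " " + percent_encode_path(decoded_path) + " HTTP/1.1\r\n";
    }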
This removes URLParser, because its two exposed functions, urlencode()
and urldecode(), have been superseded by URL::percent_encode() and
URL::percent_decode(). This is in preparation for the introduction of a
new URL parser.
This replaces all occurrences of urlencode() and urldecode() with the
newly implemented URL::percent_encode() and URL::percent_decode(). The old
functions will be removed in a later commit.
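For illustration, the mechanical shape of the replacement (the exact AK
signatures are deliberately not spelled out here):

    // Before:
    //     urlencode(some_string)       ->  percent-encoded string
    //     urldecode(encoded_string)    ->  percent-decoded string
    //
    // After:
    //     URL::percent_encode(some_string)
    //     URL::percent_decode(encoded_string)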
When we don't have a matching card for the lead card, rather than always
preferring to play hearts, we should try to get rid of our high-value
cards first if no other player has hearts higher than the ones we hold.
When we're the third player in a trick and we don't have a lower-value
card, we would previously pick a slightly higher-value card. Instead, we
should pick the highest-value card, unless there are points in the current
trick, or the lead card is spades and the higher-value card we would've
picked is higher than the queen while another player still has the queen.
The rationale is that we have to take the trick anyway, so we might as
well get rid of our highest-value card. If the trailing player has a
lower-value card of the same suit, we take the trick but don't gain any
points. If they don't have a card of the same suit, it doesn't matter
whether we play a high-value or a low-value card.
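A hypothetical sketch of that third-player heuristic (illustrative types
and names, not the actual Hearts AI code):

    #include <algorithm>
    #include <vector>

    struct Card {
        int value { 0 }; // 2..14; 12 is the queen
    };

    // Precondition: matching_cards is non-empty and every card in it is
    // higher than the cards already played to the trick.
    Card pick_third_player_card(std::vector<Card> const& matching_cards,
                                bool points_in_trick,
                                bool lead_is_spades,
                                bool queen_of_spades_still_in_play)
    {
        auto by_value = [](Card const& a, Card const& b) { return a.value < b.value; };
        Card highest = *std::max_element(matching_cards.begin(), matching_cards.end(), by_value);
        Card lowest = *std::min_element(matching_cards.begin(), matching_cards.end(), by_value);

        // Dumping the highest card is only risky if the trick already carries
        // points, or if we'd play above the queen of spades while another
        // player could still drop her on us.
        bool risky = points_in_trick
            || (lead_is_spades && highest.value > 12 && queen_of_spades_still_in_play);
        return risky ? lowest : highest;
    }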
Previously, the AI would prefer playing a lead card for which no other
player had a higher-value card, even though it also had a card for which
a higher-value card was still in play.
Previously, we didn't check that the selection's row index was in a valid
range before attempting to access its data via the model.
This could cause an out-of-bounds access to the model's Vector of
suggestions.
I think this should fix #7404, but I can't verify that it does because I
wasn't able to reproduce it on my machine.
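A standalone sketch of the kind of guard this adds (illustrative names,
not the actual AutocompleteBox code):

    #include <cstddef>
    #include <optional>
    #include <string>
    #include <vector>

    struct SuggestionModel {
        std::vector<std::string> suggestions;
        int row_count() const { return static_cast<int>(suggestions.size()); }
    };

    std::optional<std::string> selected_suggestion(SuggestionModel const& model, int selected_row)
    {
        // Without this range check, a stale selection index could read past
        // the end of the suggestions vector.
        if (selected_row < 0 || selected_row >= model.row_count())
            return {};
        return model.suggestions[static_cast<std::size_t>(selected_row)];
    }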
JPGLoader used to store component information in a HashTable, indexed by
the ID assigned by the JPEG file. This was fine for most purposes;
however, after f89e8fb7 it was revealed to be a flawed implementation that
causes non-deterministic iteration over the components.
This issue was previously masked by a perfect storm: int_hash being stable
for the integer values 0, 1 and 2, and AK::HashTable having just the right
number of buckets for the components to be ordered correctly after being
hashed with int_hash. However, after f89e8fb7, malloc_good_size was used
to determine the amount of space to allocate; this caused the ordering of
the components to change, and
images started showing up with the red and blue channels reversed. The
issue was finally determined to be inconsistent ordering after randomly
changing the order of the components caused Huffman decoding to fail.
This was the result of about 10 hours of hair-pulling and repeatedly
doing full rebuilds due to bisecting between commits that touched AK.
Gunnar, I like you, but please don't make me go through this again. :^)
Credits to Andrew Kaster, bgianf, CxByte and Gunnar for the debugging
help.
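For illustration only (not the JPGLoader fix verbatim): iterating a hash
container keyed by component ID gives no guaranteed order, so any step
that implicitly assumes "Y, then Cb, then Cr" can suddenly see the
channels in a different order once the hashing or bucket count changes.
An order-preserving container sidesteps that entirely.

    #include <unordered_map>
    #include <vector>

    struct Component {
        int id { 0 }; // ID assigned by the JPEG file
        // ... sampling factors, quantization table index, etc.
    };

    // Iteration order depends on the hash function and the bucket count:
    using HashedComponents = std::unordered_map<int, Component>;

    // Iteration order is exactly the insertion (i.e. file) order:
    using OrderedComponents = std::vector<Component>;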
The Context and Software Rasterizer now get the array of texture units
instead of a single texture object. _Technically_, we now support some
primitive form of multi-texturing, though I'm not entirely sure how well
it will work in its current state.
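A rough sketch of the shape of this change (illustrative names and unit
count, not the actual LibGL/rasterizer interfaces):

    #include <array>
    #include <cstddef>
    #include <memory>

    struct Texture { /* pixel data, sampler state, ... */ };

    struct TextureUnit {
        std::shared_ptr<Texture> bound_texture;
        bool enabled { false };
    };

    // The rasterizer now sees every unit instead of one bound texture.
    constexpr std::size_t texture_unit_count = 2; // illustrative count
    using TextureUnits = std::array<TextureUnit, texture_unit_count>;

    struct SoftwareRasterizer {
        // Previously something like: void draw(Texture const&);
        void draw(TextureUnits const& texture_units)
        {
            for (auto const& unit : texture_units) {
                if (!unit.enabled || !unit.bound_texture)
                    continue;
                // ... sample from each enabled unit and combine the results ...
            }
        }
    };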