I am not really sure that the size of the regex is your problem here. My Java solution uses a similar-sounding approach, and my longest expression is ~117kB (sic!) for 17. A couple of other expressions are also in the area of a few tens of kilobytes.
My solution does not time out even though I do no particular simplification; I use capturing groups and backtracking quantifiers. That means my expressions are TOTALLY not optimized; it's maybe just the order of eliminated nodes that controls the size of my expressions. But in other languages, with other regex libraries, it could be necessary to optimize the expression a bit. AFAIK Python has a limit on the number of capturing groups, and maybe you could use non-backtracking (possessive) quantifiers to reduce backtracking in the regex?
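For what it's worth, one of the tweaks mentioned above can be sketched in Python; the patterns here are made-up placeholders, not the kata's actual expressions:

```python
import re

# Capturing vs. non-capturing groups: the latter avoid Python's limit
# on the number of capturing groups and do less bookkeeping per match.
capturing = re.compile(r"(ab|cd)+ef")       # records group 1 on every repetition
noncapturing = re.compile(r"(?:ab|cd)+ef")  # same language, no group recorded

print(capturing.groups, noncapturing.groups)  # 1 0
```

Swapping `( )` for `(?: )` does not change what the expression matches, only what it captures, so it is a safe first step when a regex engine complains about group counts.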
First, check that you don't use capturing groups (even if you're not using Python).
Second... I cannot really tell (since I solved it building my reasoning from scratch), but since you mentioned this kind of concept, I'd begin by checking the "reduction" algorithm for that kind of thing specifically.
If you replace a letter, it implies that you have to delete one letter and add another: (cost 2)
If you simply add or delete: (cost 1)
This is a well-known approach called minimum edit distance.
I passed all tests but the one we are talking about.
e.g.
rkacypviuburk to zqdrhpviqslik, cost: 16, because you have to delete and add 8 letters
rkacypviuburk to karpscdigdvucfr, cost: 12, because you have to delete and insert 5 letters and also insert 2 more
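A minimal sketch of that cost model (a replacement counts as one delete plus one insert, so it costs 2; the function name is mine, not from the kata):

```python
def edit_cost(a, b):
    """Minimum edit distance where insert/delete cost 1 and replace costs 2."""
    m, n = len(a), len(b)
    prev = list(range(n + 1))  # cost of turning "" into b[:j]
    for i in range(1, m + 1):
        cur = [i] + [0] * n    # cost of turning a[:i] into ""
        for j in range(1, n + 1):
            replace = prev[j - 1] + (0 if a[i - 1] == b[j - 1] else 2)
            cur[j] = min(replace, prev[j] + 1, cur[j - 1] + 1)
        prev = cur
    return prev[n]

print(edit_cost("rkacypviuburk", "zqdrhpviqslik"))  # 16
```

With replace = delete + insert, the cost works out to len(a) + len(b) minus twice the longest common subsequence, which matches the first example above.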
Of course C is also checking this "Low-ace straight", contradicting the instructions... edit: I mean in the sample tests, anyway. I don't get that far in real tests yet ;)
It doesn't modify the arrays, but creates a new array. sorted will be delegated to Arrays.sort() here, and Arrays.sort() is O(n log n), which is more than O(4n).
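The same point can be illustrated in Python, for comparison (an analogy to the Java discussion above, not the code in question): sorted() likewise builds a new list in O(n log n) and leaves its input untouched.

```python
nums = [3, 1, 2]
result = sorted(nums)  # builds and returns a new list in O(n log n)

assert result == [1, 2, 3]
assert nums == [3, 1, 2]  # the original list is not modified
```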
You can safely assume that if it can be up to 2e6, it can be down to -2e6.
Mentioned case-sensitivity in description
Same goes for the test you are addressing!
me too. damn!
See https://stackoverflow.com/questions/1247486/list-comprehension-vs-map; in this case map is best practice.
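To illustrate the trade-off from that link (a toy comparison, not the kata solution):

```python
words = ["alpha", "beta", "gamma"]

# map reads cleanly when the function already exists...
with_map = list(map(len, words))
# ...while a comprehension wins when you would otherwise need a lambda.
with_comp = [len(w) for w in words]

assert with_map == with_comp == [5, 4, 5]
```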
Oh damn... I thought n was the index in the Fibonacci sequence.