This time I managed to do my best through problem D. I usually struggle to concentrate, but during contests I seem to be able to focus more and more. If I also build up a bit more command of the typical techniques, my floor should stabilize and my rating should start rising again. I want to reach blue in July, but that will depend on my effort and my mentality.
Just compute it as written: the answer is $a + a^2 + a^3$.
A.py
a=int(input())
print(a*(1+a+a**2))
We check, for each position, whether the characters are equal, and count the positions where they differ. Since a boolean is 0 or 1, the comparisons can simply be added up as they are. You could probably write this more concisely with itertools or map (see the sketch after the code).
B.py
s=input()
t=input()
ans=0
for i in range(len(s)):
    ans+=(s[i]!=t[i])
print(ans)
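As mentioned above, the same count can be written more compactly. This is just a sketch of that idea using zip and sum, not the code I submitted:

```python
# Pair up characters with zip and count the mismatches directly;
# True/False are treated as 1/0 by sum().
s = input()
t = input()
print(sum(a != b for a, b in zip(s, t)))
```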
My expectation has turned into conviction: the C problems seem to be chosen so that they cover techniques that are indispensable for competitive programming going forward. I like the recent C problems because they are a good review of things I tend to forget. Of course, it still makes me uneasy when I introduce a bug.
A purely greedy approach does not work: if a book that can be read quickly is buried near the bottom of a pile, taking only from the tops wastes it. Here there are two piles, **desk A and desk B, so the natural idea is to fix one of the two variables**. So suppose it takes $S_A$ minutes to read the first $i$ books from the top of desk A. Then, for the rest, take as many books as possible from the top of desk B that can be read within the remaining $K - S_A$ minutes.
The quantity needed repeatedly here is **how many minutes it takes to read the first several books**. To get it in $O(1)$, we should **precompute the cumulative sums of the reading times given in the input**, which I did before the main loop. (It also makes the implementation easier to prepend a 0 as the element meaning **0 books read from the top**.)
Also, since every reading time is positive, the cumulative sums form a monotonically increasing sequence, so the number of books that can be read within $K - S_A$ minutes from the top of desk B can be found with a binary search. This binary search wants the index of the **largest element that is at most** $K - S_A$, which is equivalent to **(the index of the smallest element strictly greater than $K - S_A$) - 1**, i.e. (the index returned by bisect_right) - 1. Note that when $K - S_A < 0$ the resulting index is -1, so that case has to be excluded; be careful.
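As a small illustration with toy numbers of my own (not the contest input), this is how the prepended 0, the cumulative sum, and the bisect_right-minus-one lookup fit together:

```python
from itertools import accumulate
from bisect import bisect_right

# Assumed reading times for desk B, for illustration only.
b_times = [80, 150, 80]
b = [0] + list(accumulate(b_times))   # [0, 80, 230, 310]

remaining = 240                       # plays the role of K - S_A above
# Largest index c with b[c] <= remaining, i.e. how many books fit.
c = bisect_right(b, remaining) - 1
print(c)                              # 2 (the first two books take 80 + 150 = 230 minutes)

# A negative remaining time yields -1 and must be excluded.
print(bisect_right(b, -30) - 1)       # -1
```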
From the above, this can be implemented in $O(N \times \log{M})$.
C.py
from itertools import accumulate
from bisect import bisect_left,bisect_right
n,m,k=map(int,input().split())
a=[0]+list(accumulate(map(int,input().split())))
b=[0]+list(accumulate(map(int,input().split())))
ans=[0]
for i in range(n+1):
    c=bisect_right(b,k-a[i])-1
    if c!=-1:
        ans.append(c+i)
print(max(ans))
I regret that I started off thinking in a slightly odd direction. First, enumerating all the divisors is $O(N \sqrt{N})$, which is clearly too slow; enumerating them with an Eratosthenes-style sieve, $O(N \log{N})$, would probably be fast enough in C++, but I avoided it (because I am not good at implementing it).
Once those two approaches are ruled out, there are not many options left. The thing to notice here is that **divisors are easier to handle when viewed through their multiples**. So I **experimented by writing out the multiples in ascending order**, which makes the following property clear. For each candidate divisor $i$, the numbers that have $i$ as a divisor are exactly its multiples $i, 2i, 3i, \dots$ up to $N$. To compute the sum of $K \times f(K)$ from $K = 1$ to $N$, it therefore suffices to sum, for every $i$, the multiples of $i$ that are at most $N$. For each $i$ these multiples form an arithmetic progression whose sum can be computed in $O(1)$, so doing this for all $i$ gives an $O(N)$ implementation.
✳︎ The sum of an arithmetic progression is (first term + last term) × (number of terms) / 2, where the number of terms is (last term - first term) / common difference + 1.
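For example (a small check of my own, not from the problem statement), with $N = 10$ and $i = 3$ the multiples are $3, 6, 9$: the number of terms is $(9 - 3)/3 + 1 = 3$, so the sum is $(3 + 9) \times 3 / 2 = 18$, which matches $3 + 6 + 9$.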
D.py
n=int(input())
ans=0
for i in range(1,n+1):
    x,y=i,(n//i)*i
    ans+=((n//i)*(x+y)//2)
print(ans)
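As a sanity check of my own (not part of the submission), the formula-based solution can be compared against a naive count of $f(K)$ for small $N$; here $f(K)$ is assumed to be the number of divisors of $K$, as in the explanation above.

```python
def naive(n):
    # Directly compute sum of K * f(K), counting divisors of each K one by one.
    total = 0
    for k in range(1, n + 1):
        divisors = sum(1 for d in range(1, k + 1) if k % d == 0)
        total += k * divisors
    return total

def fast(n):
    # Same idea as D.py: for each divisor i, add the arithmetic-progression
    # sum of its multiples i, 2i, ..., (n // i) * i.
    ans = 0
    for i in range(1, n + 1):
        x, y = i, (n // i) * i
        ans += (n // i) * (x + y) // 2
    return ans

for n in range(1, 30):
    assert naive(n) == fast(n)
print("OK")
```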
I am wondering whether to cover the inclusion-exclusion principle in a separate article or write it up in this one. I have already solved the problem and will update this later today.
I could not solve it because I only knew Nim by name. I think it is easy if you actually know Nim (I will solve it on Monday).