[PYTHON] 33 points a social game engineer emphasizes during code review [more added from time to time]

Code review for peaceful weekends

Social games tend to become dramatically overloaded, mainly on Saturdays and Sundays, because user access concentrates on weekends and user data grows along with it.

Problematic code may slip past load testing, depending on the scenarios you wrote. I think code review is also important for dispelling that kind of invisible anxiety.

[The nice points]

* By reading the source, you can defuse a time bomb before it explodes on the weekend.
* By sharing know-how and raising the skill level of the team, the probability of planting a time bomb that explodes on the weekend goes down.

Well, to sum it up, that's about it (according to my four-year-old son).

Now that we have a powerful review tool in the form of pull requests, the barrier to getting started is much lower, so I recommend it!

5 points to check

When doing code reviews I often hear people say *"I don't know what to check"*, so I've summarized the points I keep in mind when reviewing.

Check the following five points as a priority.

  1. Does it slow down execution?
  2. Does it break data integrity?
  3. Will it cause problems in operation?
  4. Does it reduce maintainability?
  5. Is it open to unauthorized access?

Check details

Below are the detailed checks for each category. See here for the prerequisite environment.

By the way, the importance of each check is indicated with an emoji:

* :rage: Absolutely have it fixed
* :scream: Have it fixed unless there is a good reason not to
* :confounded: Let it pass if there is a good reason

MySQL

The main check is **"is the executed SQL properly indexed?"**. I have the impression that many people neglect this, probably because the spread of ORMs has made the actual SQL harder to see. It has a large effect on execution speed, so we check it carefully.

Reference: Important points when running EXPLAIN

* :rage: **Is an index used by the executed SQL?** Retrieving data without an index slows down dramatically as the amount of data grows.
* :rage: **If a composite index is used, are all of its columns used in the WHERE clause?** If only some of the columns are used, queries slow down as the data grows. For complex SQL, run EXPLAIN and check whether key_len is appropriate. (A sketch of running EXPLAIN follows this list.)
* :rage: **Is a unique key defined on tables where consistency is important?** It prevents data from becoming inconsistent because of code defects.
* :rage: **If you use SELECT ... FOR UPDATE, have you considered gap locks?** Running SELECT ... FOR UPDATE against rows that do not exist takes a gap lock, which can lead to deadlocks.
* :scream: **Is the ORDER BY column also indexed?** If the sort column is not indexed, a filesort occurs and the query becomes slow.
* :scream: **Are there indexes on columns that are never used (unnecessary indexes)?** Many unnecessary indexes make INSERTs slower because every index has to be updated.
* :confounded: **Are you indexing columns with low cardinality?** Columns with low cardinality (lots of duplicate values) make the index inefficient and queries slow.
* :confounded: **Is a covering index being lost because unnecessary columns are selected?** InnoDB uses clustered indexes, so a query is faster when all selected columns are covered by the index and no lookup back to the clustered index is needed.
* :confounded: **Is the expected number of records in the table too large?** With a large amount of data, both queries and ALTER TABLE become slow. Check whether measures such as partitioning or sharding have been considered.
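As a concrete way to check the first two points, you can run EXPLAIN on the query under review. A minimal sketch, assuming pymysql and a made-up user_items table with a composite index on (user_id, item_id):


import pymysql

connection = pymysql.connect(
    host="localhost", user="app", password="secret",
    database="game", cursorclass=pymysql.cursors.DictCursor,
)

sql = "SELECT * FROM user_items WHERE user_id = %s AND item_id = %s"
with connection.cursor() as cursor:
    # EXPLAIN shows which index is chosen and how much of it is used
    cursor.execute("EXPLAIN " + sql, (1, 100))
    for row in cursor.fetchall():
        # key: the chosen index, key_len: how many bytes of the (composite)
        # index are actually used, rows: estimated rows examined
        print(row["key"], row["key_len"], row["rows"])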

Redis

The main thing to check is the **validity of the memory size being used**. If memory usage exceeds maxmemory, there is a concern that, depending on the maxmemory-policy setting, writes will start to fail or the hit rate will drop because keys keep getting evicted.

* :rage: **Is expire (TTL) set?** If no expire is set, memory keeps growing until it runs out. (A sketch of setting a TTL follows the pipelining example below.)
* :rage: **If you run multiple commands, are you using pipelining?** Executing many commands one at a time is slow.

Bad example


# redis_connect is assumed to be a redis.Redis client, data_dict a dict of key/value pairs
# Loop and register the keys one by one (one round trip per command)
for key, value in data_dict.items():
    redis_connect.set(key, value)

Good example: use pipelining


with redis_connect.pipeline() as pipe:
    # Queue the commands in the pipeline inside the loop,
    # then send them all to Redis at once with execute()
    for key, value in data_dict.items():
        pipe.set(key, value)
    pipe.execute()
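For the expire (TTL) point above, here is a minimal sketch of setting a TTL at write time, assuming the redis-py client (the key name and TTL value are made up):


import redis

redis_connect = redis.Redis(host="localhost", port=6379, db=0)

# Pass ex= (seconds) so the key is evicted automatically after one day
redis_connect.set("user:1:daily_ranking", 12345, ex=86400)

# setex(key, seconds, value) does the same thing in one call
redis_connect.setex("user:1:daily_ranking", 86400, 12345)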

* :scream: **Is the expected memory size too large?** If memory usage is too large for the servers you have, ask the author to consider using the Hash type. (A sketch follows this list.)
* :confounded: **Is Redis being used as storage (with persistence assumed) rather than as a cache?** Basically this is NG, but if it is unavoidable, for example because a sorted set is really needed, ask for a dedicated DB number. If such keys share a DB number with cache keys, it becomes hard to flush the cache in an emergency. Also check (outside the code review itself) whether AOF settings, a batch job to regenerate the data, and data persistence and recovery measures have been considered.
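For the memory-size point, one common technique is to pack many small values for a user into a single Hash instead of one string key per value; small hashes are stored in a compact encoding, so this often saves memory. A sketch assuming redis-py, with a made-up key layout:


import redis

redis_connect = redis.Redis(host="localhost", port=6379, db=0)

# One string key per item ("user:1:item:100", "user:1:item:101", ...)
# multiplies per-key overhead; grouping a user's items into a single hash
# keeps the key count down.
redis_connect.hset("user:1:items", mapping={"100": 3, "101": 1, "102": 7})

count = redis_connect.hget("user:1:items", "100")   # one field
all_items = redis_connect.hgetall("user:1:items")   # the whole hash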

Logic

* :rage: **If the fetched main data is processed in a loop, is there a speed problem even when the data becomes large?** If SQL is executed or a complicated calculation is done inside the loop body, it will slow down dramatically in production as master data and so on grows. Ask the author to consider whether the data used in the processing can be aggregated in advance.
* :rage: **Isn't the number of SQL executions too large?** Even if each query is fast, the response gets slow when they pile up, and it puts load on the DB server. Can the SELECTs be combined? Can the INSERTs be turned into a bulk insert? Have the author consider it. (A bulk insert sketch follows the enum example below.)
* :rage: **Are the important parts logged?** Without logs, you cannot investigate when a user sends an inquiry.
* :rage: **Is it okay for multiple users to touch the same data at the same time?** For example, when multiple users may update the same record, as in friend-approval processing, deadlocks can occur unless you use exclusive control or adjust the update order.
* :scream: **Is the wheel being reinvented? Is the code DRY?** If similar processing multiplies, you get extra man-hours and missed fixes when the specification changes.
* :scream: **Are there magic numbers?** With magic numbers, the code can only be read by someone who has memorized the specification, and when the value changes, some occurrences will be missed.

Bad example


if mission_type == 1331:
    # What kind of mission does this handle? Impossible to tell from the number
    ...

Good example: easy to understand with an enum


if mission_type == MissionType.PlayerLevel:
    # Clearly a player-level mission (MissionType is assumed to be an IntEnum defined elsewhere)
    ...
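For the point above about reducing the number of SQL executions, here is a minimal sketch of turning a per-row INSERT loop into one bulk INSERT, assuming pymysql and a made-up present_box table:


import pymysql

connection = pymysql.connect(host="localhost", user="app",
                             password="secret", database="game")
presents = [(1, 100), (1, 101), (2, 100)]  # (user_id, item_id) pairs

# Slow: executing one INSERT per pair means one round trip per row.
# Faster: executemany batches the rows into a multi-row INSERT.
with connection.cursor() as cursor:
    cursor.executemany(
        "INSERT INTO present_box (user_id, item_id) VALUES (%s, %s)",
        presents,
    )
connection.commit()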

* :confounded: **Has it been refactored?** One of the most important refactorings is Extract Method. When a lot of code is written in a single method, nesting tends to get deep and it becomes hard for members to tell where one piece of processing ends and the next begins.
* :confounded: **Is there a branch on a negatively named boolean variable?** "So which case ends up inside the if statement?" Even the author gets confused, so the chance of introducing a defect increases.

Extreme example


if not_equip != False:
    # When exactly does execution reach here?
    ...
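Naming the flag positively and branching on it directly removes the double negation (the variable name here is hypothetical):


is_equipped = False  # positively named flag instead of not_equip

if not is_equipped:
    # Runs only when the item is not equipped; the condition reads naturally
    ...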

Code

* :rage: **Are there any misspellings?** Needless to say.
* :rage: **Is the method name appropriate?** If a method named get_xxx also performs updates, members will be confused when reading the code.
* :confounded: **Does the code follow the coding conventions?** How important this is differs by site, but readability improves when rules are followed. Python has a common convention, PEP 8, so following that is enough.
* :confounded: **Are full-width (user-visible) strings hard-coded?** If strings shown to the user are written directly in the code, the code itself has to be modified when multi-language support is added. Most frameworks have i18n support, so ask the author to use it (see also: Django internationalization). However, it takes extra man-hours, so if there is no prospect of internationalization we let it pass. (A gettext sketch follows this list.)
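For the hard-coded strings point, a minimal sketch using Django's gettext so the literal goes through the translation catalog (the message text and function name are made up):


from django.utils.translation import gettext as _

def present_expired_message():
    # The string is looked up in the active locale's translation catalog
    # instead of being hard-coded in a single language
    return _("The present has expired.")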

Transaction

* :rage: **Is the transaction scope appropriate? Does it cover exactly the range that should be committed or rolled back together?** If the scope is wrong, data is left inconsistent when an exception occurs. If the data spans separate DBs, also check whether XA transactions are being used.
* :rage: **If MySQL and Redis updates are mixed in the same request, is the case where an exception occurs considered?** For example, if Redis is updated first and an exception then occurs while updating the DB, Redis is not rolled back and its consistency with the DB breaks. (A sketch of one way to order the updates follows below.)
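For the mixed MySQL/Redis update point, one common pattern is to touch Redis only after the DB transaction has committed, so a DB failure cannot leave the cache out of sync. A sketch with hypothetical table and key names, assuming connection is a pymysql connection and redis_connect a redis-py client:


def grant_present(connection, redis_connect, user_id, item_id):
    try:
        with connection.cursor() as cursor:
            cursor.execute(
                "INSERT INTO present_box (user_id, item_id) VALUES (%s, %s)",
                (user_id, item_id),
            )
        connection.commit()
    except Exception:
        connection.rollback()
        raise

    # Only after the commit succeeded: invalidate the cached count so the
    # next read rebuilds it from MySQL
    redis_connect.delete("user:%d:present_count" % user_id)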

API

* :rage: **If a parameter is an array, is it okay for its length to be inflated illegitimately?** For example, with a bulk present-receiving API that takes the target IDs as an array, if no upper limit is placed on the array length, a malicious user can tamper with the request and send a huge array, which may drive up the server load. (A sketch of a length check follows this list.)
* :rage: **Is it okay for the same user to send the same request several times in a row?** If a request that should happen once is sent twice in a row, data integrity is lost unless there is exclusive control.
* :rage: **Is there a countermeasure when an impossible parameter is sent?** If a malicious user tampers with the parameters, data integrity breaks unless consistency checks are performed before the API does its work.
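For the array-parameter point, a minimal sketch of capping the array length before doing any work (the limit, exception class, and function are made up for illustration):


MAX_BULK_RECEIVE = 100  # hypothetical upper limit decided by the spec


class InvalidParameterError(Exception):
    pass


def receive_presents(user_id, present_ids):
    if not present_ids or len(present_ids) > MAX_BULK_RECEIVE:
        # Reject tampered requests instead of looping over an
        # arbitrarily large array sent by a malicious client
        raise InvalidParameterError("invalid number of present_ids")
    # ... normal receiving process continues here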

Comment

* :rage: **Do the comments match the processing?** Members can misread the code if a comment is wrong or if it was not updated when the processing was changed.
* :rage: **Do "fix later" comments have the TODO: or FIXME: keyword?** Such items are easily forgotten and buried, so be sure to add a keyword you can grep for later. (See the example below.)
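With consistent keywords, the leftovers can be found later with a plain grep (the comments below are made up):


# TODO: remove this compatibility branch once all clients are on v2
# FIXME: this recalculates the ranking on every request; cache the result

Something like `grep -rn 'TODO:\|FIXME:'` over the repository (or your editor's search) then lists everything that still needs attention.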
