In the following articles, I previously tried out a Relay-style implementation.

- Paging implementation by Relay style in GraphQL (Part 1: Backend)
- Paging implementation by Relay style in GraphQL (Part 2: Front end)

However, the combination of two requirements, "transition to the previous / next page" and "ascending / descending sort by any column", made the implementation more complicated than I had expected, and it left a bad aftertaste. This time, I simplified the backend implementation (compared to last time) by adding an architectural constraint: the RDB in use is assumed to be PostgreSQL.
Note that the individual languages and libraries themselves are not explained here.
Same as the previous front-end article.
The page that lists some kind of information (this time, Customer) has the following functions:

- String search filter (partial match)
- Transition to the previous / next page
- Ascending / descending sort by any of the listed columns
- Changing the number of items displayed in the list
Rather than fetching everything at the initial page display and handling previous / next page transitions in memory, the backend searches each time (fetching only as much as one page needs). When either of the following is performed, the display returns to the first page even if paging is in progress (for example, while the second page is displayed):

- Ascending / descending sort by any of the listed columns
- Changing the number of items displayed in the list
(Screenshots: 1st page, 2nd page, 3rd page)
- Part 12: "Reconsideration of paging implementation by Relay style in GraphQL (window function version)"
- Part 11: "Handling the N+1 problem with Dataloaders"
- Part 10: "GraphQL (gqlgen) error handling"
- Part 9: "Authentication and authorization in GraphQL (an Auth0 RBAC arrangement)"
- Part 8: "Video file upload example combining GraphQL / Nuxt.js (TypeScript / Vuetify / Apollo) / Golang (gqlgen) / Google Cloud Storage"
- Part 7: "Paging implementation by Relay style in GraphQL (Part 2: Front end)"
- Part 6: "Paging implementation by Relay style in GraphQL (Part 1: Backend)"
- Part 5: "Start a GraphQL server with DB connection (in Golang) in a Docker container on the local machine"
- Part 4: "GraphQL schema first for the front end with graphql-codegen"
- Part 3: "Create a GraphQL server with Go + gqlgen (DB connection using GORM)"
- Part 2: "TypeScript support for NuxtJS (with Apollo)"
- Part 1: "Create a GraphQL service with "nuxtjs/apollo" on the front end and "go + gqlgen" on the back end"
$ cat /etc/os-release
NAME="Ubuntu"
VERSION="20.04.1 LTS (Focal Fossa)"
$ go version
go version go1.15.2 linux/amd64
v0.13.0
IDE - Goland
GoLand 2020.2.3
Build #GO-202.7319.61, built on September 16, 2020
https://github.com/sky0621/study-graphql/tree/v0.10.0/try01
DB: PostgreSQL v13, run with Docker Compose. (Since it is only used locally, passwords and the like are written directly in the file.)
docker-compose.yml
version: '3'
services:
db:
restart: always
image: postgres:13-alpine
container_name: study-graphql-postgres-container
ports:
- "25432:5432"
environment:
- DATABASE_HOST=localhost
- POSTGRES_DB=study-graphql-local-db
- POSTGRES_USER=postgres
- POSTGRES_PASSWORD=yuckyjuice
- PGPASSWORD=yuckyjuice
volumes:
- ./local/data:/docker-entrypoint-initdb.d/
Create a " customer
"table in the above DB.
CREATE TABLE customer (
id bigserial NOT NULL,
name varchar(64) NOT NULL,
age int NOT NULL,
PRIMARY KEY (id)
);
The records in the customer table are shown below, together with the initial display of the screen.
Actually, the Relay part alone is not that complicated, but since this time it is combined with the "string search filter" and "ascending / descending sort by each column", the definitions are a little involved.
$ tree schema/
schema/
├── connection.graphql
├── customer.graphql
├── order.graphql
├── pagination.graphql
├── schema.graphql
└── text_filter.graphql
■schema.graphql
# Global Object Identification ...Make all data unique with a common ID
interface Node {
id: ID!
}
schema {
query: Query
}
type Query {
node(id: ID!): Node
}
■customer.graphql
extend type Query {
"Get TODO list by Relay-compliant paging-compatible search"
customerConnection(
"Paging conditions"
pageCondition: PageCondition
"Sorting conditions"
edgeOrder: EdgeOrder
"String filter condition"
filterWord: TextFilterCondition
): CustomerConnection
}
This is the query called from the front end this time. Each element is described later. Per the requirements, it takes the following arguments:

- "Paging conditions", including the conditions for transitioning to the previous / next page
- "Sorting conditions", including the sort key and the ascending / descending direction (described later)
- "String filter condition" for the string search filter (partial match)

The return value of the query is in the Relay-compliant Connection format (also described later).
CustomerConnection
"For returning results with paging"
type CustomerConnection implements Connection {
"Page information"
pageInfo: PageInfo!
"Search result list (* including cursor information)"
edges: [CustomerEdge!]!
"Total number of search results"
totalCount: Int64!
}
Holds the execution result of the customerConnection query. It follows the Relay specification (well, perhaps not strictly at that level; it treats it as a reference).
It implements the Connection interface (described later) so that it can be handled generically.
The page information (PageInfo) is described later.
CustomerEdge
"Search results (* including cursor information)"
type CustomerEdge implements Edge {
node: Customer!
cursor: Cursor!
}
It implements the Edge interface (described later) so that it can be handled generically.
It represents a single search result and carries a "cursor" used to identify the record.
The Cursor type is described later.
Customer
type Customer implements Node {
"ID"
id: ID!
"name"
name: String!
"age"
age: Int!
}
Represents one customer.
■pagination.graphql
PageCondition
A type that represents the "paging conditions" passed to the query.
"Paging conditions"
input PageCondition {
"Previous page transition condition"
backward: BackwardPagination
"Next page transition condition"
forward: ForwardPagination
"Current page number (as of the time before this paging)"
nowPageNo: Int64!
"Number of items displayed per page"
initialLimit: Int64!
}
BackwardPagination
The paging condition passed when transitioning to the "previous page".
"Previous page transition condition"
input BackwardPagination {
"Number of acquisitions"
last: Int64!
"Acquisition target identification cursor (* Records before this cursor when transitioning to the previous page are acquisition targets)"
before: Cursor!
}
ForwardPagination
The paging condition passed when transitioning to the "next page".
"Next page transition condition"
input ForwardPagination {
"Number of acquisitions"
first: Int64!
"Acquisition target identification cursor (* Records behind this cursor are the acquisition targets when transitioning to the next page)"
after: Cursor!
}
Cursor
The cursor holds a URL-safe Base64-encoded value made by concatenating the table name with the ROW_NUMBER assigned when searching the DB.
See below.
"Cursor (identifier that uniquely identifies one record)"
scalar Cursor
■order.graphql
EdgeOrder
A type that represents the "sorting conditions" passed to the query.
"Sorting conditions"
input EdgeOrder {
"Sort key item"
key: OrderKey!
"Sort direction"
direction: OrderDirection!
}
OrderKey
"""
Sort key
[Consideration process]
I wanted a generic and type-safe structure, so I first tried defining an interface and implementing it with an input or enum per feature.
However, I gave up because the spec does not allow an input to implement an interface.
An enum with inheritance would have been nice, but there is no such thing.
I also considered joining CustomerOrderKey and (if there were more) the sort keys of other features with a union,
but gave up because the spec does not allow a union to be used inside an input.
Still, I wanted to provide sorting as a common mechanism, so in the end I enumerated the per-feature enum fields in a single common input.
"""
input OrderKey {
"Sort key for user list"
customerOrderKey: CustomerOrderKey
}
OrderDirection
"Sorting direction"
enum OrderDirection {
"ascending order"
ASC
"descending order"
DESC
}
■text_filter.graphql
TextFilterCondition
A type that represents the "string filter condition" passed to the query.
"String filter condition"
input TextFilterCondition {
"Filter string"
filterWord: String!
"Matching pattern"
matchingPattern: MatchingPattern!
}
MatchingPattern
"Matching pattern type (* Add "start match" or "end match" depending on requirements)"
enum MatchingPattern {
"Partial Match"
PARTIAL_MATCH
"Perfect matching"
EXACT_MATCH
}
■connection.graphql
scalar Int64
"For returning results with paging"
interface Connection {
"Page information"
pageInfo: PageInfo!
"Result list (* including cursor information)"
edges: [Edge!]!
"Total number of search results"
totalCount: Int64!
}
"Page information"
type PageInfo {
"With or without next page"
hasNextPage: Boolean!
"With or without previous page"
hasPreviousPage: Boolean!
"1st record on the page"
startCursor: Cursor!
"Last record on the page"
endCursor: Cursor!
}
"Search result list (* including cursor information)"
interface Edge {
"Substitution is possible if the type implements the Node interface"
node: Node!
cursor: Cursor!
}
server.go is not the main subject this time, so I just present the source.
server.go
package main
import (
"log"
"net/http"
"time"
"github.com/99designs/gqlgen/graphql/handler"
"github.com/99designs/gqlgen/graphql/playground"
"github.com/go-chi/chi"
"github.com/jmoiron/sqlx"
_ "github.com/lib/pq"
"github.com/rs/cors"
"github.com/sky0621/study-graphql/try01/src/backend/graph"
"github.com/sky0621/study-graphql/try01/src/backend/graph/generated"
"github.com/volatiletech/sqlboiler/v4/boil"
)
func main() {
// MEMO: since this is only used locally, the DSN is written directly here
dsn := "host=localhost port=25432 dbname=study-graphql-local-db user=postgres password=yuckyjuice sslmode=disable"
db, err := sqlx.Connect("postgres", dsn)
if err != nil {
log.Fatal(err)
}
boil.DebugMode = true
var loc *time.Location
loc, err = time.LoadLocation("Asia/Tokyo")
if err != nil {
log.Fatal(err)
}
boil.SetLocation(loc)
r := chi.NewRouter()
r.Use(corsHandlerFunc())
r.Handle("/", playground.Handler("GraphQL playground", "/query"))
r.Handle("/query",
handler.NewDefaultServer(
generated.NewExecutableSchema(
generated.Config{
Resolvers: &graph.Resolver{
DB: db,
},
},
),
),
)
if err := http.ListenAndServe(":8080", r); err != nil {
panic(err)
}
}
func corsHandlerFunc() func(h http.Handler) http.Handler {
return cors.New(cors.Options{
AllowedOrigins: []string{"*"},
AllowedMethods: []string{"GET", "POST"},
AllowedHeaders: []string{"Accept", "Authorization", "Content-Type", "X-CSRF-Token"},
ExposedHeaders: []string{"Link"},
AllowCredentials: true,
MaxAge: 300, // Maximum value not ignored by any of major browsers
}).Handler
}
This is the source that carries the main subject of this article. Roughly, the flow is: build the search parameters (filter, paging range, sort order), run the search, and assemble the Relay-style result. Each part is explained below.
graph/customer.resolvers.go (excerpt)
func (r *queryResolver) CustomerConnection(ctx context.Context, pageCondition *model.PageCondition, edgeOrder *model.EdgeOrder, filterWord *model.TextFilterCondition) (*model.CustomerConnection, error) {
/*
*For holding various elements required for SQL construction
*/
params := searchParam{
//Table name of information acquisition destination
tableName: boiled.TableNames.Customer,
//The default sort order is ID descending order
orderKey: boiled.CustomerColumns.ID,
orderDirection: model.OrderDirectionDesc.String(),
}
/*
*Search string filter settings
* TODO: to apply the filter to multiple columns, either join the conditions with AND here or consider extending buildSearchQueryMod()
*/
filter := filterWord.MatchString()
if filter != "" {
params.baseCondition = fmt.Sprintf("%s LIKE '%s'", boiled.CustomerColumns.Name, filter)
}
/*
*Paging settings
*/
if pageCondition.IsInitialPageView() {
//Initial page view without paging
params.rowNumFrom = 1
params.rowNumTo = pageCondition.InitialLimit
} else {
//Transition instruction to the previous page
if pageCondition.Backward != nil {
key, err := decodeCustomerCursor(pageCondition.Backward.Before)
if err != nil {
log.Print(err)
return nil, err
}
params.rowNumFrom = key - pageCondition.Backward.Last
params.rowNumTo = key - 1
}
//Transition instruction to the next page
if pageCondition.Forward != nil {
key, err := decodeCustomerCursor(pageCondition.Forward.After)
if err != nil {
log.Print(err)
return nil, err
}
params.rowNumFrom = key + 1
params.rowNumTo = key + pageCondition.Forward.First
}
}
/*
*Specifying the order
*/
if edgeOrder.CustomerOrderKeyExists() {
params.orderKey = edgeOrder.Key.CustomerOrderKey.String()
params.orderDirection = edgeOrder.Direction.String()
}
/*
*Search execution
*/
var records []*CustomerWithRowNum
if err := boiled.Customers(buildSearchQueryMod(params)).Bind(ctx, r.DB, &records); err != nil {
log.Print(err)
return nil, err
}
/*
*Necessary for determining the existence of the next page and the previous page after paging
*For holding the number of results after applying the search string filter
*/
var totalCount int64 = 0
{
var err error
if filter == "" {
totalCount, err = boiled.Customers().Count(ctx, r.DB)
} else {
totalCount, err = boiled.Customers(qm.Where(boiled.CustomerColumns.Name+" LIKE ?",
filterWord.MatchString())).Count(ctx, r.DB)
}
if err != nil {
log.Print(err)
return nil, err
}
}
/*
*Relay return format
*/
result := &model.CustomerConnection{
TotalCount: totalCount,
}
/*
*Convert search results to Edge slice format
*/
var edges []*model.CustomerEdge
for _, record := range records {
edges = append(edges, &model.CustomerEdge{
Node: &model.Customer{
ID: strconv.Itoa(int(record.ID)),
Name: record.Name,
Age: record.Age,
},
Cursor: createCursor("customer", record.RowNum),
})
}
result.Edges = edges
//Calculate the total number of pages by this search from the total number of search results and the number of items displayed per page
totalPage := pageCondition.TotalPage(totalCount)
/*
*Information required for screen display and next paging on the client side
*/
pageInfo := &model.PageInfo{
HasNextPage: (totalPage - pageCondition.MoveToPageNo()) >= 1, //Is there still a page ahead after the transition?
HasPreviousPage: pageCondition.MoveToPageNo() > 1, //Is there still a previous page after the transition?
}
if len(edges) > 0 {
pageInfo.StartCursor = edges[0].Cursor
pageInfo.EndCursor = edges[len(edges)-1].Cursor
}
result.PageInfo = pageInfo
return result, nil
}
params := searchParam{
//Table name of information acquisition destination
tableName: boiled.TableNames.Customer,
//The default sort order is ID descending order
orderKey: boiled.CustomerColumns.ID,
orderDirection: model.OrderDirectionDesc.String(),
}
The entity referred to above is shown below. Basically it is overwritten with the conditions passed from the GraphQL client, but defaults are needed up front for the case where nothing is specified.
(By the time searchParam is passed to the function that builds the SQL statement, it has in fact been initialized.)
search.go
type searchParam struct {
orderKey string
orderDirection string
tableName string
baseCondition string
rowNumFrom int64
rowNumTo int64
}
/*
*Search string filter settings
* TODO: to apply the filter to multiple columns, either join the conditions with AND here or consider extending buildSearchQueryMod()
*/
filter := filterWord.MatchString()
if filter != "" {
params.baseCondition = fmt.Sprintf("%s LIKE '%s'", boiled.CustomerColumns.Name, filter)
}
The search string is constructed by the following function.
model/expansion.go
func (c *TextFilterCondition) MatchString() string {
if c == nil {
return ""
}
if c.FilterWord == "" {
return ""
}
matchStr := "%" + c.FilterWord + "%"
if c.MatchingPattern == MatchingPatternExactMatch {
matchStr = c.FilterWord
}
return matchStr
}
For the matching pattern, only exact match and partial match are provided for now, but prefix match and suffix match can be added as needed.
model/models_gen.go
// Matching pattern type (* add 'prefix match' or 'suffix match' as requirements demand)
type MatchingPattern string
const (
//Partial Match
MatchingPatternPartialMatch MatchingPattern = "PARTIAL_MATCH"
//Perfect matching
MatchingPatternExactMatch MatchingPattern = "EXACT_MATCH"
)
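For reference, here is a minimal, self-contained sketch of what MatchString returns for each pattern. The type definitions below are stand-ins that merely mirror the generated model package for illustration; the real code lives in model/expansion.go and model/models_gen.go.

```go
package main

import "fmt"

// Stand-ins for the generated model types, for illustration only.
type MatchingPattern string

const (
	MatchingPatternPartialMatch MatchingPattern = "PARTIAL_MATCH"
	MatchingPatternExactMatch   MatchingPattern = "EXACT_MATCH"
)

type TextFilterCondition struct {
	FilterWord      string
	MatchingPattern MatchingPattern
}

// Same behavior as model.TextFilterCondition.MatchString above.
func (c *TextFilterCondition) MatchString() string {
	if c == nil || c.FilterWord == "" {
		return ""
	}
	if c.MatchingPattern == MatchingPatternExactMatch {
		return c.FilterWord
	}
	return "%" + c.FilterWord + "%"
}

func main() {
	partial := &TextFilterCondition{FilterWord: "taro", MatchingPattern: MatchingPatternPartialMatch}
	exact := &TextFilterCondition{FilterWord: "taro", MatchingPattern: MatchingPatternExactMatch}
	fmt.Println(partial.MatchString()) // %taro%
	fmt.Println(exact.MatchString())   // taro
}
```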
When the initial page is displayed (that is, when the screen is opened for the first time, when the sort column is changed, or when the number of items displayed in the list is changed), the following branch applies.
if pageCondition.IsInitialPageView() {
//Initial page view without paging
params.rowNumFrom = 1
params.rowNumTo = pageCondition.InitialLimit
} else {
〜〜〜
}
Whether it is the initial page or not is judged as follows.
model/expansion.go
func (c *PageCondition) IsInitialPageView() bool {
if c == nil {
return true
}
return c.Backward == nil && c.Forward == nil
}
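As a quick illustration: a nil condition, or a condition with neither Backward nor Forward set, counts as the initial page view. The sketch below uses stand-in structs for the generated input types.

```go
package main

import "fmt"

// Stand-ins for the generated input types, for illustration only.
type ForwardPagination struct{ First int64 }
type BackwardPagination struct{ Last int64 }
type PageCondition struct {
	Backward *BackwardPagination
	Forward  *ForwardPagination
}

// Same check as PageCondition.IsInitialPageView above.
func (c *PageCondition) IsInitialPageView() bool {
	if c == nil {
		return true
	}
	return c.Backward == nil && c.Forward == nil
}

func main() {
	var none *PageCondition
	fmt.Println(none.IsInitialPageView())               // true: no condition at all
	fmt.Println((&PageCondition{}).IsInitialPageView()) // true: neither direction specified
	fmt.Println((&PageCondition{
		Forward: &ForwardPagination{First: 5},
	}).IsInitialPageView()) // false: "next page" transition requested
}
```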
Next, the path taken when transitioning to the previous page or the next page is as follows.
〜〜〜
} else {
//Transition instruction to the previous page
if pageCondition.Backward != nil {
key, err := decodeCustomerCursor(pageCondition.Backward.Before)
if err != nil {
log.Print(err)
return nil, err
}
params.rowNumFrom = key - pageCondition.Backward.Last
params.rowNumTo = key - 1
}
//Transition instruction to the next page
if pageCondition.Forward != nil {
key, err := decodeCustomerCursor(pageCondition.Forward.After)
if err != nil {
log.Print(err)
return nil, err
}
params.rowNumFrom = key + 1
params.rowNumTo = key + pageCondition.Forward.First
}
}
The important thing here is decoding the cursor.
The cursor is a URL-safe Base64 encoding of "customer" + ROW_NUMBER.
ROW_NUMBER is assumed to be a serial number assigned to the result set regardless of the search conditions (whether a filter is applied, or whether the sort is ascending or descending).
decodeCustomerCursor(~~~~) decodes it as follows.
graph/customer.go
func decodeCustomerCursor(cursor string) (int64, error) {
modelName, key, err := decodeCursor(cursor)
if err != nil {
return 0, err
}
if modelName != "customer" {
return 0, errors.New("not customer")
}
return key, nil
}
The definition of decodeCursor (~~~~)
is as follows.
graph/util.go
const cursorSeps = "#####"
func decodeCursor(cursor string) (string, int64, error) {
byteArray, err := base64.RawURLEncoding.DecodeString(cursor)
if err != nil {
return "", 0, err
}
elements := strings.SplitN(string(byteArray), cursorSeps, 2)
key, err := strconv.Atoi(elements[1])
if err != nil {
return "", 0, err
}
return elements[0], int64(key), nil
}
See the image below for how the records of the page to be displayed are obtained with the above logic.
The current state is as follows:
・The number of items displayed per page is 5
・Sorted in descending order of ID
・The second page is currently displayed
1st page, 2nd page, 3rd page
ROW_NUMBER: [1, 2, 3, 4, 5], [6, 7, 8, 9, 10], [11, 12, 13, 14, 15]
■ When instructed to transition to the "previous page"
We want records 1 to 5 on the first page.
The decoded pageCondition.Backward.Before contains the ROW_NUMBER of the first record on the second page: [6].
pageCondition.Backward.Last contains the number of items displayed per page: [5].
Therefore, the range to acquire is determined by the following calculation.
From: 6 - 5 = 1
To:   6 - 1 = 5
■ When instructed to transition to the "next page"
We want records 11 to 15 on the third page.
The decoded pageCondition.Forward.After contains the ROW_NUMBER of the last record on the second page: [10].
pageCondition.Forward.First contains the number of items displayed per page: [5].
Therefore, the range to acquire is determined by the following calculation.
From: 10 + 1 = 11
To:   10 + 5 = 15
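The same calculation in code form: a minimal sketch (backwardRange and forwardRange are hypothetical helper names, not part of the actual source) that reproduces the arithmetic the resolver performs on the decoded ROW_NUMBER.

```go
package main

import "fmt"

// key is the ROW_NUMBER decoded from the cursor; last/first is the page size.
func backwardRange(key, last int64) (from, to int64) { return key - last, key - 1 }
func forwardRange(key, first int64) (from, to int64) { return key + 1, key + first }

func main() {
	// "Previous page": cursor points at ROW_NUMBER 6 (first record of page 2), page size 5.
	from, to := backwardRange(6, 5)
	fmt.Println(from, to) // 1 5

	// "Next page": cursor points at ROW_NUMBER 10 (last record of page 2), page size 5.
	from, to = forwardRange(10, 5)
	fmt.Println(from, to) // 11 15
}
```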
if edgeOrder.CustomerOrderKeyExists() {
params.orderKey = edgeOrder.Key.CustomerOrderKey.String()
params.orderDirection = edgeOrder.Direction.String()
}
The definition of CustomerOrderKeyExists ()
is as follows.
model/expansion.go
func (o *EdgeOrder) CustomerOrderKeyExists() bool {
if o == nil {
return false
}
if o.Key == nil {
return false
}
if o.Key.CustomerOrderKey == nil {
return false
}
return o.Key.CustomerOrderKey.IsValid()
}
The key candidates for sorting related to "customer" information are as follows.
model/models_gen.go
type CustomerOrderKey string
const (
// ID
CustomerOrderKeyID CustomerOrderKey = "ID"
//username
CustomerOrderKeyName CustomerOrderKey = "NAME"
//age
CustomerOrderKeyAge CustomerOrderKey = "AGE"
)
var records []*CustomerWithRowNum
if err := boiled.Customers(buildSearchQueryMod(params)).Bind(ctx, r.DB, &records); err != nil {
log.Print(err)
return nil, err
}
First, CustomerWithRowNum, the element type of the records slice, has the following structure.
boiled.Customer is a struct automatically generated by SQLBoiler from the DB table definition.
It is wrapped so that the ROW_NUMBER returned as row_num by the SQL statement is held in RowNum.
This way, when receiving the result of the SQL statement, there is no need to hand-write a struct matching the table definition; you only add the extra elements you need.
graph/customer.go
type CustomerWithRowNum struct {
RowNum int64 `boil:"row_num"`
boiled.Customer `boil:",bind"`
}
Next, the definition of buildSearchQueryMod (params)
is as follows.
graph/search.go
// TODO: built roughly for now. How generic to make it (e.g. supporting multiple tables) depends on the requirements.
func buildSearchQueryMod(p searchParam) qm.QueryMod {
if p.baseCondition == "" {
p.baseCondition = "true"
}
q := `
SELECT row_num, * FROM (
SELECT ROW_NUMBER() OVER (ORDER BY %s %s) AS row_num, *
FROM %s
WHERE %s
) AS tmp
WHERE row_num BETWEEN %d AND %d
`
sql := fmt.Sprintf(q,
p.orderKey, p.orderDirection,
p.tableName,
p.baseCondition,
p.rowNumFrom, p.rowNumTo,
)
return qm.SQL(sql)
}
The PostgreSQL window function (ROW_NUMBER()) assigns a serial number to the result of applying the specified string search filter and sort.
From that result, the desired range of ROW_NUMBER is extracted.
Now, regardless of which column is sorted on, or whether the order is ascending or descending, both "previous page" and "next page" can be fetched by the same mechanism: simply specify a range of ROW_NUMBER.
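To make the generated SQL concrete, here is a standalone sketch that fills the same template with hypothetical values (filter '%taro%', sort by name ascending, rows 6 to 10 of the customer table). The printed statement is the shape that buildSearchQueryMod hands to qm.SQL.

```go
package main

import "fmt"

func main() {
	q := `
SELECT row_num, * FROM (
    SELECT ROW_NUMBER() OVER (ORDER BY %s %s) AS row_num, *
    FROM %s
    WHERE %s
) AS tmp
WHERE row_num BETWEEN %d AND %d
`
	// Hypothetical parameter values for illustration.
	sql := fmt.Sprintf(q, "name", "ASC", "customer", "name LIKE '%taro%'", 6, 10)
	fmt.Println(sql)
	// SELECT row_num, * FROM (
	//     SELECT ROW_NUMBER() OVER (ORDER BY name ASC) AS row_num, *
	//     FROM customer
	//     WHERE name LIKE '%taro%'
	// ) AS tmp
	// WHERE row_num BETWEEN 6 AND 10
}
```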
As stated in the comment, the number of results of the search narrowed by the string filter is also acquired. It is used to determine whether a previous (next) page still exists after the page transition. (By returning this information, the front end can control whether the [Prev] and [Next] buttons are enabled or disabled in the UI.)
/*
*Necessary for determining the existence of the next page and the previous page after paging
*For holding the number of results after applying the search string filter
*/
var totalCount int64 = 0
{
var err error
if filter == "" {
totalCount, err = boiled.Customers().Count(ctx, r.DB)
} else {
totalCount, err = boiled.Customers(qm.Where(boiled.CustomerColumns.Name+" LIKE ?",
filterWord.MatchString())).Count(ctx, r.DB)
}
if err != nil {
log.Print(err)
return nil, err
}
}
With SQLBoiler, simply writing boiled.Customers().Count(ctx, r.DB) against the auto-generated code gives the total number of records in the customer table.
If you want to add a search condition, just write it in the xxxx part of boiled.Customers(xxxx), as in the source above, using the query helpers SQLBoiler provides.
In the return format required by Relay, only "edges" and "pageInfo" are strictly needed, but UI designs usually want the total count as well, so totalCount is also defined.
https://relay.dev/graphql/connections.htm#sec-Connection-Types
/*
*Relay return format
*/
result := &model.CustomerConnection{
TotalCount: totalCount,
}
edges
Cursor decoding was described above; encoding happens here.
A cursor is generated from the ROW_NUMBER of each search result.
By returning it to the front end, paging can be realized on the front-end side simply by attaching the cursor to the parameters of the next page transition (without explicitly specifying the acquisition range).
/*
*Convert search results to Edge slice format
*/
var edges []*model.CustomerEdge
for _, record := range records {
edges = append(edges, &model.CustomerEdge{
Node: &model.Customer{
ID: strconv.Itoa(int(record.ID)),
Name: record.Name,
Age: record.Age,
},
Cursor: createCursor("customer", record.RowNum),
})
}
result.Edges = edges
CustomerEdge
has the following structure.
model/models_gen.go
//Search result list (* including cursor information)
type CustomerEdge struct {
Node *Customer `json:"node"`
Cursor string `json:"cursor"`
}
The definition of createCursor (modelName, key)
is as follows.
graph/util.go
const cursorSeps = "#####"
func createCursor(modelName string, key int64) string {
return base64.RawURLEncoding.EncodeToString([]byte(fmt.Sprintf("%s%s%d", modelName, cursorSeps, key)))
}
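Putting createCursor and decodeCursor together, the round trip looks like the following. This is a self-contained sketch that reproduces the same logic (with an extra length check on the split, which the original omits).

```go
package main

import (
	"encoding/base64"
	"fmt"
	"strconv"
	"strings"
)

const cursorSeps = "#####"

// Same encoding as createCursor above.
func createCursor(modelName string, key int64) string {
	return base64.RawURLEncoding.EncodeToString([]byte(fmt.Sprintf("%s%s%d", modelName, cursorSeps, key)))
}

// Same decoding as decodeCursor above, plus a guard on the split result.
func decodeCursor(cursor string) (string, int64, error) {
	b, err := base64.RawURLEncoding.DecodeString(cursor)
	if err != nil {
		return "", 0, err
	}
	elements := strings.SplitN(string(b), cursorSeps, 2)
	if len(elements) != 2 {
		return "", 0, fmt.Errorf("unexpected cursor format")
	}
	key, err := strconv.Atoi(elements[1])
	if err != nil {
		return "", 0, err
	}
	return elements[0], int64(key), nil
}

func main() {
	c := createCursor("customer", 6)
	fmt.Println(c) // Y3VzdG9tZXIjIyMjIzY  (URL-safe Base64 of "customer#####6", no padding)

	name, key, err := decodeCursor(c)
	fmt.Println(name, key, err) // customer 6 <nil>
}
```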
pageInfo
Because this information is calculated and returned here on the backend, processing on the front end becomes lighter. The following is required as page information.
model/models_gen.go
//Page information
type PageInfo struct {
//With or without next page
HasNextPage bool `json:"hasNextPage"`
//With or without previous page
HasPreviousPage bool `json:"hasPreviousPage"`
//1st record on the page
StartCursor string `json:"startCursor"`
//Last record on the page
EndCursor string `json:"endCursor"`
}
First, calculate the "total number of pages" to determine "presence or absence of next page".
//Calculate the total number of pages by this search from the total number of search results and the number of items displayed per page
totalPage := pageCondition.TotalPage(totalCount)
The definition of TotalPage (~~)
is as follows.
model/expansion.go
func (c *PageCondition) TotalPage(totalCount int64) int64 {
if c == nil {
return 0
}
var targetCount int64 = 0
if c.Backward == nil && c.Forward == nil {
targetCount = c.InitialLimit
} else {
if c.Backward != nil {
targetCount = c.Backward.Last
}
if c.Forward != nil {
targetCount = c.Forward.First
}
}
return int64(math.Ceil(float64(totalCount) / float64(targetCount)))
}
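For example, with hypothetical numbers: 13 records match the filter and 5 items are displayed per page, so the total is ceil(13 / 5) = 3 pages.

```go
package main

import (
	"fmt"
	"math"
)

func main() {
	// Hypothetical values: 13 matching records, 5 items displayed per page.
	var totalCount int64 = 13
	var perPage int64 = 5
	totalPage := int64(math.Ceil(float64(totalCount) / float64(perPage)))
	fmt.Println(totalPage) // 3
}
```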
Using the above, "presence or absence of next page" can be determined as follows.
/*
*Information required for screen display and next paging on the client side
*/
pageInfo := &model.PageInfo{
HasNextPage: (totalPage - pageCondition.MoveToPageNo()) >= 1, //Is there still a page ahead after the transition?
HasPreviousPage: pageCondition.MoveToPageNo() > 1, //Is there still a previous page after the transition?
}
The definition of MoveToPageNo ()
, which is also used in the above "presence / absence of previous page" judgment, is as follows.
model/expansion.go
func (c *PageCondition) MoveToPageNo() int64 {
if c == nil {
return 1 // fall back to the initial page (unexpected case)
}
if c.Backward == nil && c.Forward == nil {
return c.NowPageNo //Because it does not transition to the front or the back
}
if c.Backward != nil {
if c.NowPageNo <= 2 {
return 1
}
return c.NowPageNo - 1
}
if c.Forward != nil {
return c.NowPageNo + 1
}
return 1 // fall back to the initial page (unexpected case)
}
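Putting the pieces together, here is a self-contained sketch (again with stand-in structs for the generated types) of how the destination page number and the hasNextPage / hasPreviousPage flags come out when, hypothetically, there are 3 pages in total and the user moves forward from page 2.

```go
package main

import "fmt"

// Stand-ins for the generated input types, for illustration only.
type ForwardPagination struct{ First int64 }
type BackwardPagination struct{ Last int64 }
type PageCondition struct {
	Backward  *BackwardPagination
	Forward   *ForwardPagination
	NowPageNo int64
}

// Same branching as PageCondition.MoveToPageNo above.
func (c *PageCondition) MoveToPageNo() int64 {
	if c == nil {
		return 1
	}
	if c.Backward == nil && c.Forward == nil {
		return c.NowPageNo
	}
	if c.Backward != nil {
		if c.NowPageNo <= 2 {
			return 1
		}
		return c.NowPageNo - 1
	}
	return c.NowPageNo + 1
}

func main() {
	// Hypothetical case: 3 pages in total, currently on page 2, moving forward.
	totalPage := int64(3)
	pc := &PageCondition{Forward: &ForwardPagination{First: 5}, NowPageNo: 2}

	moveTo := pc.MoveToPageNo()
	fmt.Println(moveTo)                // 3
	fmt.Println(totalPage-moveTo >= 1) // hasNextPage: false (page 3 is the last page)
	fmt.Println(moveTo > 1)            // hasPreviousPage: true
}
```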
After that, the cursors of the first and last records of the page displayed by this search are extracted.
if len(edges) > 0 {
pageInfo.StartCursor = edges[0].Cursor
pageInfo.EndCursor = edges[len(edges)-1].Cursor
}
result.PageInfo = pageInfo
At the next page transition on the front end, this cursor is used as "StartCursor" when transitioning to the "previous page" and as "EndCursor" when transitioning to the "next page".
PageCondition
- Backward
  - Before ・・・ StartCursor
- Forward
  - After ・・・ EndCursor
The source is below. https://github.com/sky0621/study-graphql/tree/v0.10.0/try01/src/frontend
This is the same structure as the article I wrote before, so the explanation is omitted. Please refer to the following. Paging implementation by Relay style in GraphQL (Part 2: Front end)
GraphQL response data on page 3
"With this, paging (and sorting by element and combination of character string search filters) on the page called "Customer List" has been realized for the time being. Even as a backend implementation, SQL is uniformly in the same format without having the value of the sorting element in the Cursor as in previous. I can now hit. However, mass-producing this for each function is too much for the boiler plate, so it is necessary to create a template as much as possible when actually using it. Also, I write TODO in the comments in the source, but there are various issues.