Commit e9d1b54

Merge pull request #260 from hack-a-chain-software/tests-refactor-3
Tests refactor 3
2 parents: 0254307 + b6eff53

10 files changed: +1295 −20 lines

indexer/README.md

Lines changed: 33 additions & 19 deletions
````diff
@@ -151,42 +151,56 @@ yarn dev:hot:graphql
 The following commands will aid in the maintenance of the indexer.
 
 ```bash
-# Identifying Missing Blocks - Scan for and store any blocks that were missed.
+# Identifying Missing Blocks - Scan for and store any blocks that were missed in the streaming process.
 yarn dev:missing
 
-# Processing Headers - Start the header processing from S3 to the database.
-yarn dev:headers
-
-# Processing Payloads - Start the payload processing from S3 to the database.
-yarn dev:payloads
-
 # Update GraphQL - Performs a hot reload (without building)
 yarn dev:hot:graphql
 
 # Generate GraphQL types - Generate the GraphQL types from the schema.
 yarn graphql:generate-types
+```
+
+## 6. Running Tests
 
-# Run the pagination tests offline
-yarn test
+The Kadena Indexer project includes several types of tests to ensure the functionality and reliability of the codebase. Below are the instructions to run these tests:
+
+### 6.1. Unit Tests
+
+Unit tests are designed to test individual components or functions in isolation. To run the unit tests, use the following command:
+
+```bash
+yarn test:unit
 ```
 
-### 5.3. Local Workflow Testing
+This command will execute all the unit tests located in the `tests/unit` directory.
 
-**NOTE:** This is not being actively maintained at the moment.
+### 6.2. Integration Tests
 
-Install act for local testing:
+Integration tests are used to test the queries and subscriptions of the GraphQL API. To run the integration tests, use the following command:
 
 ```bash
-# For MacOS
-brew install act
+yarn test:integration
+```
+
+This command will execute the integration tests located in the `tests/integration` directory, using the environment variables specified in the `.env.testing` file.
+
+### 6.3. Specific Integration File Test
+
+File tests are executed using the same environment as the integration tests. To run a specific integration test (e.g. events), use the following command:
 
-# For Linux
-sudo apt-get update
-sudo apt-get install act
+```bash
+yarn test:file tests/integration/events.query.test.ts
 ```
 
-Then run the indexer workflow by using the following command:
+This command will run tests using the environment variables from the `.env.testing` file.
+
+### 6.4. Smoke Tests
+
+Smoke tests are a subset of integration tests that verify the basic functionality of the application. To run the smoke tests, use the following command:
 
 ```bash
-yarn run-indexer-workflow
+yarn test:smoke
 ```
+
+This command will start the necessary services using Docker Compose, wait for a few seconds to ensure they are up and running, execute the smoke tests located in `tests/docker/smoke.test.ts`, and then shut down the services.
````

indexer/package.json

Lines changed: 1 addition & 1 deletion
```diff
@@ -77,7 +77,7 @@
     "test:unit": "jest tests/unit/*.test.ts",
     "test:integration": "dotenv -e .env.testing jest tests/integration/*.test.ts",
     "test:file": "dotenv -e .env.testing jest",
-    "test:smoke": "yarn compose:up && sleep 5 && jest tests/integration/smoke.test.ts && yarn compose:down",
+    "test:smoke": "yarn compose:up && sleep 5 && jest tests/docker/smoke.test.ts && yarn compose:down",
     "compose:up": "docker-compose -f docker-compose.yml up -d",
     "compose:down": "docker-compose -f docker-compose.yml down"
   }
```
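One design note on the updated `test:smoke` script: because the steps are joined with `&&`, `yarn compose:down` never runs when jest fails, so the Docker services are left running. A common alternative is an EXIT trap. The sketch below is a runnable illustration of that pattern with stub `echo`s standing in for the real yarn scripts; every name in it is illustrative, not part of the commit:

```shell
# Teardown via an EXIT trap: it fires whether the tests pass or fail,
# unlike the last link of an `&&` chain, which is skipped on failure.
run_smoke_with_teardown() (
  trap 'echo "compose down"' EXIT   # runs when this subshell exits, for any reason
  echo "compose up"
  echo "running tests"
  false                             # simulate a failing jest run
)
out=$(run_smoke_with_teardown || true)
echo "$out"
```

In a real wrapper script this would be something like `trap 'yarn compose:down' EXIT` before running jest; the `&&` form used in the commit is simpler but leaks services when the test run fails.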
Lines changed: 90 additions & 0 deletions
@@ -0,0 +1,90 @@

```typescript
import { gql } from 'graphql-request';

export const getCompletedBlockHeightsQuery = (): string => {
  const queryGql = gql`
    query {
      completedBlockHeights {
        edges {
          cursor
          node {
            chainId
            creationTime
            difficulty
            epoch
            events {
              totalCount
              pageInfo {
                endCursor
                hasNextPage
                hasPreviousPage
                startCursor
              }
              edges {
                cursor
                node {
                  chainId
                  height
                  id
                  moduleName
                  name
                  orderIndex
                  parameters
                  parameterText
                  qualifiedName
                  requestKey
                }
              }
            }
            flags
            hash
            height
            id
            minerAccount {
              accountName
              balance
              chainId
              fungibleName
              guard {
                ... on KeysetGuard {
                  keys
                  predicate
                  raw
                }
              }
              id
            }
            neighbors {
              chainId
              hash
            }
            nonce
            parent {
              chainId
            }
            payloadHash
            powHash
            target
            transactions {
              totalCount
              pageInfo {
                endCursor
                hasNextPage
                hasPreviousPage
                startCursor
              }
              edges {
                cursor
                node {
                  id
                }
              }
            }
            weight
          }
        }
      }
    }
  `;

  return queryGql;
};
```
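Since `gql` from `graphql-request` is essentially a passthrough template tag (it exists mainly so editors and linters recognize the string as GraphQL), builders like the one above can be unit-tested as plain string functions with no server. A minimal dependency-free sketch, using a local stand-in for `gql` and a deliberately abbreviated selection set:

```typescript
// Local stand-in for graphql-request's `gql` tag: it only joins the template
// parts (an assumption that holds for plain string assembly like this).
const gql = (strings: TemplateStringsArray, ...values: unknown[]): string =>
  strings.reduce((acc, s, i) => acc + s + (i < values.length ? String(values[i]) : ''), '');

// Abbreviated version of the builder above, for illustration only.
const getCompletedBlockHeightsQuery = (): string => gql`
  query {
    completedBlockHeights {
      edges {
        cursor
        node {
          chainId
          height
        }
      }
    }
  }
`;

// Cheap structural checks that need no GraphQL endpoint:
const query = getCompletedBlockHeightsQuery();
const opens = (query.match(/\{/g) ?? []).length;
const closes = (query.match(/\}/g) ?? []).length;
console.log(opens === closes, query.includes('completedBlockHeights'));
// → true true
```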
Lines changed: 31 additions & 0 deletions
@@ -0,0 +1,31 @@

```typescript
import { gql } from 'graphql-request';

export const getEventsSubscriptionQuery = (params: {
  qualifiedEventName: string;
  chainId?: string;
  minimumDepth?: number;
}): string => {
  const { qualifiedEventName, chainId, minimumDepth } = params;

  const queryGql = gql`
    subscription {
      events(
        qualifiedEventName: "${qualifiedEventName}"
        ${chainId ? `chainId: "${chainId}"` : ''}
        ${minimumDepth ? `minimumDepth: ${minimumDepth}` : ''}
      ) {
        id
        chainId
        height
        moduleName
        name
        orderIndex
        parameters
        qualifiedName
        requestKey
      }
    }
  `;

  return queryGql;
};
```
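One subtlety worth noting in the builder above: the `minimumDepth ? … : ''` check also drops a legitimate value of `0`, since `0` is falsy in JavaScript. A minimal sketch of the same argument assembly with an explicit `undefined` check instead (the helper name is illustrative, not part of the commit):

```typescript
// Assembles the argument list for the events subscription, keeping
// explicitly-passed falsy values such as minimumDepth: 0.
const buildEventsArgs = (params: {
  qualifiedEventName: string;
  chainId?: string;
  minimumDepth?: number;
}): string => {
  const parts = [`qualifiedEventName: "${params.qualifiedEventName}"`];
  if (params.chainId !== undefined) parts.push(`chainId: "${params.chainId}"`);
  // `!== undefined` keeps minimumDepth: 0, which a truthiness check would drop.
  if (params.minimumDepth !== undefined) parts.push(`minimumDepth: ${params.minimumDepth}`);
  return parts.join(', ');
};

console.log(buildEventsArgs({ qualifiedEventName: 'coin.TRANSFER', minimumDepth: 0 }));
// → qualifiedEventName: "coin.TRANSFER", minimumDepth: 0
```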
Lines changed: 98 additions & 0 deletions
@@ -0,0 +1,98 @@

```typescript
import { gql } from 'graphql-request';

export const getNewBlocksFromDepthSubscriptionQuery = (params: {
  minimumDepth?: number;
  chainIds?: string[];
}): string => {
  const query = Object.entries(params)
    .filter(([_, value]) => value !== undefined && value !== null)
    .map(([key, value]) => {
      if (Array.isArray(value)) {
        return `${key}: [${value.map(v => `"${v}"`).join(', ')}]`;
      }
      return `${key}: ${typeof value === 'string' ? `"${value}"` : value}`;
    })
    .join(', ');

  const queryGql = gql`
    subscription {
      newBlocksFromDepth${query ? `(${query})` : ''} {
        chainId
        creationTime
        difficulty
        epoch
        events {
          totalCount
          pageInfo {
            endCursor
            hasNextPage
            hasPreviousPage
            startCursor
          }
          edges {
            cursor
            node {
              chainId
              height
              id
              moduleName
              name
              orderIndex
              parameters
              parameterText
              qualifiedName
              requestKey
            }
          }
        }
        flags
        hash
        height
        id
        minerAccount {
          accountName
          balance
          chainId
          fungibleName
          guard {
            ... on KeysetGuard {
              keys
              predicate
              raw
            }
          }
          id
        }
        neighbors {
          chainId
          hash
        }
        nonce
        parent {
          chainId
        }
        payloadHash
        powHash
        target
        transactions {
          totalCount
          pageInfo {
            endCursor
            hasNextPage
            hasPreviousPage
            startCursor
          }
          edges {
            cursor
            node {
              id
            }
          }
        }
        weight
      }
    }
  `;

  return queryGql;
};
```
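The argument-serialization step in this builder can be exercised on its own: arrays become `["a", "b"]`, strings are quoted, numbers pass through, and an empty result omits the argument list entirely. A dependency-free sketch of the same `Object.entries`/`filter`/`map` technique (the standalone helper name is illustrative):

```typescript
// Mirrors the pipeline used in getNewBlocksFromDepthSubscriptionQuery:
// drop undefined/null entries, quote strings, stringify arrays of strings.
const serializeArgs = (
  params: Record<string, string | number | string[] | undefined | null>,
): string =>
  Object.entries(params)
    .filter(([, value]) => value !== undefined && value !== null)
    .map(([key, value]) => {
      if (Array.isArray(value)) {
        return `${key}: [${value.map((v) => `"${v}"`).join(', ')}]`;
      }
      return `${key}: ${typeof value === 'string' ? `"${value}"` : value}`;
    })
    .join(', ');

console.log(serializeArgs({ minimumDepth: 3, chainIds: ['0', '1'] }));
// → minimumDepth: 3, chainIds: ["0", "1"]
console.log(JSON.stringify(serializeArgs({})));
// → "" (empty string, so the subscription is emitted without an argument list)
```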
