Commit c5635d5

[migration] Run migration for latest content
1 parent eaeb6f0 commit c5635d5

File tree

12 files changed: +547 -101 lines changed

src/content/docs/build/indexer/indexer-sdk/documentation/create-processor.mdx

Lines changed: 3 additions & 3 deletions
@@ -6,7 +6,7 @@ This guide will walk you through setting up the basic template for a new process
 
 ## Pre-requisites
 
-You've already set up your environment and have the Indexer SDK `aptos-indexer-sdk` and `aptos-indexer-sdk-server-framework` installed.
+You've already set up your environment and have the Indexer SDK `aptos-indexer-sdk` installed.
 If you haven't, follow the [Indexer SDK installation guide](/build/indexer/indexer-sdk/documentation/setup).
 
 ## Overview
@@ -25,9 +25,9 @@ The next section goes through each of these pieces more explicitly and provides
 The `IndexerProcessorConfig` defines the base configuration for all processors that you'll be running.
 It should include configuration for things that are shared across multiple processors, like the database configuration and [Transaction Stream](/build/indexer/txn-stream) configuration.
 
-[`ServerArgs`](https://github.com/aptos-labs/aptos-indexer-processor-sdk/blob/main/aptos-indexer-processors-sdk/sdk-server-framework/src/lib.rs#L26) parses a `config.yaml` file and bootstraps a server with all the common pieces to run a processor.
+`ServerArgs` parses a `config.yaml` file and bootstraps a server with all the common pieces to run a processor.
 
-To setup the configuration for your processor and make it work with `ServerArgs`, you'll need to define a `IndexerProcessorConfig` that implements the [`RunnableConfig`](https://github.com/aptos-labs/aptos-indexer-processor-sdk/blob/main/aptos-indexer-processors-sdk/sdk-server-framework/src/lib.rs#L102) trait.
+To set up the configuration for your processor and make it work with `ServerArgs`, you'll need to define an `IndexerProcessorConfig` that implements the `RunnableConfig` trait.
 It also triggers a run method, which can be invoked in `main.rs`.
 
 For basic cases, you can copy the [`IndexerProcessorConfig` from the `aptos-indexer-processor-example`](https://github.com/aptos-labs/aptos-indexer-processor-example/blob/main/aptos-indexer-processor-example/src/config/indexer_processor_config.rs) repository and modify it to fit your needs.
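
As a rough sketch of that shape, the config struct and trait implementation below illustrate the idea. The field names mirror the example `config.yaml` from the quickstart, and the `RunnableConfig` method signatures are assumptions based on the description above, not the SDK's verbatim API; check the example repository for the real definitions.

```rust
use anyhow::Result;
use async_trait::async_trait;
use serde::Deserialize;

// Placeholder sub-configs for illustration; the real types ship with the SDK.
#[derive(Clone, Debug, Deserialize)]
pub struct TransactionStreamConfig {
    pub indexer_grpc_data_service_address: String,
    pub auth_token: String,
    pub starting_version: Option<u64>,
}

#[derive(Clone, Debug, Deserialize)]
pub struct PostgresConfig {
    pub connection_string: String,
}

// Shared configuration that `ServerArgs` deserializes from config.yaml.
#[derive(Clone, Debug, Deserialize)]
pub struct IndexerProcessorConfig {
    pub transaction_stream_config: TransactionStreamConfig,
    pub postgres_config: PostgresConfig,
}

// `RunnableConfig` is provided by the SDK's server framework; the assumed
// shape here is an async `run` method plus a server-name getter.
#[async_trait]
impl RunnableConfig for IndexerProcessorConfig {
    async fn run(&self) -> Result<()> {
        // Build the processor from this config and run it.
        Ok(())
    }

    fn get_server_name(&self) -> String {
        "events_processor".to_string()
    }
}
```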

src/content/docs/build/indexer/indexer-sdk/documentation/setup.mdx

Lines changed: 6 additions & 11 deletions
@@ -7,21 +7,16 @@ The quickstart guide provides a template processor and includes all of this setup
 
 If you're migrating an existing processor to the Indexer SDK, follow the steps below.
 
-The Indexer SDK provides several Rust crates:
-
-1. [`aptos-indexer-processor-sdk`](https://github.com/aptos-labs/aptos-indexer-processor-sdk/tree/main/aptos-indexer-processors-sdk/sdk) - The core SDK that provides the building blocks for writing a processor.
-2. [`aptos-indexer-processor-sdk-server-framework`](https://github.com/aptos-labs/aptos-indexer-processor-sdk/tree/main/aptos-indexer-processors-sdk/sdk-server-framework) - A server framework for creating a server that runs the processor and includes health checks and metrics logging probes.
-   If you're setting up a server to host your processor, you will need to include this crate.
-3. [`aptos-indexer-testing-framework`](https://github.com/aptos-labs/aptos-indexer-processor-sdk/tree/main/aptos-indexer-processors-sdk/testing-framework) - An e2e testing framework for testing processors.
-   If you want to write tests for your processor, you will need to include this crate.
-
-Depending on your use case, you can import the crates to your `Cargo.toml`.
+Add `aptos-indexer-processor-sdk` to your `Cargo.toml`.
 
 ```toml
 [dependencies]
 aptos-indexer-processor-sdk = { git = "https://github.com/aptos-labs/aptos-indexer-processor-sdk.git", rev = "aptos-indexer-processor-sdk-v1.0.0" }
-aptos-indexer-processor-sdk-server-framework = { git = "https://github.com/aptos-labs/aptos-indexer-processor-sdk.git", rev = "aptos-indexer-processor-sdk-v1.0.0" }
-aptos-indexer-testing-framework = { git = "https://github.com/aptos-labs/aptos-indexer-processor-sdk.git", rev = "aptos-indexer-processor-sdk-v1.0.0" }
 ```
 
+`aptos-indexer-processor-sdk` includes the following features:
+
+1. `postgres_full` - Interface layer to integrate Postgres with your processor.
+2. `testing_framework` - An e2e testing framework for testing processors. If you want to write tests for your processor, add this feature to the crate.
+
 {/* <!-- Add list of SDK releases once we have that --> */}
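
Combining the dependency line above with those features, the `Cargo.toml` entry would presumably look like this (standard Cargo feature syntax; verify the feature names against the SDK's manifest):

```toml
[dependencies]
# Feature names taken from the list above; confirm against the SDK's Cargo.toml.
aptos-indexer-processor-sdk = { git = "https://github.com/aptos-labs/aptos-indexer-processor-sdk.git", rev = "aptos-indexer-processor-sdk-v1.0.0", features = ["postgres_full", "testing_framework"] }
```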

src/content/docs/build/indexer/indexer-sdk/quickstart.mdx

Lines changed: 139 additions & 62 deletions
@@ -8,17 +8,17 @@ This guide will walk you through setting up and running a Rust processor to inde
 We provide a template processor that you can customize to index events from your custom contracts.
 By the end of the guide, you should have a basic understanding of how a processor works and be able to customize the processor for your indexing needs.
 
-## Getting started
+## Get started
 
 To get started, clone
-the [aptos-indexer-processors-example](https://github.com/aptos-labs/aptos-indexer-processor-example/tree/main) repo.
+the [aptos-indexer-processor-sdk](https://github.com/aptos-labs/aptos-indexer-processor-sdk) repo.
 
 ```text
 # HTTPS
-https://github.com/aptos-labs/aptos-indexer-processor-example.git
+https://github.com/aptos-labs/aptos-indexer-processor-sdk.git
 
 # SSH
-git@github.com:aptos-labs/aptos-indexer-processor-example.git
+git@github.com:aptos-labs/aptos-indexer-processor-sdk.git
 ```
 
 Processors consume transactions from the Transaction Stream Service. In order to use the Labs-Hosted Transaction Stream
@@ -63,7 +63,7 @@ export CPPFLAGS="-I/opt/homebrew/opt/libpq/include"
 - To easily view your database data, consider using a GUI like [DBeaver](https://dbeaver.io/)
   _recommended_, [pgAdmin](https://www.pgadmin.org/), or [Postico](https://eggerapps.at/postico2/).
 
-## Setting up your environment
+## Set up your environment
 
 Make sure to start the `postgresql` service:
 
@@ -79,27 +79,25 @@ For mac, if you’re using brew, start it up with:
 brew services start postgresql
 ```
 
-## **Configuring your processor**
+## **Configure your processor**
 
 Now let’s set up the configuration details for the actual indexer processor we’re going to use.
 
 ### **Set up your config.yaml file**
 
-In the example repo, there is a sample config.yaml file that should look something like this:
+In the example folder, there is a sample config.yaml file that should look something like this:
 
 ```yaml
+# This is a template yaml for the processor
 health_check_port: 8085
 server_config:
-  processor_config:
-    type: "events_processor"
   transaction_stream_config:
-    indexer_grpc_data_service_address: "https://grpc.testnet.aptoslabs.com:443"
-    starting_version: 0
-    # request_ending_version: 10000
+    indexer_grpc_data_service_address: "https://grpc.mainnet.aptoslabs.com:443"
     auth_token: "AUTH_TOKEN"
     request_name_header: "events-processor"
-  db_config:
-    postgres_connection_string: postgresql://postgres:@localhost:5432/example
+    starting_version: 0
+  postgres_config:
+    connection_string: postgresql://postgres:@localhost:5432/example
 ```
 
 Open the `config.yaml` file and update these fields:
@@ -137,16 +135,36 @@ indexer_grpc_data_service_address: grpc.testnet.aptoslabs.com:443
 indexer_grpc_data_service_address: grpc.mainnet.aptoslabs.com:443
 ```
 
-## Explanation of the processor
+In this tutorial, we are using `testnet`, so update the `indexer_grpc_data_service_address` to `grpc.testnet.aptoslabs.com:443`.
+
+## Create the events processor
 
 At a high level, each processor is responsible for receiving a stream of transactions, parsing and transforming the
 relevant data, and storing the data into a database.
 
-### Defining the events database model
+### Define the database schema
+
+In `src/db/migrations`, you will see the events migration, which defines the database schema that will be used to store the events.
+
+```sql up.sql
+CREATE TABLE events (
+  sequence_number BIGINT NOT NULL,
+  creation_number BIGINT NOT NULL,
+  account_address VARCHAR(66) NOT NULL,
+  transaction_version BIGINT NOT NULL,
+  transaction_block_height BIGINT NOT NULL,
+  type TEXT NOT NULL,
+  data JSONB NOT NULL,
+  inserted_at TIMESTAMP NOT NULL DEFAULT NOW(),
+  event_index BIGINT NOT NULL,
+  indexed_type VARCHAR(300) NOT NULL,
+  PRIMARY KEY (transaction_version, event_index)
+);
+```
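
For reference, a diesel migration pairs this `up.sql` with a `down.sql` that reverts it; for the table above, that would plausibly be just:

```sql down.sql
-- Revert the up.sql migration above.
DROP TABLE IF EXISTS events;
```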
 
-In `src/db/postgres/schema.rs` , you will see events table which has the following schema:
+When you apply migrations, diesel will re-generate the `schema.rs` file, which looks like this:
 
-```rust
+```rust schema.rs
 diesel::table! {
     events (transaction_version, event_index) {
         sequence_number -> Int8,
@@ -166,43 +184,46 @@ diesel::table! {
 }
 ```
 
-The events schema represents the data that this processor is indexing. This [`schema.rs`](http://schema.rs) file is an
-autogenerated from the database migrations. In the next section, we’ll go over how these migrations are run.
-
-There are two other important tables:
+In `schema.rs`, you'll see two other important tables:
 
 - `ledger_infos` which tracks the chain id of the ledger being indexed
 - `processor_status` which tracks the `last_success_version` of the processor
 
-### Defining the events processor
-
-The file `src/processors/events/events_processor.rs` contains the code which defines the events processor. Inside of
-`run_processor` there are a few key components:
-
-1. First, we setup the processor:
-   1. `run_migrations` automatically runs the database migrations defined in `src/db/postgres/migrations`
-   2. We merge the starting version in `config.yaml` and the `processor_status.last_success_version` in the database
-      to get the final starting version for the processor. This allows us to restart the processor from a previously
-      processed version.
-   3. We check the `ledger_infos.chain_id` to make sure the processor is indexing the correct chain
-2. Next, we instantiate the processor steps. Here we explain the purpose of each step:
-   1. `TransactionStreamStep` provides a stream of transactions to the processor
-   2. `EventsExtractor` extracts events data from each transaction
-   3. `EventsStorer` inserts the extracted events into the `events` table
-   4. `LatestVersionTracker` keeps track of the latest processed version and updates the `processor_status` table
-3. Lastly, we connect the processor steps together.
-   1. `ProcessorBuilder::new_with_inputless_first_step` takes in the first step of the processor. In most cases, the
-      first step is a `TransactionStreamStep`.
-   2. The rest of the steps are connected with `connect_to`. `connect_to` creates a channel between the steps so the
-      output of one step becomes the input of the next step.
-   3. And then we end the builder with `end_and_return_output_receiver`.
-
-# Running the processor
+### Define the processing logic
+
+The file `src/main.rs` contains the code which defines the events processor. The key components are:
+
+1. `insert_events_query` defines the diesel query to insert events into the database.
+   ```rust
+   fn insert_events_query(
+       items_to_insert: Vec<EventModel>,
+   ) -> impl QueryFragment<Pg> + diesel::query_builder::QueryId + Send {
+       use crate::schema::events::dsl::*;
+       diesel::insert_into(crate::schema::events::table)
+           .values(items_to_insert)
+           .on_conflict((transaction_version, event_index))
+           .do_nothing()
+   }
+   ```
+2. `process` is a helper function that wraps around a regular processor.
+   In the background, this powerful function handles connecting to Transaction Stream, processing transactions given a transform function that you define, applying database migrations, and tracking the processor's status.
+
+   ```rust
+   process(
+       "events_processor".to_string(), // name of the processor that will be used to track the processor status
+       MIGRATIONS, // migrations to be applied to the database
+       async |transactions, conn_pool| {
+           // transform from transaction to events and insert the events into the database
+       },
+   ).await?;
+   ```
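
To make the transform concrete, here is a sketch of what that closure body could look like, reusing `insert_events_query` from step 1. `process_events` is a stand-in for whatever extraction logic you write, and `execute_in_chunks` / `MAX_DIESEL_PARAM_SIZE` are borrowed from the migration example later in this guide:

```rust
process(
    "events_processor".to_string(),
    MIGRATIONS,
    async |transactions, conn_pool| {
        // Extract event rows from this batch of transactions (your logic here).
        let events: Vec<EventModel> = process_events(transactions);

        // Insert in chunks so each statement stays under Postgres' bind-parameter limit.
        let execute_res = execute_in_chunks(
            conn_pool.clone(),
            insert_events_query,
            &events,
            MAX_DIESEL_PARAM_SIZE / EventModel::field_count(),
        )
        .await;
    },
)
.await?;
```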
+
+## Run the processor
 
 With the `config.yaml` you created earlier, you’re ready to run the events processor:
 
 ```shellscript
-cd aptos-indexer-processor-example
+cd examples/postgres-basic-events-example
 cargo run --release -- -c config.yaml
 ```
 
@@ -214,30 +235,86 @@ You should see the processor start to index Aptos blockchain events!
 {"timestamp":"2024-08-15T01:06:35.257801Z","level":"INFO","message":"Finished processing events from versions [0, 4999]","filename":"src/processors/events/events_processor.rs","line_number":90,"threadName":"tokio-runtime-worker","threadId":"ThreadId(17)"}
 ```
 
-# Customizing the processor
+## Customize the processor
 
 In most cases, you want to index events from your own contracts. The example processor offers a good starting point to
 creating your own custom processor.
 
-To customize the processor to index events from your custom contract, you can make change in these places:
+To customize the processor to index events from your custom contract, you can make these changes:
 
-- `EventsExtractor`
-  - In `process()`, you can filter by specific event types and extract specific event data from your custom contract
-- `EventsStorer`
+1. Change the database schema to a format that better matches your dapp or API.
+   a. Create a new migration with diesel:
 
-  - If you need to change the database model, you can generate a new database migration by going to `src/db/postgres`
-    and running
-
-```shellscript
+   ```shellscript
 diesel migration generate {migration_name}
-```
+   ```
 
-  - Add your migration changes to `up.sql` and `down.sql`, then run
+   b. Add your migration changes to `up.sql` and `down.sql`, then apply the migration:
 
-```shellscript
+   ```shellscript
 diesel migration run --database-url={YOUR_DATABASE_URL}
-```
+   ```
+
+   c. The `schema.rs` file will be updated automatically. You can then create a diesel query that uses the new schema.
+2\. Update the transform logic in `process()`. You can filter by specific event types and extract specific event data from your custom contract, as sketched below.
 
-to update `schema.rs`.
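
As an illustration of step 2, filtering the extracted events down to one event type from your contract might look like this sketch; the address and module in `MY_EVENT_TYPE` are hypothetical, and the `indexed_type` field comes from the events schema above:

```rust
// Hypothetical fully qualified event type emitted by your contract.
const MY_EVENT_TYPE: &str = "0x123::my_module::MyEvent";

// Keep only the events whose indexed type matches your contract's event.
let my_events: Vec<EventModel> = events
    .into_iter()
    .filter(|event| event.indexed_type == MY_EVENT_TYPE)
    .collect();
```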
+## Migrate from legacy processors
+
+If you're migrating from the legacy processors, you can still start with the same steps above to create a new processor with the Indexer SDK.
+
+You'll also need to follow these steps:
+
+1. Copy your migration files to `src/db/`.
+2. With the legacy processors, the processing logic is defined inside the `process_transactions` method.
+
+   ```rust
+   // Example with the legacy processors
+   #[async_trait]
+   impl ProcessorTrait for EventsProcessor {
+       async fn process_transactions(
+           ...
+       ) -> anyhow::Result<ProcessingResult> {
+           // Extract events from transactions
+           let events: Vec<EventModel> = process_events(transactions);
+
+           // Store the events in the database
+           let tx_result = insert_to_db(
+               self.get_pool(),
+               self.name(),
+               start_version,
+               end_version,
+               &events,
+               &self.per_table_chunk_sizes,
+           )
+           .await;
+
+           return tx_result;
+       }
+   }
+   ```
+
+   Migrate to the SDK by copying the logic in the `process_transactions` method over to the SDK's `process` transform function.
+
+   ```rust
+   // Example with the SDK processor
+   process(
+       "events_processor".to_string(),
+       MIGRATIONS,
+       async |transactions, conn_pool| {
+           // Extract events from transactions
+           let events: Vec<EventModel> = process_events(transactions);
+
+           // Store events in the database
+           let execute_res = execute_in_chunks(
+               conn_pool.clone(),
+               insert_events_query,
+               &events,
+               MAX_DIESEL_PARAM_SIZE / EventModel::field_count(),
+           )
+           .await;
+       },
+   )
+   .await?;
+   ```
 
-- And then update the `EventsStorer.process()` to handle storing the events data to the updated database model
+3. Update the `config.yaml` file to the new format. Update `starting_version` to the version that is last saved in the `processor_status` table.

src/content/docs/build/sdks/ts-sdk/building-transactions/script-composer.mdx

Lines changed: 1 addition & 3 deletions
@@ -1,9 +1,7 @@
 ---
-title: "Dynamically invoke chains of Move calls with ScriptComposer"
+title: "Invoke chains of Move calls with Dynamic Script Composer"
 ---
 
-This is so far an **experimental** feature that is only released to [this particular](https://www.npmjs.com/package/@aptos-labs/ts-sdk/v/1.33.0-sc.0) version of typescript SDK and is subject to change.
-
 In the naive api, you only get to specify one entry function to invoke for one transaction. An advanced builder might want to be able to invoke multiple **public** Move functions inside one transaction. This is now enabled by the new `scriptComposer` api provided in the transaction builder.
 
 Here's an example of how you can invoke the api:
