lightning: set the 17th bit of the txn_source to indicate the write source is lightning physical mode import #57706

Open · wants to merge 5 commits into master

Changes from 2 commits
3 changes: 3 additions & 0 deletions pkg/kv/option.go
@@ -240,6 +240,9 @@ const (
 	LossyDDLColumnReorgSource = 1
 	lossyDDLReorgSourceMax    = (1 << lossyDDLReorgSourceBits) - 1
 	lossyDDLReorgSourceShift  = cdcWriteSourceBits
+
+	// LightningPhysicalImportTxnSource is the txn source for Lightning physical-mode import: the 17th bit (1 << 16) of the txn source field.
+	LightningPhysicalImportTxnSource = 1 << 16
 )

 // SetCDCWriteSource sets the TiCDC write source in the txnSource.
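Aside: a minimal sketch, not part of this PR, of how code that reads a txn source value might test the new bit. The helper name below is hypothetical; only the constant mirrors what the PR adds.

```go
package main

import "fmt"

// Mirrors the constant added in pkg/kv/option.go: bit 17 (value 1 << 16)
// of the txn source marks a Lightning physical-mode import.
const LightningPhysicalImportTxnSource uint64 = 1 << 16

// isLightningPhysicalImport is a hypothetical helper that reports whether
// a txn source value carries the Lightning physical-import bit.
func isLightningPhysicalImport(txnSource uint64) bool {
	return txnSource&LightningPhysicalImportTxnSource != 0
}

func main() {
	fmt.Println(isLightningPhysicalImport(1 << 16)) // true
	fmt.Println(isLightningPhysicalImport(0))       // false
}
```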
27 changes: 16 additions & 11 deletions pkg/lightning/backend/local/region_job.go
@@ -314,6 +314,21 @@ func (local *Backend) writeToTiKV(ctx context.Context, j *regionJob) error {
 	return err
 }

+func newWriteRequest(meta *sst.SSTMeta, resourceGroupName, taskType string) *sst.WriteRequest {
+	return &sst.WriteRequest{
+		Chunk: &sst.WriteRequest_Meta{
+			Meta: meta,
+		},
+		Context: &kvrpcpb.Context{
+			ResourceControlContext: &kvrpcpb.ResourceControlContext{
+				ResourceGroupName: resourceGroupName,
+			},
+			RequestSource: util.BuildRequestSource(true, kv.InternalTxnLightning, taskType),
+			TxnSource:     kv.LightningPhysicalImportTxnSource,
+		},
+	}
+}
+
 func (local *Backend) doWrite(ctx context.Context, j *regionJob) error {
 	if j.stage != regionScanned {
 		return nil
Review thread on the TxnSource line:

Contributor:

Please post the manual test result; I'm not sure the TiKV import service will use this value.

Also, maybe we can add an SST table / block property so TiKV can skip scanning the whole SST and reduce IO.

Contributor (author):

> maybe we can add an SST table / block property so TiKV can skip scanning the whole SST and reduce IO.

That sounds great, but it looks more complicated, since we would have to consider cases such as region merge or split. Also, is it possible for a single SST to contain both Lightning physically imported data and normally inserted data at the same time?

Contributor (author):

> please post the manual test result. I'm not sure the TiKV import service will use this value.

Yes, this PR should only be merged after testing; I will post the manual test result.

https://github.com/tikv/tikv/pull/17895/files — this TiKV PR uses the txn_source to initialize the txn_sst_writer.

Contributor:

> That sounds great, but it looks more complicated, since we would have to consider cases such as region merge or split. Also, is it possible for a single SST to contain both Lightning physically imported data and normally inserted data at the same time?

Yes, it would just be a fast path. If SST merging has happened, we have to scan all the KVs.
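To illustrate the fast-path idea from the thread above, here is a sketch under stated assumptions: TiKV itself is written in Rust, so this Go snippet only models the logic, and all types and names in it are hypothetical, not an existing API.

```go
package main

import "fmt"

const lightningPhysicalImportTxnSource = 1 << 16

// sstProps models the hypothetical SST table/block property proposed in
// the review: a flag recorded at write time meaning "every key in this
// SST came from a Lightning physical import". A compaction that merges
// mixed SSTs would have to clear it.
type sstProps struct {
	allFromLightningImport bool
}

type entry struct {
	key       string
	txnSource uint64
}

// lightningEntries returns the entries written by Lightning physical import.
// Fast path: trust the property and skip the per-entry scan entirely.
// Slow path: once a region merge/split has produced a mixed SST, scan
// every entry and test its txn source bit.
func lightningEntries(props sstProps, entries []entry) []entry {
	if props.allFromLightningImport {
		return entries // skip the scan entirely
	}
	var out []entry
	for _, e := range entries {
		if e.txnSource&lightningPhysicalImportTxnSource != 0 {
			out = append(out, e)
		}
	}
	return out
}

func main() {
	mixed := []entry{
		{key: "a", txnSource: lightningPhysicalImportTxnSource},
		{key: "b", txnSource: 0}, // normally inserted row
	}
	fmt.Println(len(lightningEntries(sstProps{}, mixed))) // 1: slow path scans
}
```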

@@ -396,17 +411,7 @@ func (local *Backend) doWrite(ctx context.Context, j *regionJob) error {
 	leaderID := j.region.Leader.GetId()
 	clients := make([]sst.ImportSST_WriteClient, 0, len(region.GetPeers()))
 	allPeers := make([]*metapb.Peer, 0, len(region.GetPeers()))
-	req := &sst.WriteRequest{
-		Chunk: &sst.WriteRequest_Meta{
-			Meta: meta,
-		},
-		Context: &kvrpcpb.Context{
-			ResourceControlContext: &kvrpcpb.ResourceControlContext{
-				ResourceGroupName: local.ResourceGroupName,
-			},
-			RequestSource: util.BuildRequestSource(true, kv.InternalTxnLightning, local.TaskType),
-		},
-	}
+	req := newWriteRequest(meta, local.ResourceGroupName, local.TaskType)
 	for _, peer := range region.GetPeers() {
 		cli, err := clientFactory.create(ctx, peer.StoreId)
 		if err != nil {
6 changes: 6 additions & 0 deletions pkg/lightning/backend/local/region_job_test.go
@@ -16,6 +16,7 @@ package local
 import (
 	"context"
+	"github.com/pingcap/tidb/pkg/kv"
 	"math/rand"
 	"sync"
 	"testing"
@@ -536,6 +537,11 @@ func TestCancelBalancer(t *testing.T) {
 	jobWg.Wait()
 }

+func TestNewWriteRequest(t *testing.T) {
+	req := newWriteRequest(&sst.SSTMeta{}, "", "")
+	require.Equal(t, uint64(kv.LightningPhysicalImportTxnSource), req.Context.TxnSource)
+}
+
 func TestStoreBalancerNoRace(t *testing.T) {
 	jobToWorkerCh := make(chan *regionJob)
 	jobFromWorkerCh := make(chan *regionJob)