source commons files
source engines files
source kubeblocks files
source kubedb files
CLUSTER_NAME:
`kubectl get namespace | grep ns-niaem`
`kubectl create namespace ns-niaem`
namespace/ns-niaem created
create namespace ns-niaem done
download kbcli
`gh release list --repo apecloud/kbcli --limit 100 | (grep "1.0" || true)`
`curl -fsSL https://kubeblocks.io/installer/install_cli.sh | bash -s v1.0.1`
Your system is linux_amd64
Installing kbcli ...
Downloading ...
kbcli installed successfully.
Kubernetes: v1.32.6
KubeBlocks: 1.0.1
kbcli: 1.0.1
Make sure your docker service is running and begin your journey with kbcli:

	kbcli playground init

For more information on how to get started, please visit:
https://kubeblocks.io
download kbcli v1.0.1 done
Kubernetes Env: v1.32.6
check snapshot controller
check snapshot controller done
POD_RESOURCES:
aks kb-default-sc found
aks default-vsc found
found default storage class: default
KubeBlocks version is:1.0.1 skip upgrade KubeBlocks
current KubeBlocks version: 1.0.1
Error: no repositories to show
`helm repo add chaos-mesh https://charts.chaos-mesh.org`
"chaos-mesh" has been added to your repositories
add helm chart repo chaos-mesh success
chaos mesh already installed
check component definition
check component definition
check component definition
check component definition
set component name:etcd
set etcd component definition
set etcd component definition etcd-3-1.0.1
LIMIT_CPU:0.1
LIMIT_MEMORY:0.5
storage size: 1
CLUSTER_NAME:etcdm-mgvllt
No resources found in ns-niaem namespace.
pod_info:
create 1 replica WipeOut etcd cluster
check component definition
set component definition by component version
no component definitions found
apiVersion: apps.kubeblocks.io/v1
kind: Cluster
metadata:
  name: etcdm-mgvllt
  namespace: ns-niaem
spec:
  terminationPolicy: WipeOut
  componentSpecs:
    - name: etcd
      componentDef: etcd-3-1.0.1
      tls: false
      replicas: 1
      resources:
        requests:
          cpu: 100m
          memory: 0.5Gi
        limits:
          cpu: 100m
          memory: 0.5Gi
      volumeClaimTemplates:
        - name: data
          spec:
            storageClassName:
            accessModes:
              - ReadWriteOnce
            resources:
              requests:
                storage: 1Gi
  services:
    - name: client
      serviceName: client
      spec:
        type: NodePort
        ports:
          - port: 2379
            targetPort: 2379
      componentSelector: etcd
      roleSelector: leader
`kubectl apply -f test_create_etcdm-mgvllt.yaml`
cluster.apps.kubeblocks.io/etcdm-mgvllt created
apply test_create_etcdm-mgvllt.yaml Success
`rm -rf test_create_etcdm-mgvllt.yaml`
check cluster status
`kbcli cluster list etcdm-mgvllt --show-labels --namespace ns-niaem`
NAME           NAMESPACE   CLUSTER-DEFINITION   TERMINATION-POLICY   STATUS     CREATED-TIME                 LABELS
etcdm-mgvllt   ns-niaem                         WipeOut              Creating   Sep 11,2025 17:27 UTC+0800
cluster_status:Creating
check cluster status done
cluster_status:Running
check pod status
`kbcli cluster list-instances etcdm-mgvllt --namespace ns-niaem`
NAME                  NAMESPACE   CLUSTER        COMPONENT   STATUS    ROLE     ACCESSMODE   AZ   CPU(REQUEST/LIMIT)   MEMORY(REQUEST/LIMIT)   STORAGE    NODE                                             CREATED-TIME
etcdm-mgvllt-etcd-0   ns-niaem    etcdm-mgvllt   etcd        Running   leader                0    100m / 100m          512Mi / 512Mi           data:1Gi   aks-cicdamdpool-12089392-vmss000000/10.224.0.5   Sep 11,2025 17:27 UTC+0800
check pod status done
No resources found in ns-niaem namespace.
check cluster connect
`echo 'etcdctl --endpoints=http://etcdm-mgvllt-client.ns-niaem.svc.cluster.local:2379 endpoint health' | kubectl exec -it etcdm-mgvllt-etcd-0 --namespace ns-niaem -- bash`
check cluster connect done
`kubectl get secrets -l app.kubernetes.io/instance=etcdm-mgvllt`
No resources found in ns-niaem namespace.
Not found cluster secret
DB_USERNAME:;DB_PASSWORD:;DB_PORT:2379;DB_DATABASE:
There is no password in Type: 15.
check component definition
set component name:kafka-combine
LIMIT_CPU:0.5
LIMIT_MEMORY:1
storage size: 5
CLUSTER_NAME:kafkam-mgvllt
No resources found in ns-niaem namespace.
pod_info:
create 1 replica WipeOut kafka cluster
check component definition
set component definition by component version
no component definitions found
apiVersion: apps.kubeblocks.io/v1
kind: Cluster
metadata:
  name: kafkam-mgvllt
  namespace: ns-niaem
  annotations:
    "kubeblocks.io/extra-env": '{"KB_KAFKA_ENABLE_SASL":"false","KB_KAFKA_BROKER_HEAP":"-XshowSettings:vm -XX:MaxRAMPercentage=100 -Ddepth=64","KB_KAFKA_CONTROLLER_HEAP":"-XshowSettings:vm -XX:MaxRAMPercentage=100 -Ddepth=64","KB_KAFKA_PUBLIC_ACCESS":"false"}'
spec:
  clusterDef: kafka
  topology: combined
  terminationPolicy: WipeOut
  componentSpecs:
    - name: kafka-combine
      tls: false
      disableExporter: true
      replicas: 1
      serviceVersion: 3.3.2
      services:
        - name: advertised-listener
          serviceType: ClusterIP
          podService: true
      resources:
        requests:
          cpu: 500m
          memory: 1Gi
        limits:
          cpu: 500m
          memory: 1Gi
      env:
        - name: KB_BROKER_DIRECT_POD_ACCESS
          value: "false"
        - name: KB_KAFKA_ENABLE_SASL_SCRAM
          value: "false"
      volumeClaimTemplates:
        - name: data
          spec:
            storageClassName:
            accessModes:
              - ReadWriteOnce
            resources:
              requests:
                storage: 5Gi
        - name: metadata
          spec:
            storageClassName:
            accessModes:
              - ReadWriteOnce
            resources:
              requests:
                storage: 5Gi
`kubectl apply -f test_create_kafkam-mgvllt.yaml`
cluster.apps.kubeblocks.io/kafkam-mgvllt created
apply test_create_kafkam-mgvllt.yaml Success
`rm -rf test_create_kafkam-mgvllt.yaml`
check cluster
status
`kbcli cluster list kafkam-mgvllt --show-labels --namespace ns-niaem`
NAME            NAMESPACE   CLUSTER-DEFINITION   TERMINATION-POLICY   STATUS     CREATED-TIME                 LABELS
kafkam-mgvllt   ns-niaem    kafka                WipeOut              Creating   Sep 11,2025 17:29 UTC+0800   clusterdefinition.kubeblocks.io/name=kafka
cluster_status:Creating
check cluster status done
cluster_status:Running
check pod status
`kbcli cluster list-instances kafkam-mgvllt --namespace ns-niaem`
NAME                            NAMESPACE   CLUSTER         COMPONENT       STATUS    ROLE   ACCESSMODE   AZ   CPU(REQUEST/LIMIT)   MEMORY(REQUEST/LIMIT)   STORAGE        NODE                                             CREATED-TIME
kafkam-mgvllt-kafka-combine-0   ns-niaem    kafkam-mgvllt   kafka-combine   Running          0            500m / 500m          1Gi / 1Gi               data:5Gi       aks-cicdamdpool-12089392-vmss000001/10.224.0.7   Sep 11,2025 17:29 UTC+0800
                                                                                                                                                      metadata:5Gi
check pod status done
connect
unsupported engine type: kafka kafka
`kubectl get secrets -l app.kubernetes.io/instance=kafkam-mgvllt`
No resources found in ns-niaem namespace.
Not found cluster secret
DB_USERNAME:;DB_PASSWORD:;DB_PORT:9092;DB_DATABASE:
There is no password in Type: 7.
check component definition
set component name:kafka-combine
LIMIT_CPU:0.5
LIMIT_MEMORY:1
storage size: 5
CLUSTER_NAME:kafkam-mgvllt
pod_info:kafkam-mgvllt-kafka-combine-0 2/2 Running 0 72s
get cluster version
get cluster component definition
get cluster component service version
get service version: 3.3.2
get cluster component name
set component name:kafka-combine
get cluster storage size
set cluster mode:separated
cluster exists, skip create kafkam-mgvllt
check cluster status
`kbcli cluster list kafkam-mgvllt --show-labels --namespace ns-niaem`
NAME            NAMESPACE   CLUSTER-DEFINITION   TERMINATION-POLICY   STATUS    CREATED-TIME                 LABELS
kafkam-mgvllt   ns-niaem    kafka                WipeOut              Running   Sep 11,2025 17:29 UTC+0800   clusterdefinition.kubeblocks.io/name=kafka
check cluster status done
cluster_status:Running
check pod status
`kbcli cluster list-instances kafkam-mgvllt --namespace ns-niaem`
NAME                            NAMESPACE   CLUSTER         COMPONENT       STATUS    ROLE   ACCESSMODE   AZ   CPU(REQUEST/LIMIT)   MEMORY(REQUEST/LIMIT)   STORAGE        NODE                                             CREATED-TIME
kafkam-mgvllt-kafka-combine-0   ns-niaem    kafkam-mgvllt   kafka-combine   Running          0            500m / 500m          1Gi / 1Gi               data:5Gi       aks-cicdamdpool-12089392-vmss000001/10.224.0.7   Sep 11,2025 17:29 UTC+0800
                                                                                                                                                      metadata:5Gi
check pod status done
connect
unsupported engine type: kafka kafka-combine-1.0.1
check component definition
set component name:minio
set minio component definition
set minio component definition minio-1.0.1
LIMIT_CPU:0.5
LIMIT_MEMORY:1
storage size: 5
CLUSTER_NAME:miniom-mgvllt
No resources found in ns-niaem namespace.
pod_info:
create 2 replica WipeOut minio cluster
check component definition
set component definition by component version
no component definitions found
apiVersion: apps.kubeblocks.io/v1
kind: Cluster
metadata:
  name: miniom-mgvllt
  namespace: ns-niaem
spec:
  terminationPolicy: WipeOut
  componentSpecs:
    - name: minio
      componentDef: minio-1.0.1
      replicas: 2
      env:
        - name: MINIO_BUCKETS
          value: test
      disableExporter: true
      resources:
        limits:
          cpu: 500m
          memory: 1Gi
        requests:
          cpu: 500m
          memory: 1Gi
      volumeClaimTemplates:
        - name: data
          spec:
            storageClassName:
            accessModes:
              - ReadWriteOnce
            resources:
              requests:
                storage: 5Gi
`kubectl apply -f test_create_miniom-mgvllt.yaml`
cluster.apps.kubeblocks.io/miniom-mgvllt created
apply test_create_miniom-mgvllt.yaml Success
`rm -rf test_create_miniom-mgvllt.yaml`
check cluster status
`kbcli cluster list miniom-mgvllt --show-labels --namespace ns-niaem`
NAME            NAMESPACE   CLUSTER-DEFINITION   TERMINATION-POLICY   STATUS     CREATED-TIME                 LABELS
miniom-mgvllt   ns-niaem                         WipeOut              Creating   Sep 11,2025 17:30 UTC+0800
cluster_status:Creating
check cluster status done
cluster_status:Running
check pod status
`kbcli cluster list-instances miniom-mgvllt --namespace ns-niaem`
NAME                    NAMESPACE   CLUSTER         COMPONENT   STATUS    ROLE        ACCESSMODE   AZ   CPU(REQUEST/LIMIT)   MEMORY(REQUEST/LIMIT)   STORAGE    NODE                                             CREATED-TIME
miniom-mgvllt-minio-0   ns-niaem    miniom-mgvllt   minio       Running   readwrite                0    500m / 500m          1Gi / 1Gi               data:5Gi   aks-cicdamdpool-12089392-vmss000003/10.224.0.8   Sep 11,2025 17:30 UTC+0800
miniom-mgvllt-minio-1   ns-niaem    miniom-mgvllt   minio       Running   notready                 0    500m / 500m          1Gi / 1Gi               data:5Gi
aks-cicdamdpool-12089392-vmss000000/10.224.0.5   Sep 11,2025 17:36 UTC+0800
check pod status done
connect
unsupported engine type: minio minio-1.0.1
`kubectl get secrets -l app.kubernetes.io/instance=miniom-mgvllt`
`kubectl get secrets miniom-mgvllt-minio-account-root -o jsonpath="{.data.username}"`
`kubectl get secrets miniom-mgvllt-minio-account-root -o jsonpath="{.data.password}"`
`kubectl get secrets miniom-mgvllt-minio-account-root -o jsonpath="{.data.port}"`
DB_USERNAME:root;DB_PASSWORD:G42872E8FE33MXzg;DB_PORT:9000;DB_DATABASE:
check pod miniom-mgvllt-minio-0 container_name minio exist password G42872E8FE33MXzg
check pod miniom-mgvllt-minio-0 container_name kbagent exist password G42872E8FE33MXzg
No container logs contain secret password.
check component definition
set component name:milvus
set component version
set component version:milvus
set service versions:2.5.13,v2.3.2
set service versions sorted:v2.3.2,2.5.13
set milvus component definition
set milvus component definition milvus-standalone-1.0.1
REPORT_COUNT 0:0
set replicas first:1,v2.3.2|1,2.5.13
set replicas third:1,v2.3.2
set replicas fourth:1,v2.3.2
set minimum cmpv service version
set minimum cmpv service version replicas:1,v2.3.2
REPORT_COUNT:1
CLUSTER_TOPOLOGY:
set milvus component definition
set milvus component definition milvus-indexnode-1.0.1
LIMIT_CPU:0.5
LIMIT_MEMORY:0.5
storage size: 5
CLUSTER_NAME:milvus-mgvllt
No resources found in ns-niaem namespace.
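The jsonpath reads above return base64-encoded values from the Secret's `data` map, so they must be decoded before they can be used as credentials. A minimal sketch of the decode step, assuming coreutils `base64`; the kubectl line needs cluster access, so it is shown commented out and the round trip is demonstrated standalone:

```shell
# Secret values come back base64-encoded, e.g.:
#   kubectl get secrets miniom-mgvllt-minio-account-root \
#     -o jsonpath="{.data.password}" | base64 -d
# Standalone demonstration of the same decode step:
encoded=$(printf '%s' 'G42872E8FE33MXzg' | base64)
printf '%s\n' "$encoded" | base64 -d
# prints "G42872E8FE33MXzg"
```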
pod_info:
termination_policy:DoNotTerminate
create 1 replica DoNotTerminate milvus cluster
check component definition
set component definition by component version
check cmpd by labels
check cmpd by compDefs
set component definition: milvus-standalone-1.0.1 by component version:milvus
apiVersion: apps.kubeblocks.io/v1
kind: Cluster
metadata:
  name: milvus-mgvllt
  namespace: ns-niaem
spec:
  clusterDef: milvus
  topology: standalone
  terminationPolicy: DoNotTerminate
  services:
    - name: proxy
      serviceName: proxy
      componentSelector: milvus
      spec:
        type: ClusterIP
        ports:
          - name: milvus
            port: 19530
            protocol: TCP
            targetPort: milvus
  componentSpecs:
    - name: milvus
      serviceVersion: v2.3.2
      disableExporter: true
      replicas: 1
      resources:
        requests:
          cpu: 500m
          memory: 0.5Gi
        limits:
          cpu: 500m
          memory: 0.5Gi
      volumeClaimTemplates:
        - name: data
          spec:
            storageClassName:
            accessModes:
              - ReadWriteOnce
            resources:
              requests:
                storage: 5Gi
    - name: etcd
      serviceVersion: 3.5.15
      replicas: 1
      disableExporter: true
      resources:
        limits:
          cpu: 500m
          memory: 0.5Gi
        requests:
          cpu: 500m
          memory: 0.5Gi
      volumeClaimTemplates:
        - name: data
          spec:
            storageClassName:
            accessModes:
              - ReadWriteOnce
            resources:
              requests:
                storage: 5Gi
    - name: minio
      serviceVersion: 8.0.17
      replicas: 1
      disableExporter: true
      resources:
        limits:
          cpu: 500m
          memory: 0.5Gi
        requests:
          cpu: 500m
          memory: 0.5Gi
      volumeClaimTemplates:
        - name: data
          spec:
            storageClassName:
            accessModes:
              - ReadWriteOnce
            resources:
              requests:
                storage: 5Gi
`kubectl apply -f test_create_milvus-mgvllt.yaml`
cluster.apps.kubeblocks.io/milvus-mgvllt created
apply test_create_milvus-mgvllt.yaml Success
`rm -rf test_create_milvus-mgvllt.yaml`
check cluster status
`kbcli cluster list milvus-mgvllt --show-labels --namespace ns-niaem`
NAME            NAMESPACE   CLUSTER-DEFINITION   TERMINATION-POLICY   STATUS     CREATED-TIME                 LABELS
milvus-mgvllt   ns-niaem    milvus               DoNotTerminate       Creating   Sep 11,2025 17:37 UTC+0800   clusterdefinition.kubeblocks.io/name=milvus
cluster_status:Creating
cluster_status:Creating
cluster_status:Updating
check cluster status done
cluster_status:Running
check pod status
`kbcli cluster list-instances milvus-mgvllt --namespace ns-niaem`
NAME                     NAMESPACE   CLUSTER         COMPONENT   STATUS    ROLE     ACCESSMODE   AZ   CPU(REQUEST/LIMIT)   MEMORY(REQUEST/LIMIT)   STORAGE    NODE                                             CREATED-TIME
milvus-mgvllt-etcd-0     ns-niaem    milvus-mgvllt   etcd        Running   leader                0    500m / 500m          512Mi / 512Mi           data:5Gi   aks-cicdamdpool-12089392-vmss000003/10.224.0.8   Sep 11,2025 17:37 UTC+0800
milvus-mgvllt-milvus-0   ns-niaem    milvus-mgvllt   milvus      Running                         0    500m / 500m          512Mi / 512Mi           data:5Gi   aks-cicdamdpool-12089392-vmss000001/10.224.0.7   Sep 11,2025 17:41 UTC+0800
milvus-mgvllt-minio-0    ns-niaem    milvus-mgvllt   minio       Running                         0    500m / 500m          512Mi / 512Mi           data:5Gi   aks-cicdamdpool-12089392-vmss000001/10.224.0.7   Sep 11,2025 17:37 UTC+0800
check pod status done
connect
unsupported engine type: milvus milvus-standalone-1.0.1
`kubectl get secrets -l app.kubernetes.io/instance=milvus-mgvllt`
`kubectl get secrets milvus-mgvllt-minio-account-admin -o jsonpath="{.data.username}"`
`kubectl get secrets milvus-mgvllt-minio-account-admin -o jsonpath="{.data.password}"`
`kubectl get secrets milvus-mgvllt-minio-account-admin -o jsonpath="{.data.port}"`
DB_USERNAME:admin;DB_PASSWORD:scl0RA4b8pXc1omg;DB_PORT:19530;DB_DATABASE:
check pod milvus-mgvllt-milvus-0 container_name milvus exist password scl0RA4b8pXc1omg
No container logs contain secret password.
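The harness concatenates the decoded credentials into a single `DB_USERNAME:...;DB_PASSWORD:...` string. Pulling a field back out of that layout is a one-liner; a sketch, assuming the `key:value;key:value` format shown in the log (not part of the harness itself):

```shell
# connection string in the log's key:value;key:value layout
info='DB_USERNAME:admin;DB_PASSWORD:scl0RA4b8pXc1omg;DB_PORT:19530;DB_DATABASE:'
# split fields on ';', then match on the key before the first ':'
user=$(printf '%s' "$info" | tr ';' '\n' | awk -F: '$1 == "DB_USERNAME" { print $2 }')
port=$(printf '%s' "$info" | tr ';' '\n' | awk -F: '$1 == "DB_PORT" { print $2 }')
echo "$user:$port"
# prints "admin:19530"
```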
describe cluster
`kbcli cluster describe milvus-mgvllt --namespace ns-niaem`
Name: milvus-mgvllt	 Created Time: Sep 11,2025 17:37 UTC+0800
NAMESPACE   CLUSTER-DEFINITION   TOPOLOGY     STATUS    TERMINATION-POLICY
ns-niaem    milvus               standalone   Running   DoNotTerminate

Endpoints:
COMPONENT   INTERNAL   EXTERNAL

Topology:
COMPONENT   SERVICE-VERSION   INSTANCE                 ROLE     STATUS    AZ   NODE                                             CREATED-TIME
etcd        3.5.15            milvus-mgvllt-etcd-0     leader   Running   0    aks-cicdamdpool-12089392-vmss000003/10.224.0.8   Sep 11,2025 17:37 UTC+0800
milvus      v2.3.2            milvus-mgvllt-milvus-0            Running   0    aks-cicdamdpool-12089392-vmss000001/10.224.0.7   Sep 11,2025 17:41 UTC+0800
minio       8.0.17            milvus-mgvllt-minio-0             Running   0    aks-cicdamdpool-12089392-vmss000001/10.224.0.7   Sep 11,2025 17:37 UTC+0800

Resources Allocation:
COMPONENT   INSTANCE-TEMPLATE   CPU(REQUEST/LIMIT)   MEMORY(REQUEST/LIMIT)   STORAGE-SIZE   STORAGE-CLASS
etcd                            500m / 500m          512Mi / 512Mi           data:5Gi       default
minio                           500m / 500m          512Mi / 512Mi           data:5Gi       default
milvus                          500m / 500m          512Mi / 512Mi           data:5Gi       default

Images:
COMPONENT   COMPONENT-DEFINITION      IMAGE
etcd        etcd-3-1.0.1              docker.io/apecloud/etcd:v3.5.15
minio       milvus-minio-1.0.1        docker.io/apecloud/minio:RELEASE.2022-03-17T06-34-49Z
milvus      milvus-standalone-1.0.1   docker.io/apecloud/milvus:v2.3.2

Data Protection:
BACKUP-REPO   AUTO-BACKUP   BACKUP-SCHEDULE   BACKUP-METHOD   BACKUP-RETENTION   RECOVERABLE-TIME

Show cluster events: kbcli cluster list-events -n ns-niaem milvus-mgvllt

`kbcli cluster label milvus-mgvllt app.kubernetes.io/instance- --namespace ns-niaem`
label "app.kubernetes.io/instance" not found.
`kbcli cluster label milvus-mgvllt app.kubernetes.io/instance=milvus-mgvllt --namespace ns-niaem`
`kbcli cluster label milvus-mgvllt --list --namespace ns-niaem`
NAME            NAMESPACE   LABELS
milvus-mgvllt   ns-niaem    app.kubernetes.io/instance=milvus-mgvllt clusterdefinition.kubeblocks.io/name=milvus
label cluster app.kubernetes.io/instance=milvus-mgvllt Success
`kbcli cluster label case.name=kbcli.test1 -l app.kubernetes.io/instance=milvus-mgvllt --namespace ns-niaem`
`kbcli cluster label milvus-mgvllt --list --namespace ns-niaem`
NAME            NAMESPACE   LABELS
milvus-mgvllt   ns-niaem    app.kubernetes.io/instance=milvus-mgvllt case.name=kbcli.test1 clusterdefinition.kubeblocks.io/name=milvus
label cluster case.name=kbcli.test1 Success
`kbcli cluster label milvus-mgvllt case.name=kbcli.test2 --overwrite --namespace ns-niaem`
`kbcli cluster label milvus-mgvllt --list --namespace ns-niaem`
NAME            NAMESPACE   LABELS
milvus-mgvllt   ns-niaem    app.kubernetes.io/instance=milvus-mgvllt case.name=kbcli.test2 clusterdefinition.kubeblocks.io/name=milvus
label cluster case.name=kbcli.test2 Success
`kbcli cluster label milvus-mgvllt case.name- --namespace ns-niaem`
`kbcli cluster label milvus-mgvllt --list --namespace ns-niaem`
NAME            NAMESPACE   LABELS
milvus-mgvllt   ns-niaem    app.kubernetes.io/instance=milvus-mgvllt clusterdefinition.kubeblocks.io/name=milvus
delete cluster label case.name Success
cluster connect
insert batch data by db client
Error from server (NotFound): pods "test-db-client-executionloop-milvus-mgvllt" not found
`kubectl patch -p '{"metadata":{"finalizers":[]}}' --type=merge pods test-db-client-executionloop-milvus-mgvllt --namespace ns-niaem`
Error from server (NotFound): pods "test-db-client-executionloop-milvus-mgvllt" not found
Warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
Error from server (NotFound): pods "test-db-client-executionloop-milvus-mgvllt" not found
`kubectl get secrets -l app.kubernetes.io/instance=milvus-mgvllt`
`kubectl get secrets milvus-mgvllt-minio-account-admin -o jsonpath="{.data.username}"`
`kubectl get secrets milvus-mgvllt-minio-account-admin -o jsonpath="{.data.password}"`
`kubectl get secrets milvus-mgvllt-minio-account-admin -o jsonpath="{.data.port}"`
DB_USERNAME:admin;DB_PASSWORD:scl0RA4b8pXc1omg;DB_PORT:19530;DB_DATABASE:
No resources found in ns-niaem namespace.
apiVersion: v1
kind: Pod
metadata:
  name: test-db-client-executionloop-milvus-mgvllt
  namespace: ns-niaem
spec:
  containers:
    - name: test-dbclient
      imagePullPolicy: IfNotPresent
      image: docker.io/apecloud/dbclient:test
      args:
        - "--host"
        - "milvus-mgvllt-proxy.ns-niaem.svc.cluster.local"
        - "--user"
        - "admin"
        - "--password"
        - "scl0RA4b8pXc1omg"
        - "--port"
        - "19530"
        - "--dbtype"
        - "milvus"
        - "--test"
        - "executionloop"
        - "--duration"
        - "60"
        - "--interval"
        - "1"
  restartPolicy: Never
`kubectl apply -f test-db-client-executionloop-milvus-mgvllt.yaml`
pod/test-db-client-executionloop-milvus-mgvllt created
apply test-db-client-executionloop-milvus-mgvllt.yaml Success
`rm -rf test-db-client-executionloop-milvus-mgvllt.yaml`
check pod status
pod_status:
NAME                                         READY   STATUS    RESTARTS   AGE
test-db-client-executionloop-milvus-mgvllt   1/1     Running   0          5s
pod_status: test-db-client-executionloop-milvus-mgvllt 1/1 Running 0 9s
pod_status: test-db-client-executionloop-milvus-mgvllt 1/1 Running 0 15s
pod_status: test-db-client-executionloop-milvus-mgvllt 1/1 Running 0 20s
pod_status: test-db-client-executionloop-milvus-mgvllt 1/1 Running 0 25s
pod_status: test-db-client-executionloop-milvus-mgvllt 1/1 Running 0 30s
pod_status: test-db-client-executionloop-milvus-mgvllt 1/1 Running 0 35s
pod_status: test-db-client-executionloop-milvus-mgvllt 1/1 Running 0 40s
pod_status: test-db-client-executionloop-milvus-mgvllt 1/1 Running 0 45s
pod_status: test-db-client-executionloop-milvus-mgvllt 1/1 Running 0 51s
pod_status: test-db-client-executionloop-milvus-mgvllt 1/1 Running 0 56s
pod_status: test-db-client-executionloop-milvus-mgvllt 1/1 Running 0 61s
check pod test-db-client-executionloop-milvus-mgvllt status done
pod_status: test-db-client-executionloop-milvus-mgvllt 0/1 Completed 0 66s
check cluster status
`kbcli cluster list milvus-mgvllt --show-labels --namespace ns-niaem`
NAME            NAMESPACE   CLUSTER-DEFINITION   TERMINATION-POLICY   STATUS    CREATED-TIME                 LABELS
milvus-mgvllt   ns-niaem    milvus               DoNotTerminate       Running   Sep 11,2025 17:37 UTC+0800   app.kubernetes.io/instance=milvus-mgvllt,clusterdefinition.kubeblocks.io/name=milvus
check cluster status done
cluster_status:Running
check pod status
`kbcli cluster list-instances milvus-mgvllt --namespace ns-niaem`
NAME                     NAMESPACE   CLUSTER         COMPONENT   STATUS    ROLE     ACCESSMODE   AZ   CPU(REQUEST/LIMIT)   MEMORY(REQUEST/LIMIT)   STORAGE    NODE                                             CREATED-TIME
milvus-mgvllt-etcd-0     ns-niaem    milvus-mgvllt   etcd        Running   leader                0    500m / 500m          512Mi / 512Mi           data:5Gi   aks-cicdamdpool-12089392-vmss000003/10.224.0.8   Sep 11,2025 17:37 UTC+0800
milvus-mgvllt-milvus-0   ns-niaem    milvus-mgvllt   milvus      Running                         0    500m / 500m          512Mi / 512Mi           data:5Gi   aks-cicdamdpool-12089392-vmss000001/10.224.0.7   Sep 11,2025 17:41 UTC+0800
milvus-mgvllt-minio-0    ns-niaem    milvus-mgvllt   minio       Running                         0    500m / 500m          512Mi / 512Mi           data:5Gi   aks-cicdamdpool-12089392-vmss000001/10.224.0.7   Sep 11,2025 17:37 UTC+0800
check pod status done
connect
unsupported engine type: milvus milvus-standalone-1.0.1
--host milvus-mgvllt-proxy.ns-niaem.svc.cluster.local
--user admin
--password scl0RA4b8pXc1omg
--port 19530
--dbtype milvus
--test executionloop
--duration 60
--interval 1
SLF4J(I): Connected with provider of type [ch.qos.logback.classic.spi.LogbackServiceProvider]
Execution loop start: Collection executions_loop_collection does not exist. Creating collection...
Collection executions_loop_collection created successfully.
Execution loop start: insert:executions_loop_collection:10::1:executions_loop_1
[ 1s ] executions total: 216 successful: 216 failed: 0 disconnect: 0
[ 2s ] executions total: 544 successful: 544 failed: 0 disconnect: 0
[ 3s ] executions total: 875 successful: 875 failed: 0 disconnect: 0
[ 4s ] executions total: 1232 successful: 1232 failed: 0 disconnect: 0
[ 5s ] executions total: 1576 successful: 1576 failed: 0 disconnect: 0
[ 6s ] executions total: 1914 successful: 1914 failed: 0 disconnect: 0
[ 7s ] executions total: 2250 successful: 2250 failed: 0 disconnect: 0
[ 8s ] executions total: 2588 successful: 2588 failed: 0 disconnect: 0
[ 9s ] executions total: 2917 successful: 2917 failed: 0 disconnect: 0
[ 10s ] executions total: 3276 successful: 3276 failed: 0 disconnect: 0
[ 11s ] executions total: 3631 successful: 3631 failed: 0 disconnect: 0
[ 12s ] executions total: 3978 successful: 3978 failed: 0 disconnect: 0
[ 13s ] executions total: 4317 successful: 4317 failed: 0 disconnect: 0
[ 14s ] executions total: 4663 successful: 4663 failed: 0 disconnect: 0
[ 15s ] executions total: 4989 successful: 4989 failed: 0 disconnect: 0
[ 16s ] executions total: 5345 successful: 5345 failed: 0 disconnect: 0
[ 17s ] executions total: 5697 successful: 5697 failed: 0 disconnect: 0
[ 18s ] executions total: 6062 successful: 6062 failed: 0 disconnect: 0
[ 19s ] executions total: 6393 successful: 6393 failed: 0 disconnect: 0
[ 20s ] executions total: 6751 successful: 6751 failed: 0 disconnect: 0
[ 21s ] executions total: 7099 successful: 7099 failed: 0 disconnect: 0
[ 22s ] executions total: 7455 successful: 7455 failed: 0 disconnect: 0
[ 23s ] executions total: 7805 successful: 7805 failed: 0 disconnect: 0
[ 24s ] executions total: 8148 successful: 8148 failed: 0 disconnect: 0
[ 25s ] executions total: 8501 successful: 8501 failed: 0 disconnect: 0
[ 26s ] executions total: 8849 successful: 8849 failed: 0 disconnect: 0
[ 27s ] executions total: 9210 successful: 9210 failed: 0 disconnect: 0
[ 28s ] executions total: 9552 successful: 9552 failed: 0 disconnect: 0
[ 29s ] executions total: 9897 successful: 9897 failed: 0 disconnect: 0
[ 30s ] executions total: 10247 successful: 10247 failed: 0 disconnect: 0
[ 31s ] executions total: 10592 successful: 10592 failed: 0 disconnect: 0
[ 32s ] executions total: 10921 successful: 10921 failed: 0 disconnect: 0
[ 33s ] executions total: 11266 successful: 11266 failed: 0 disconnect: 0
[ 34s ] executions total: 11616 successful: 11616 failed: 0 disconnect: 0
[ 35s ] executions total: 11979 successful: 11979 failed: 0 disconnect: 0
[ 36s ] executions total: 12319 successful: 12319 failed: 0 disconnect: 0
[ 37s ] executions total: 12664 successful: 12664 failed: 0 disconnect: 0
[ 38s ] executions total: 13019 successful: 13019 failed: 0 disconnect: 0
[ 39s ] executions total: 13351 successful: 13351 failed: 0 disconnect: 0
[ 40s ] executions total: 13696 successful: 13696 failed: 0 disconnect: 0
[ 41s ] executions total: 14038 successful: 14038 failed: 0 disconnect: 0
[ 42s ] executions total: 14385 successful: 14385 failed: 0 disconnect: 0
[ 43s ] executions total: 14729 successful: 14729 failed: 0 disconnect: 0
[ 44s ] executions total: 15101 successful: 15101 failed: 0 disconnect: 0
[ 45s ] executions total: 15443 successful: 15443 failed: 0 disconnect: 0
[ 46s ] executions total: 15772 successful: 15772 failed: 0 disconnect: 0
[ 47s ] executions total: 16102 successful: 16102 failed: 0 disconnect: 0
[ 48s ] executions total: 16452 successful: 16452 failed: 0 disconnect: 0
[ 49s ] executions total: 16799 successful: 16799 failed: 0 disconnect: 0
[ 50s ] executions total: 17135 successful: 17135 failed: 0 disconnect: 0
[ 51s ] executions total: 17474 successful: 17474 failed: 0 disconnect: 0
[ 52s ] executions total: 17816 successful: 17816 failed: 0 disconnect: 0
[ 53s ] executions total: 18173 successful: 18173 failed: 0 disconnect: 0
[ 54s ] executions total: 18526 successful: 18526 failed: 0 disconnect: 0
[ 55s ] executions total: 18858 successful: 18858 failed: 0 disconnect: 0
[ 56s ] executions total: 19200 successful: 19200 failed: 0 disconnect: 0
[ 57s ] executions total: 19572 successful: 19572 failed: 0 disconnect: 0
[ 58s ] executions total: 19901 successful: 19901 failed: 0 disconnect: 0
[ 59s ] executions total: 20247 successful: 20247 failed: 0 disconnect: 0
[ 60s ] executions total: 20564 successful: 20564 failed: 0 disconnect: 0
Test Result:
Total Executions: 20564
Successful Executions: 20564
Failed Executions: 0
Disconnection Counts: 0
Connection Information:
Database Type: milvus
Host: milvus-mgvllt-proxy.ns-niaem.svc.cluster.local
Port: 19530
Database:
Table:
User: admin
Org:
Access Mode: mysql
Test Type: executionloop
Query:
Duration: 60 seconds
Interval: 1 seconds
DB_CLIENT_BATCH_DATA_COUNT: 20564
`kubectl patch -p '{"metadata":{"finalizers":[]}}' --type=merge pods test-db-client-executionloop-milvus-mgvllt --namespace ns-niaem`
pod/test-db-client-executionloop-milvus-mgvllt patched (no change)
Warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
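As a quick sanity check on the summary totals (20564 executions over a 60-second run), the sustained insert rate can be derived with a one-line awk computation; a minimal sketch, not part of the harness:

```shell
# derive average throughput from the Test Result totals
total=20564
duration=60
awk -v t="$total" -v d="$duration" 'BEGIN { printf "%.1f executions/s\n", t/d }'
# prints "342.7 executions/s"
```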
pod "test-db-client-executionloop-milvus-mgvllt" force deleted
test failover connectionstress
check cluster status before cluster-failover-connectionstress
check cluster status done
cluster_status:Running
Error from server (NotFound): pods "test-db-client-connectionstress-milvus-mgvllt" not found
`kubectl patch -p '{"metadata":{"finalizers":[]}}' --type=merge pods test-db-client-connectionstress-milvus-mgvllt --namespace ns-niaem`
Error from server (NotFound): pods "test-db-client-connectionstress-milvus-mgvllt" not found
Warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
Error from server (NotFound): pods "test-db-client-connectionstress-milvus-mgvllt" not found
`kubectl get secrets -l app.kubernetes.io/instance=milvus-mgvllt`
`kubectl get secrets milvus-mgvllt-minio-account-admin -o jsonpath="{.data.username}"`
`kubectl get secrets milvus-mgvllt-minio-account-admin -o jsonpath="{.data.password}"`
`kubectl get secrets milvus-mgvllt-minio-account-admin -o jsonpath="{.data.port}"`
DB_USERNAME:admin;DB_PASSWORD:scl0RA4b8pXc1omg;DB_PORT:19530;DB_DATABASE:
No resources found in ns-niaem namespace.
apiVersion: v1
kind: Pod
metadata:
  name: test-db-client-connectionstress-milvus-mgvllt
  namespace: ns-niaem
spec:
  containers:
    - name: test-dbclient
      imagePullPolicy: IfNotPresent
      image: docker.io/apecloud/dbclient:test
      args:
        - "--host"
        - "milvus-mgvllt-proxy.ns-niaem.svc.cluster.local"
        - "--user"
        - "admin"
        - "--password"
        - "scl0RA4b8pXc1omg"
        - "--port"
        - "19530"
        - "--database"
        - ""
        - "--dbtype"
        - "milvus"
        - "--test"
        - "connectionstress"
        - "--connections"
        - "4096"
        - "--duration"
        - "60"
  restartPolicy: Never
`kubectl apply -f test-db-client-connectionstress-milvus-mgvllt.yaml`
pod/test-db-client-connectionstress-milvus-mgvllt created
apply test-db-client-connectionstress-milvus-mgvllt.yaml Success
`rm -rf test-db-client-connectionstress-milvus-mgvllt.yaml`
check pod status
pod_status:
NAME                                            READY   STATUS    RESTARTS   AGE
test-db-client-connectionstress-milvus-mgvllt   1/1     Running   0          6s
pod_status: test-db-client-connectionstress-milvus-mgvllt 1/1 Running 0 10s
check pod test-db-client-connectionstress-milvus-mgvllt status done
pod_status: test-db-client-connectionstress-milvus-mgvllt 0/1 Completed 0 15s
check cluster status
`kbcli cluster list milvus-mgvllt --show-labels --namespace ns-niaem`
NAME            NAMESPACE   CLUSTER-DEFINITION   TERMINATION-POLICY   STATUS     CREATED-TIME                 LABELS
milvus-mgvllt   ns-niaem    milvus               DoNotTerminate       Updating   Sep 11,2025 17:37 UTC+0800   app.kubernetes.io/instance=milvus-mgvllt,clusterdefinition.kubeblocks.io/name=milvus
cluster_status:Updating
check cluster status done
cluster_status:Running
check pod status
`kbcli cluster list-instances milvus-mgvllt
--namespace ns-niaem ` NAME NAMESPACE CLUSTER COMPONENT STATUS ROLE ACCESSMODE AZ CPU(REQUEST/LIMIT) MEMORY(REQUEST/LIMIT) STORAGE NODE CREATED-TIME milvus-mgvllt-etcd-0 ns-niaem milvus-mgvllt etcd Running leader 0 500m / 500m 512Mi / 512Mi data:5Gi aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 17:37 UTC+0800 milvus-mgvllt-milvus-0 ns-niaem milvus-mgvllt milvus Running 0 500m / 500m 512Mi / 512Mi data:5Gi aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 17:41 UTC+0800 milvus-mgvllt-minio-0 ns-niaem milvus-mgvllt minio Running 0 500m / 500m 512Mi / 512Mi data:5Gi aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 17:37 UTC+0800 check pod status done connect unsupported engine type: milvus milvus-standalone-1.0.1 --host milvus-mgvllt-proxy.ns-niaem.svc.cluster.local --user admin --password scl0RA4b8pXc1omg --port 19530 --database --dbtype milvus --test connectionstress --connections 4096 --duration 60 SLF4J(I): Connected with provider of type [ch.qos.logback.classic.spi.LogbackServiceProvider] Test execution failed: UNAVAILABLE: Network closed for unknown reason
io.grpc.StatusRuntimeException: UNAVAILABLE: Network closed for unknown reason
    at io.grpc.stub.ClientCalls.toStatusRuntimeException(ClientCalls.java:268)
    at io.grpc.stub.ClientCalls.getUnchecked(ClientCalls.java:249)
    at io.grpc.stub.ClientCalls.blockingUnaryCall(ClientCalls.java:167)
    at io.milvus.grpc.MilvusServiceGrpc$MilvusServiceBlockingStub.connect(MilvusServiceGrpc.java:5113)
    at io.milvus.v2.client.MilvusClientV2.connect(MilvusClientV2.java:152)
    at io.milvus.v2.client.MilvusClientV2.connect(MilvusClientV2.java:106)
    at io.milvus.v2.client.MilvusClientV2.<init>(MilvusClientV2.java:85)
    at com.apecloud.dbtester.tester.MilvusTester.connect(MilvusTester.java:48)
    at com.apecloud.dbtester.tester.MilvusTester.connectionStress(MilvusTester.java:221)
    at com.apecloud.dbtester.commons.TestExecutor.executeTest(TestExecutor.java:37)
    at OneClient.executeTest(OneClient.java:108)
    at OneClient.main(OneClient.java:40)
`kubectl patch -p '{"metadata":{"finalizers":[]}}' --type=merge pods test-db-client-connectionstress-milvus-mgvllt --namespace ns-niaem ` pod/test-db-client-connectionstress-milvus-mgvllt patched (no change) Warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely. pod "test-db-client-connectionstress-milvus-mgvllt" force deleted check failover pod name failover pod name:milvus-mgvllt-milvus-0 failover connectionstress Success No resources found in ns-niaem namespace. check db_client batch data count `echo "curl -s -H 'Content-Type: application/json' -X POST http://milvus-mgvllt-proxy.ns-niaem.svc.cluster.local:19530/v1/vector/query -d '{\"collectionName\":\"executions_loop_collection\",\"filter\":\"id == 20564\",\"limit\":0,\"outputFields\":[\"id\"]}' " | kubectl exec -it milvus-mgvllt-milvus-0 --namespace ns-niaem -- bash` check db_client batch data Success check component etcd exists `kubectl get components -l app.kubernetes.io/instance=milvus-mgvllt,apps.kubeblocks.io/component-name=etcd --namespace ns-niaem | (grep "etcd" || true )` `kubectl get pvc -l app.kubernetes.io/instance=milvus-mgvllt,apps.kubeblocks.io/component-name=etcd,apps.kubeblocks.io/vct-name=data --namespace ns-niaem ` cluster volume-expand check cluster status before ops check cluster status done cluster_status:Running No resources found in milvus-mgvllt namespace.
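The connection-stress step above opens 4096 client connections within 60 s; in this run the pod completed even though the driver hit `UNAVAILABLE: Network closed for unknown reason`, and the harness records the failover check as a pass. A minimal sketch of that stress pattern, with the Milvus client replaced by an injected `connect` callable (the helper and its parameters are ours, not the dbclient's internals):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def connection_stress(connect, connections=4096, duration=60, workers=64):
    """Open `connections` sessions via connect() within `duration` seconds.
    Returns a (succeeded, failed) tuple."""
    deadline = time.monotonic() + duration

    def attempt(_):
        try:
            conn = connect()  # in the real test: a new Milvus client session
            close = getattr(conn, "close", None)
            if callable(close):
                close()
            return True
        except Exception:
            return False

    ok = err = 0
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for success in pool.map(attempt, range(connections)):
            ok += success
            err += not success
            if time.monotonic() > deadline:
                break  # stop counting once the time window closes
    return ok, err
```

In the real test `connect` constructs a `MilvusClientV2` against the proxy service; saturating the proxy's gRPC accept path at this concurrency is one plausible source of the `UNAVAILABLE` error above.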
`kbcli cluster volume-expand milvus-mgvllt --auto-approve --force=true --components etcd --volume-claim-templates data --storage 10Gi --namespace ns-niaem ` OpsRequest milvus-mgvllt-volumeexpansion-ccbw4 created successfully, you can view the progress: kbcli cluster describe-ops milvus-mgvllt-volumeexpansion-ccbw4 -n ns-niaem check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-volumeexpansion-ccbw4 ns-niaem VolumeExpansion milvus-mgvllt etcd Running 0/1 Sep 11,2025 17:51 UTC+0800 check cluster status `kbcli cluster list milvus-mgvllt --show-labels --namespace ns-niaem ` NAME NAMESPACE CLUSTER-DEFINITION TERMINATION-POLICY STATUS CREATED-TIME LABELS milvus-mgvllt ns-niaem milvus DoNotTerminate Updating Sep 11,2025 17:37 UTC+0800 app.kubernetes.io/instance=milvus-mgvllt,clusterdefinition.kubeblocks.io/name=milvus cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating check cluster status done cluster_status:Running check pod status `kbcli cluster list-instances milvus-mgvllt --namespace ns-niaem ` NAME NAMESPACE CLUSTER COMPONENT STATUS ROLE ACCESSMODE AZ CPU(REQUEST/LIMIT) 
MEMORY(REQUEST/LIMIT) STORAGE NODE CREATED-TIME milvus-mgvllt-etcd-0 ns-niaem milvus-mgvllt etcd Running leader 0 500m / 500m 512Mi / 512Mi data:10Gi aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 17:37 UTC+0800 milvus-mgvllt-milvus-0 ns-niaem milvus-mgvllt milvus Running 0 500m / 500m 512Mi / 512Mi data:5Gi aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 17:41 UTC+0800 milvus-mgvllt-minio-0 ns-niaem milvus-mgvllt minio Running 0 500m / 500m 512Mi / 512Mi data:5Gi aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 17:37 UTC+0800 check pod status done connect unsupported engine type: milvus milvus-standalone-1.0.1 No resources found in milvus-mgvllt namespace. check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-volumeexpansion-ccbw4 ns-niaem VolumeExpansion milvus-mgvllt etcd Succeed 1/1 Sep 11,2025 17:51 UTC+0800 check ops status done ops_status:milvus-mgvllt-volumeexpansion-ccbw4 ns-niaem VolumeExpansion milvus-mgvllt etcd Succeed 1/1 Sep 11,2025 17:51 UTC+0800 `kubectl patch -p '{"metadata":{"finalizers":[]}}' --type=merge opsrequests.operations milvus-mgvllt-volumeexpansion-ccbw4 --namespace ns-niaem ` opsrequest.operations.kubeblocks.io/milvus-mgvllt-volumeexpansion-ccbw4 patched `kbcli cluster delete-ops --name milvus-mgvllt-volumeexpansion-ccbw4 --force --auto-approve --namespace ns-niaem ` OpsRequest milvus-mgvllt-volumeexpansion-ccbw4 deleted No resources found in ns-niaem namespace.
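Every OpsRequest teardown above follows the same pattern: a JSON merge patch empties `metadata.finalizers` so the subsequent force delete cannot hang on finalizer processing. A sketch of that patch command as a Python argv builder (stdlib only; the helper name is ours, not kbcli's):

```python
import json

def clear_finalizers_cmd(resource, name, namespace):
    """argv for the merge patch used here before force-deleting a resource."""
    patch = json.dumps({"metadata": {"finalizers": []}})
    return ["kubectl", "patch", "-p", patch, "--type=merge",
            resource, name, "--namespace", namespace]
```

`clear_finalizers_cmd("opsrequests.operations", "milvus-mgvllt-volumeexpansion-ccbw4", "ns-niaem")` reproduces the command above (run it via `subprocess.run` in practice). Note that `--type=merge` replaces arrays wholesale, which is exactly what clearing a finalizer list needs.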
check db_client batch data count `echo "curl -s -H 'Content-Type: application/json' -X POST http://milvus-mgvllt-proxy.ns-niaem.svc.cluster.local:19530/v1/vector/query -d '{\"collectionName\":\"executions_loop_collection\",\"filter\":\"id == 20564\",\"limit\":0,\"outputFields\":[\"id\"]}' " | kubectl exec -it milvus-mgvllt-milvus-0 --namespace ns-niaem -- bash` check db_client batch data Success check component etcd exists `kubectl get components -l app.kubernetes.io/instance=milvus-mgvllt,apps.kubeblocks.io/component-name=etcd --namespace ns-niaem | (grep "etcd" || true )` check cluster status before ops check cluster status done cluster_status:Running `kbcli cluster vscale milvus-mgvllt --auto-approve --force=true --components etcd --cpu 600m --memory 0.6Gi --namespace ns-niaem ` OpsRequest milvus-mgvllt-verticalscaling-4pcpc created successfully, you can view the progress: kbcli cluster describe-ops milvus-mgvllt-verticalscaling-4pcpc -n ns-niaem check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-verticalscaling-4pcpc ns-niaem VerticalScaling milvus-mgvllt etcd Creating -/- Sep 11,2025 18:06 UTC+0800 check cluster status `kbcli cluster list milvus-mgvllt --show-labels --namespace ns-niaem ` NAME NAMESPACE CLUSTER-DEFINITION TERMINATION-POLICY STATUS CREATED-TIME LABELS milvus-mgvllt ns-niaem milvus DoNotTerminate Updating Sep 11,2025 17:37 UTC+0800 app.kubernetes.io/instance=milvus-mgvllt,clusterdefinition.kubeblocks.io/name=milvus cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating check cluster status done cluster_status:Running check pod status `kbcli cluster list-instances
milvus-mgvllt --namespace ns-niaem ` NAME NAMESPACE CLUSTER COMPONENT STATUS ROLE ACCESSMODE AZ CPU(REQUEST/LIMIT) MEMORY(REQUEST/LIMIT) STORAGE NODE CREATED-TIME milvus-mgvllt-etcd-0 ns-niaem milvus-mgvllt etcd Running leader 0 600m / 600m 644245094400m / 644245094400m data:10Gi aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:06 UTC+0800 milvus-mgvllt-milvus-0 ns-niaem milvus-mgvllt milvus Running 0 500m / 500m 512Mi / 512Mi data:5Gi aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 17:41 UTC+0800 milvus-mgvllt-minio-0 ns-niaem milvus-mgvllt minio Running 0 500m / 500m 512Mi / 512Mi data:5Gi aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 17:37 UTC+0800 check pod status done connect unsupported engine type: milvus milvus-standalone-1.0.1 check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-verticalscaling-4pcpc ns-niaem VerticalScaling milvus-mgvllt etcd Succeed 1/1 Sep 11,2025 18:06 UTC+0800 check ops status done ops_status:milvus-mgvllt-verticalscaling-4pcpc ns-niaem VerticalScaling milvus-mgvllt etcd Succeed 1/1 Sep 11,2025 18:06 UTC+0800 `kubectl patch -p '{"metadata":{"finalizers":[]}}' --type=merge opsrequests.operations milvus-mgvllt-verticalscaling-4pcpc --namespace ns-niaem ` opsrequest.operations.kubeblocks.io/milvus-mgvllt-verticalscaling-4pcpc patched `kbcli cluster delete-ops --name milvus-mgvllt-verticalscaling-4pcpc --force --auto-approve --namespace ns-niaem ` OpsRequest milvus-mgvllt-verticalscaling-4pcpc deleted No resources found in ns-niaem namespace.
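After the vscale to `--memory 0.6Gi`, `list-instances` reports memory as `644245094400m / 644245094400m`: 0.6Gi is not a whole number of bytes, so Kubernetes canonicalizes the quantity into milli-units (thousandths of a byte). The arithmetic, kept exact with `Fraction`:

```python
from fractions import Fraction

def gi_to_milli(gi):
    """Gi quantity given as a decimal string -> Kubernetes milli-units of bytes."""
    return int(Fraction(gi) * 1024**3 * 1000)
```

`gi_to_milli("0.6")` gives `644245094400`, matching the table above; 0.5Gi is exactly 512Mi, which is why the unscaled components still display `512Mi`.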
check db_client batch data count `echo "curl -s -H 'Content-Type: application/json' -X POST http://milvus-mgvllt-proxy.ns-niaem.svc.cluster.local:19530/v1/vector/query -d '{\"collectionName\":\"executions_loop_collection\",\"filter\":\"id == 20564\",\"limit\":0,\"outputFields\":[\"id\"]}' " | kubectl exec -it milvus-mgvllt-milvus-0 --namespace ns-niaem -- bash` check db_client batch data Success `kubectl get pvc -l app.kubernetes.io/instance=milvus-mgvllt,apps.kubeblocks.io/component-name=milvus,apps.kubeblocks.io/vct-name=data --namespace ns-niaem ` cluster volume-expand check cluster status before ops check cluster status done cluster_status:Running No resources found in milvus-mgvllt namespace. `kbcli cluster volume-expand milvus-mgvllt --auto-approve --force=true --components milvus --volume-claim-templates data --storage 9Gi --namespace ns-niaem ` OpsRequest milvus-mgvllt-volumeexpansion-dfk8h created successfully, you can view the progress: kbcli cluster describe-ops milvus-mgvllt-volumeexpansion-dfk8h -n ns-niaem check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-volumeexpansion-dfk8h ns-niaem VolumeExpansion milvus-mgvllt milvus Running -/- Sep 11,2025 18:07 UTC+0800 check cluster status `kbcli cluster list milvus-mgvllt --show-labels --namespace ns-niaem ` NAME NAMESPACE CLUSTER-DEFINITION TERMINATION-POLICY STATUS CREATED-TIME LABELS milvus-mgvllt ns-niaem milvus DoNotTerminate Updating Sep 11,2025 17:37 UTC+0800 app.kubernetes.io/instance=milvus-mgvllt,clusterdefinition.kubeblocks.io/name=milvus cluster_status:Updating cluster_status:Updating check cluster status done cluster_status:Running check pod status `kbcli cluster list-instances milvus-mgvllt --namespace ns-niaem ` NAME NAMESPACE CLUSTER COMPONENT STATUS ROLE ACCESSMODE AZ CPU(REQUEST/LIMIT) MEMORY(REQUEST/LIMIT) STORAGE NODE CREATED-TIME milvus-mgvllt-etcd-0 ns-niaem
milvus-mgvllt etcd Running leader 0 600m / 600m 644245094400m / 644245094400m data:10Gi aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:06 UTC+0800 milvus-mgvllt-milvus-0 ns-niaem milvus-mgvllt milvus Running 0 500m / 500m 512Mi / 512Mi data:9Gi aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 17:41 UTC+0800 milvus-mgvllt-minio-0 ns-niaem milvus-mgvllt minio Running 0 500m / 500m 512Mi / 512Mi data:5Gi aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 17:37 UTC+0800 check pod status done connect unsupported engine type: milvus milvus-standalone-1.0.1 No resources found in milvus-mgvllt namespace. check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-volumeexpansion-dfk8h ns-niaem VolumeExpansion milvus-mgvllt milvus Succeed 1/1 Sep 11,2025 18:07 UTC+0800 check ops status done ops_status:milvus-mgvllt-volumeexpansion-dfk8h ns-niaem VolumeExpansion milvus-mgvllt milvus Succeed 1/1 Sep 11,2025 18:07 UTC+0800 `kubectl patch -p '{"metadata":{"finalizers":[]}}' --type=merge opsrequests.operations milvus-mgvllt-volumeexpansion-dfk8h --namespace ns-niaem ` opsrequest.operations.kubeblocks.io/milvus-mgvllt-volumeexpansion-dfk8h patched `kbcli cluster delete-ops --name milvus-mgvllt-volumeexpansion-dfk8h --force --auto-approve --namespace ns-niaem ` OpsRequest milvus-mgvllt-volumeexpansion-dfk8h deleted No resources found in ns-niaem namespace.
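`kbcli cluster volume-expand` is sugar for creating a `VolumeExpansion` OpsRequest like `milvus-mgvllt-volumeexpansion-dfk8h` above. A sketch of the equivalent manifest as a plain dict; the field names follow the `operations.kubeblocks.io/v1alpha1` schema as we understand it for KubeBlocks 1.0, so treat the exact shape as an assumption and verify with `kubectl explain opsrequest.spec`:

```python
def volume_expansion_ops(cluster, component, vct, storage, namespace):
    """Assumed shape of the OpsRequest kbcli generates for volume expansion."""
    return {
        "apiVersion": "operations.kubeblocks.io/v1alpha1",
        "kind": "OpsRequest",
        "metadata": {"generateName": f"{cluster}-volumeexpansion-",
                     "namespace": namespace},
        "spec": {
            "clusterName": cluster,
            "type": "VolumeExpansion",
            "volumeExpansion": [{
                "componentName": component,
                # grow the PVCs stamped out of this volume claim template
                "volumeClaimTemplates": [{"name": vct, "storage": storage}],
            }],
        },
    }
```

`volume_expansion_ops("milvus-mgvllt", "milvus", "data", "9Gi", "ns-niaem")` corresponds to the expansion above; note PVCs can only grow, which is why each expansion here targets a larger size.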
check db_client batch data count `echo "curl -s -H 'Content-Type: application/json' -X POST http://milvus-mgvllt-proxy.ns-niaem.svc.cluster.local:19530/v1/vector/query -d '{\"collectionName\":\"executions_loop_collection\",\"filter\":\"id == 20564\",\"limit\":0,\"outputFields\":[\"id\"]}' " | kubectl exec -it milvus-mgvllt-milvus-0 --namespace ns-niaem -- bash` check db_client batch data Success skip cluster HorizontalScaling Out cluster restart check cluster status before ops check cluster status done cluster_status:Running `kbcli cluster restart milvus-mgvllt --auto-approve --force=true --components milvus --namespace ns-niaem ` OpsRequest milvus-mgvllt-restart-4hwfs created successfully, you can view the progress: kbcli cluster describe-ops milvus-mgvllt-restart-4hwfs -n ns-niaem check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-restart-4hwfs ns-niaem Restart milvus-mgvllt milvus Running 0/1 Sep 11,2025 18:08 UTC+0800 check cluster status `kbcli cluster list milvus-mgvllt --show-labels --namespace ns-niaem ` NAME NAMESPACE CLUSTER-DEFINITION TERMINATION-POLICY STATUS CREATED-TIME LABELS milvus-mgvllt ns-niaem milvus DoNotTerminate Updating Sep 11,2025 17:37 UTC+0800 app.kubernetes.io/instance=milvus-mgvllt,clusterdefinition.kubeblocks.io/name=milvus cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating
cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating check cluster status done cluster_status:Running check pod status `kbcli cluster list-instances milvus-mgvllt --namespace ns-niaem ` NAME NAMESPACE CLUSTER COMPONENT STATUS ROLE ACCESSMODE AZ CPU(REQUEST/LIMIT) MEMORY(REQUEST/LIMIT) STORAGE NODE CREATED-TIME milvus-mgvllt-etcd-0 ns-niaem milvus-mgvllt etcd Running leader 0 600m / 600m 644245094400m / 644245094400m data:10Gi aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:06 UTC+0800 milvus-mgvllt-milvus-0 ns-niaem milvus-mgvllt milvus Running 0 500m / 500m 512Mi / 512Mi data:9Gi aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:08 UTC+0800 milvus-mgvllt-minio-0 ns-niaem milvus-mgvllt minio Running 0 500m / 500m 512Mi / 512Mi data:5Gi aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 17:37 UTC+0800 check pod status done connect unsupported engine type: milvus milvus-standalone-1.0.1 check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-restart-4hwfs ns-niaem Restart milvus-mgvllt milvus Succeed 1/1 Sep 11,2025 18:08 UTC+0800 check ops status done ops_status:milvus-mgvllt-restart-4hwfs ns-niaem Restart milvus-mgvllt milvus Succeed 1/1 Sep 11,2025 18:08 UTC+0800 `kubectl patch -p '{"metadata":{"finalizers":[]}}' --type=merge opsrequests.operations milvus-mgvllt-restart-4hwfs --namespace ns-niaem ` opsrequest.operations.kubeblocks.io/milvus-mgvllt-restart-4hwfs patched `kbcli cluster delete-ops --name milvus-mgvllt-restart-4hwfs --force --auto-approve --namespace ns-niaem ` OpsRequest milvus-mgvllt-restart-4hwfs deleted No resources found in ns-niaem namespace.
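The recurring `check db_client batch data` step above curls the Milvus RESTful v1 endpoint on the proxy to confirm the seeded row survived each operation. A sketch building the same URL and request body (values taken from the log; only the helper itself is ours):

```python
import json

def milvus_v1_query(host, collection, expr, output_fields, port=19530, limit=0):
    """URL and JSON body for POST /v1/vector/query on a Milvus proxy."""
    url = f"http://{host}:{port}/v1/vector/query"
    body = json.dumps({
        "collectionName": collection,
        "filter": expr,           # boolean expression, e.g. 'id == 20564'
        "limit": limit,
        "outputFields": output_fields,
    })
    return url, body
```

`milvus_v1_query("milvus-mgvllt-proxy.ns-niaem.svc.cluster.local", "executions_loop_collection", "id == 20564", ["id"])` reproduces the curl invocation used throughout this run.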
check db_client batch data count `echo "curl -s -H 'Content-Type: application/json' -X POST http://milvus-mgvllt-proxy.ns-niaem.svc.cluster.local:19530/v1/vector/query -d '{\"collectionName\":\"executions_loop_collection\",\"filter\":\"id == 20564\",\"limit\":0,\"outputFields\":[\"id\"]}' " | kubectl exec -it milvus-mgvllt-milvus-0 --namespace ns-niaem -- bash` check db_client batch data Success check cluster status before ops check cluster status done cluster_status:Running `kbcli cluster vscale milvus-mgvllt --auto-approve --force=true --components milvus --cpu 600m --memory 0.6Gi --namespace ns-niaem ` OpsRequest milvus-mgvllt-verticalscaling-9zb7t created successfully, you can view the progress: kbcli cluster describe-ops milvus-mgvllt-verticalscaling-9zb7t -n ns-niaem check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-verticalscaling-9zb7t ns-niaem VerticalScaling milvus-mgvllt milvus Running 0/1 Sep 11,2025 18:11 UTC+0800 check cluster status `kbcli cluster list milvus-mgvllt --show-labels --namespace ns-niaem ` NAME NAMESPACE CLUSTER-DEFINITION TERMINATION-POLICY STATUS CREATED-TIME LABELS milvus-mgvllt ns-niaem milvus DoNotTerminate Updating Sep 11,2025 17:37 UTC+0800 app.kubernetes.io/instance=milvus-mgvllt,clusterdefinition.kubeblocks.io/name=milvus cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating check cluster status done cluster_status:Running check pod status `kbcli cluster list-instances milvus-mgvllt --namespace ns-niaem ` NAME NAMESPACE CLUSTER COMPONENT STATUS ROLE ACCESSMODE AZ CPU(REQUEST/LIMIT)
MEMORY(REQUEST/LIMIT) STORAGE NODE CREATED-TIME milvus-mgvllt-etcd-0 ns-niaem milvus-mgvllt etcd Running leader 0 600m / 600m 644245094400m / 644245094400m data:10Gi aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:06 UTC+0800 milvus-mgvllt-milvus-0 ns-niaem milvus-mgvllt milvus Running 0 600m / 600m 644245094400m / 644245094400m data:9Gi aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:11 UTC+0800 milvus-mgvllt-minio-0 ns-niaem milvus-mgvllt minio Running 0 500m / 500m 512Mi / 512Mi data:5Gi aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 17:37 UTC+0800 check pod status done connect unsupported engine type: milvus milvus-standalone-1.0.1 check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-verticalscaling-9zb7t ns-niaem VerticalScaling milvus-mgvllt milvus Succeed 1/1 Sep 11,2025 18:11 UTC+0800 check ops status done ops_status:milvus-mgvllt-verticalscaling-9zb7t ns-niaem VerticalScaling milvus-mgvllt milvus Succeed 1/1 Sep 11,2025 18:11 UTC+0800 `kubectl patch -p '{"metadata":{"finalizers":[]}}' --type=merge opsrequests.operations milvus-mgvllt-verticalscaling-9zb7t --namespace ns-niaem ` opsrequest.operations.kubeblocks.io/milvus-mgvllt-verticalscaling-9zb7t patched `kbcli cluster delete-ops --name milvus-mgvllt-verticalscaling-9zb7t --force --auto-approve --namespace ns-niaem ` OpsRequest milvus-mgvllt-verticalscaling-9zb7t deleted No resources found in ns-niaem namespace.
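Every `check cluster status` / `check ops status` block above is the same wait loop: poll until the reported phase reaches the target (`Running`, `Succeed`, `Stopped`). A sketch with the status source and sleep injected, so the actual kubectl/kbcli calls stay out of scope:

```python
import time

def wait_for_status(get_status, want, timeout=600, interval=5, sleep=time.sleep):
    """Poll get_status() until it returns `want`; raise TimeoutError otherwise."""
    waited = 0.0
    while True:
        status = get_status()
        if status == want:
            return status
        if waited >= timeout:
            raise TimeoutError(f"status still {status!r} after {timeout}s")
        sleep(interval)
        waited += interval
```

In this run `get_status` would shell out to `kbcli cluster list` (or `list-ops`) and parse the STATUS column; each repeated `cluster_status:Updating` line above is one iteration of such a loop.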
check db_client batch data count `echo "curl -s -H 'Content-Type: application/json' -X POST http://milvus-mgvllt-proxy.ns-niaem.svc.cluster.local:19530/v1/vector/query -d '{\"collectionName\":\"executions_loop_collection\",\"filter\":\"id == 20564\",\"limit\":0,\"outputFields\":[\"id\"]}' " | kubectl exec -it milvus-mgvllt-milvus-0 --namespace ns-niaem -- bash` check db_client batch data Success check component minio exists `kubectl get components -l app.kubernetes.io/instance=milvus-mgvllt,apps.kubeblocks.io/component-name=minio --namespace ns-niaem | (grep "minio" || true )` `kubectl get pvc -l app.kubernetes.io/instance=milvus-mgvllt,apps.kubeblocks.io/component-name=minio,apps.kubeblocks.io/vct-name=data --namespace ns-niaem ` cluster volume-expand check cluster status before ops check cluster status done cluster_status:Running No resources found in milvus-mgvllt namespace. `kbcli cluster volume-expand milvus-mgvllt --auto-approve --force=true --components minio --volume-claim-templates data --storage 10Gi --namespace ns-niaem ` OpsRequest milvus-mgvllt-volumeexpansion-6j7bh created successfully, you can view the progress: kbcli cluster describe-ops milvus-mgvllt-volumeexpansion-6j7bh -n ns-niaem check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-volumeexpansion-6j7bh ns-niaem VolumeExpansion milvus-mgvllt minio Running 0/1 Sep 11,2025 18:12 UTC+0800 check cluster status `kbcli cluster list milvus-mgvllt --show-labels --namespace ns-niaem ` NAME NAMESPACE CLUSTER-DEFINITION TERMINATION-POLICY STATUS CREATED-TIME LABELS milvus-mgvllt ns-niaem milvus DoNotTerminate Updating Sep 11,2025 17:37 UTC+0800 app.kubernetes.io/instance=milvus-mgvllt,clusterdefinition.kubeblocks.io/name=milvus cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating
cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating check cluster status done cluster_status:Running check pod status `kbcli cluster list-instances milvus-mgvllt --namespace ns-niaem ` NAME NAMESPACE CLUSTER COMPONENT STATUS ROLE ACCESSMODE AZ CPU(REQUEST/LIMIT) MEMORY(REQUEST/LIMIT) STORAGE NODE CREATED-TIME milvus-mgvllt-etcd-0 ns-niaem milvus-mgvllt etcd Running leader 0 600m / 600m 644245094400m / 644245094400m data:10Gi aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:06 UTC+0800 milvus-mgvllt-milvus-0 ns-niaem milvus-mgvllt milvus Running 0 600m / 600m 644245094400m / 644245094400m data:9Gi aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:11 UTC+0800 milvus-mgvllt-minio-0 ns-niaem milvus-mgvllt minio Running 0 500m / 500m 512Mi / 512Mi data:10Gi aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 17:37 UTC+0800 check pod status done connect unsupported engine type: milvus milvus-standalone-1.0.1 No resources found in milvus-mgvllt namespace. 
check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-volumeexpansion-6j7bh ns-niaem VolumeExpansion milvus-mgvllt minio Succeed 1/1 Sep 11,2025 18:12 UTC+0800 check ops status done ops_status:milvus-mgvllt-volumeexpansion-6j7bh ns-niaem VolumeExpansion milvus-mgvllt minio Succeed 1/1 Sep 11,2025 18:12 UTC+0800 `kubectl patch -p '{"metadata":{"finalizers":[]}}' --type=merge opsrequests.operations milvus-mgvllt-volumeexpansion-6j7bh --namespace ns-niaem ` opsrequest.operations.kubeblocks.io/milvus-mgvllt-volumeexpansion-6j7bh patched `kbcli cluster delete-ops --name milvus-mgvllt-volumeexpansion-6j7bh --force --auto-approve --namespace ns-niaem ` OpsRequest milvus-mgvllt-volumeexpansion-6j7bh deleted No resources found in ns-niaem namespace. check db_client batch data count `echo "curl -s -H 'Content-Type: application/json' -X POST http://milvus-mgvllt-proxy.ns-niaem.svc.cluster.local:19530/v1/vector/query -d '{\"collectionName\":\"executions_loop_collection\",\"filter\":\"id == 20564\",\"limit\":0,\"outputFields\":[\"id\"]}' " | kubectl exec -it milvus-mgvllt-milvus-0 --namespace ns-niaem -- bash` check db_client batch data Success 1 cluster stop check cluster status before ops check cluster status done cluster_status:Running `kbcli cluster stop milvus-mgvllt --auto-approve --force=true --namespace ns-niaem ` OpsRequest milvus-mgvllt-stop-dxfxt created successfully, you can view the progress: kbcli cluster describe-ops milvus-mgvllt-stop-dxfxt -n ns-niaem check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-stop-dxfxt ns-niaem Stop milvus-mgvllt etcd,milvus,minio Running 0/3 Sep 11,2025 18:19 UTC+0800 check cluster status `kbcli cluster list milvus-mgvllt --show-labels --namespace ns-niaem ` NAME NAMESPACE
CLUSTER-DEFINITION TERMINATION-POLICY STATUS CREATED-TIME LABELS milvus-mgvllt ns-niaem milvus DoNotTerminate Stopping Sep 11,2025 17:37 UTC+0800 app.kubernetes.io/instance=milvus-mgvllt,clusterdefinition.kubeblocks.io/name=milvus cluster_status:Stopping cluster_status:Stopping cluster_status:Stopping cluster_status:Stopping cluster_status:Stopping check cluster status done cluster_status:Stopped check pod status `kbcli cluster list-instances milvus-mgvllt --namespace ns-niaem ` NAME NAMESPACE CLUSTER COMPONENT STATUS ROLE ACCESSMODE AZ CPU(REQUEST/LIMIT) MEMORY(REQUEST/LIMIT) STORAGE NODE CREATED-TIME check pod status done check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-stop-dxfxt ns-niaem Stop milvus-mgvllt etcd,milvus,minio Succeed 3/3 Sep 11,2025 18:19 UTC+0800 check ops status done ops_status:milvus-mgvllt-stop-dxfxt ns-niaem Stop milvus-mgvllt etcd,milvus,minio Succeed 3/3 Sep 11,2025 18:19 UTC+0800 `kubectl patch -p '{"metadata":{"finalizers":[]}}' --type=merge opsrequests.operations milvus-mgvllt-stop-dxfxt --namespace ns-niaem ` opsrequest.operations.kubeblocks.io/milvus-mgvllt-stop-dxfxt patched `kbcli cluster delete-ops --name milvus-mgvllt-stop-dxfxt --force --auto-approve --namespace ns-niaem ` OpsRequest milvus-mgvllt-stop-dxfxt deleted cluster start check cluster status before ops check cluster status done cluster_status:Stopped `kbcli cluster start milvus-mgvllt --force=true --namespace ns-niaem ` OpsRequest milvus-mgvllt-start-qcpj9 created successfully, you can view the progress: kbcli cluster describe-ops milvus-mgvllt-start-qcpj9 -n ns-niaem check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-start-qcpj9 ns-niaem Start milvus-mgvllt Running -/- Sep 11,2025 18:19 UTC+0800 check cluster status `kbcli
cluster list milvus-mgvllt --show-labels --namespace ns-niaem ` NAME NAMESPACE CLUSTER-DEFINITION TERMINATION-POLICY STATUS CREATED-TIME LABELS milvus-mgvllt ns-niaem milvus DoNotTerminate Updating Sep 11,2025 17:37 UTC+0800 app.kubernetes.io/instance=milvus-mgvllt,clusterdefinition.kubeblocks.io/name=milvus cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating check cluster status done cluster_status:Running check pod status `kbcli cluster list-instances milvus-mgvllt --namespace ns-niaem ` NAME NAMESPACE CLUSTER COMPONENT STATUS ROLE ACCESSMODE AZ CPU(REQUEST/LIMIT) MEMORY(REQUEST/LIMIT) STORAGE NODE CREATED-TIME milvus-mgvllt-etcd-0 ns-niaem milvus-mgvllt etcd Running leader 0 600m / 600m 644245094400m / 644245094400m data:10Gi aks-cicdamdpool-12089392-vmss000000/10.224.0.5 Sep 11,2025 18:19 UTC+0800 milvus-mgvllt-milvus-0 ns-niaem milvus-mgvllt milvus Running 0 600m / 600m 644245094400m / 644245094400m data:9Gi aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:19 UTC+0800 milvus-mgvllt-minio-0 ns-niaem milvus-mgvllt minio Running 0 500m / 500m 512Mi / 512Mi data:10Gi aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:19 UTC+0800 check pod status done connect unsupported engine type: milvus milvus-standalone-1.0.1 check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-start-qcpj9 ns-niaem Start milvus-mgvllt etcd,milvus,minio Succeed 3/3 Sep 11,2025 18:19 UTC+0800 check ops status done ops_status:milvus-mgvllt-start-qcpj9 ns-niaem Start milvus-mgvllt etcd,milvus,minio Succeed 3/3 Sep 11,2025 18:19 UTC+0800 `kubectl patch -p '{"metadata":{"finalizers":[]}}' --type=merge opsrequests.operations milvus-mgvllt-start-qcpj9 --namespace ns-niaem ` opsrequest.operations.kubeblocks.io/milvus-mgvllt-start-qcpj9 patched `kbcli cluster delete-ops --name milvus-mgvllt-start-qcpj9 --force
--auto-approve --namespace ns-niaem ` OpsRequest milvus-mgvllt-start-qcpj9 deleted No resources found in ns-niaem namespace. check db_client batch data count `echo "curl -s -H 'Content-Type: application/json' -X POST http://milvus-mgvllt-proxy.ns-niaem.svc.cluster.local:19530/v1/vector/query -d '{\"collectionName\":\"executions_loop_collection\",\"filter\":\"id == 20564\",\"limit\":0,\"outputFields\":[\"id\"]}' " | kubectl exec -it milvus-mgvllt-milvus-0 --namespace ns-niaem -- bash` check db_client batch data Success check component minio exists `kubectl get components -l app.kubernetes.io/instance=milvus-mgvllt,apps.kubeblocks.io/component-name=minio --namespace ns-niaem | (grep "minio" || true )` check cluster status before ops check cluster status done cluster_status:Running `kbcli cluster vscale milvus-mgvllt --auto-approve --force=true --components minio --cpu 600m --memory 0.6Gi --namespace ns-niaem ` OpsRequest milvus-mgvllt-verticalscaling-79dg9 created successfully, you can view the progress: kbcli cluster describe-ops milvus-mgvllt-verticalscaling-79dg9 -n ns-niaem check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-verticalscaling-79dg9 ns-niaem VerticalScaling milvus-mgvllt minio Running 0/1 Sep 11,2025 18:21 UTC+0800 check cluster status `kbcli cluster list milvus-mgvllt --show-labels --namespace ns-niaem ` NAME NAMESPACE CLUSTER-DEFINITION TERMINATION-POLICY STATUS CREATED-TIME LABELS milvus-mgvllt ns-niaem milvus DoNotTerminate Updating Sep 11,2025 17:37 UTC+0800 app.kubernetes.io/instance=milvus-mgvllt,clusterdefinition.kubeblocks.io/name=milvus cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating check cluster status done cluster_status:Running check pod status `kbcli cluster
list-instances milvus-mgvllt --namespace ns-niaem ` NAME NAMESPACE CLUSTER COMPONENT STATUS ROLE ACCESSMODE AZ CPU(REQUEST/LIMIT) MEMORY(REQUEST/LIMIT) STORAGE NODE CREATED-TIME milvus-mgvllt-etcd-0 ns-niaem milvus-mgvllt etcd Running leader 0 600m / 600m 644245094400m / 644245094400m data:10Gi aks-cicdamdpool-12089392-vmss000000/10.224.0.5 Sep 11,2025 18:19 UTC+0800 milvus-mgvllt-milvus-0 ns-niaem milvus-mgvllt milvus Running 0 600m / 600m 644245094400m / 644245094400m data:9Gi aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:19 UTC+0800 milvus-mgvllt-minio-0 ns-niaem milvus-mgvllt minio Running 0 600m / 600m 644245094400m / 644245094400m data:10Gi aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 18:21 UTC+0800 check pod status done connect unsupported engine type: milvus milvus-standalone-1.0.1 check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-verticalscaling-79dg9 ns-niaem VerticalScaling milvus-mgvllt minio Succeed 1/1 Sep 11,2025 18:21 UTC+0800 check ops status done ops_status:milvus-mgvllt-verticalscaling-79dg9 ns-niaem VerticalScaling milvus-mgvllt minio Succeed 1/1 Sep 11,2025 18:21 UTC+0800 `kubectl patch -p '{"metadata":{"finalizers":[]}}' --type=merge opsrequests.operations milvus-mgvllt-verticalscaling-79dg9 --namespace ns-niaem ` opsrequest.operations.kubeblocks.io/milvus-mgvllt-verticalscaling-79dg9 patched `kbcli cluster delete-ops --name milvus-mgvllt-verticalscaling-79dg9 --force --auto-approve --namespace ns-niaem ` OpsRequest milvus-mgvllt-verticalscaling-79dg9 deleted No resources found in ns-niaem namespace.
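The stop/start cycle earlier (`milvus-mgvllt-stop-dxfxt`, `milvus-mgvllt-start-qcpj9`) is likewise OpsRequest-driven, with no per-component spec because Stop and Start apply to the whole cluster (hence the `etcd,milvus,minio` 3/3 progress). A sketch of those two manifests, again under the `operations.kubeblocks.io/v1alpha1` schema as we understand it (verify with `kubectl explain opsrequest.spec`):

```python
def lifecycle_ops(cluster, namespace, action):
    """Assumed shape of a cluster-wide Stop or Start OpsRequest."""
    if action not in ("Stop", "Start"):
        raise ValueError(f"unsupported action: {action}")
    return {
        "apiVersion": "operations.kubeblocks.io/v1alpha1",
        "kind": "OpsRequest",
        "metadata": {"generateName": f"{cluster}-{action.lower()}-",
                     "namespace": namespace},
        # no component list: the operation covers every component in the cluster
        "spec": {"clusterName": cluster, "type": action},
    }
```

Stop scales every component to zero replicas while keeping the PVCs, which is why the batch data is still queryable after the subsequent Start.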
check db_client batch data count `echo "curl -s -H 'Content-Type: application/json' -X POST http://milvus-mgvllt-proxy.ns-niaem.svc.cluster.local:19530/v1/vector/query -d '{\"collectionName\":\"executions_loop_collection\",\"filter\":\"id == 20564\",\"limit\":0,\"outputFields\":[\"id\"]}' " | kubectl exec -it milvus-mgvllt-milvus-0 --namespace ns-niaem -- bash` check db_client batch data Success cluster restart check cluster status before ops check cluster status done cluster_status:Running `kbcli cluster restart milvus-mgvllt --auto-approve --force=true --namespace ns-niaem ` OpsRequest milvus-mgvllt-restart-n57k2 created successfully, you can view the progress: kbcli cluster describe-ops milvus-mgvllt-restart-n57k2 -n ns-niaem check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-restart-n57k2 ns-niaem Restart milvus-mgvllt etcd,minio,milvus Running -/- Sep 11,2025 18:22 UTC+0800 check cluster status `kbcli cluster list milvus-mgvllt --show-labels --namespace ns-niaem ` NAME NAMESPACE CLUSTER-DEFINITION TERMINATION-POLICY STATUS CREATED-TIME LABELS milvus-mgvllt ns-niaem milvus DoNotTerminate Updating Sep 11,2025 17:37 UTC+0800 app.kubernetes.io/instance=milvus-mgvllt,clusterdefinition.kubeblocks.io/name=milvus cluster_status:Updating check cluster status done cluster_status:Running check pod status `kbcli cluster
list-instances milvus-mgvllt --namespace ns-niaem ` NAME NAMESPACE CLUSTER COMPONENT STATUS ROLE ACCESSMODE AZ CPU(REQUEST/LIMIT) MEMORY(REQUEST/LIMIT) STORAGE NODE CREATED-TIME milvus-mgvllt-etcd-0 ns-niaem milvus-mgvllt etcd Running leader 0 600m / 600m 644245094400m / 644245094400m data:10Gi aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 18:22 UTC+0800 milvus-mgvllt-milvus-0 ns-niaem milvus-mgvllt milvus Running 0 600m / 600m 644245094400m / 644245094400m data:9Gi aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:23 UTC+0800 milvus-mgvllt-minio-0 ns-niaem milvus-mgvllt minio Running 0 600m / 600m 644245094400m / 644245094400m data:10Gi aks-cicdamdpool-12089392-vmss000000/10.224.0.5 Sep 11,2025 18:22 UTC+0800 check pod status done connect unsupported engine type: milvus milvus-standalone-1.0.1 check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-restart-n57k2 ns-niaem Restart milvus-mgvllt etcd,minio,milvus Succeed 3/3 Sep 11,2025 18:22 UTC+0800 check ops status done ops_status:milvus-mgvllt-restart-n57k2 ns-niaem Restart milvus-mgvllt etcd,minio,milvus Succeed 3/3 Sep 11,2025 18:22 UTC+0800 `kubectl patch -p '{"metadata":{"finalizers":[]}}' --type=merge opsrequests.operations milvus-mgvllt-restart-n57k2 --namespace ns-niaem ` opsrequest.operations.kubeblocks.io/milvus-mgvllt-restart-n57k2 patched `kbcli cluster delete-ops --name milvus-mgvllt-restart-n57k2 --force --auto-approve --namespace ns-niaem ` OpsRequest milvus-mgvllt-restart-n57k2 deleted No resources found in ns-niaem namespace.
check db_client batch data count `echo "curl -s -H 'Content-Type: application/json' -X POST http://milvus-mgvllt-proxy.ns-niaem.svc.cluster.local:19530/v1/vector/query -d '{\"collectionName\":\"executions_loop_collection\",\"filter\":\"id == 20564\",\"limit\":0,\"outputFields\":[\"id\"]}' " | kubectl exec -it milvus-mgvllt-milvus-0 --namespace ns-niaem -- bash` check db_client batch data Success cluster update terminationPolicy WipeOut `kbcli cluster update milvus-mgvllt --termination-policy=WipeOut --namespace ns-niaem ` cluster.apps.kubeblocks.io/milvus-mgvllt updated check cluster status `kbcli cluster list milvus-mgvllt --show-labels --namespace ns-niaem ` NAME NAMESPACE CLUSTER-DEFINITION TERMINATION-POLICY STATUS CREATED-TIME LABELS milvus-mgvllt ns-niaem milvus WipeOut Running Sep 11,2025 17:37 UTC+0800 app.kubernetes.io/instance=milvus-mgvllt,clusterdefinition.kubeblocks.io/name=milvus check cluster status done cluster_status:Running check pod status `kbcli cluster list-instances milvus-mgvllt --namespace ns-niaem ` NAME NAMESPACE CLUSTER COMPONENT STATUS ROLE ACCESSMODE AZ CPU(REQUEST/LIMIT) MEMORY(REQUEST/LIMIT) STORAGE NODE CREATED-TIME milvus-mgvllt-etcd-0 ns-niaem milvus-mgvllt etcd Running leader 0 600m / 600m 644245094400m / 644245094400m data:10Gi aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 18:22 UTC+0800 milvus-mgvllt-milvus-0 ns-niaem milvus-mgvllt milvus Running 0 600m / 600m 644245094400m / 644245094400m data:9Gi aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:23 UTC+0800 milvus-mgvllt-minio-0 ns-niaem milvus-mgvllt minio Running 0 600m / 600m 644245094400m / 644245094400m data:10Gi aks-cicdamdpool-12089392-vmss000000/10.224.0.5 Sep 11,2025 18:22 UTC+0800 check pod status done connect unsupported engine type: milvus milvus-standalone-1.0.1 cluster list-logs `kbcli cluster list-logs milvus-mgvllt --namespace ns-niaem ` No log files found.
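Before each `delete-ops` above, the harness clears the finished OpsRequest's finalizers with a JSON merge patch so the force delete cannot hang on them. A self-contained sketch of that patch document, validated locally with python3 instead of being sent to a cluster (no kubectl context is assumed here):

```shell
# The merge patch the harness passes to `kubectl patch --type=merge`
# to drop a finished OpsRequest's finalizers before force-deleting it.
PATCH='{"metadata":{"finalizers":[]}}'

# Sanity-check the document: it must parse as JSON and empty the list.
echo "$PATCH" | python3 -c '
import json, sys
doc = json.load(sys.stdin)
assert doc["metadata"]["finalizers"] == []
print("patch ok")
'
```

With `--type=merge`, the empty array replaces the existing finalizer list outright, which is exactly the intent here.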
Error from server (NotFound): pods "milvus-mgvllt-milvus-0" not found cluster logs `kbcli cluster logs milvus-mgvllt --tail 30 --namespace ns-niaem ` Defaulted container "etcd" out of: etcd, kbagent, inject-bash (init), init-kbagent (init), kbagent-worker (init) {"level":"warn","ts":"2025-09-11T10:23:26.396501Z","caller":"etcdserver/v3_server.go:920","msg":"waiting for ReadIndex response took too long, retrying","sent-request-id":17092736427843027306,"retry-timeout":"500ms"} {"level":"warn","ts":"2025-09-11T10:23:26.789386Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2025-09-11T10:23:19.788631Z","time spent":"7.000747299s","remote":"10.244.2.115:58512","response type":"/etcdserverpb.Lease/LeaseGrant","request count":-1,"request size":-1,"response count":-1,"response size":-1,"request content":""} {"level":"warn","ts":"2025-09-11T10:23:26.897358Z","caller":"etcdserver/v3_server.go:920","msg":"waiting for ReadIndex response took too long, retrying","sent-request-id":17092736427843027306,"retry-timeout":"500ms"} {"level":"warn","ts":"2025-09-11T10:23:27.389207Z","caller":"etcdserver/v3_server.go:932","msg":"timed out waiting for read index response (local node might have slow network)","timeout":"7s"} {"level":"warn","ts":"2025-09-11T10:23:27.389399Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"7.000313565s","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"milvus-mgvllt/meta/session/rootcoord\" range_end:\"milvus-mgvllt/meta/session/rootcoore\" ","response":"","error":"etcdserver: request timed out"} {"level":"info","ts":"2025-09-11T10:23:27.389475Z","caller":"traceutil/trace.go:171","msg":"trace[198354813] range","detail":"{range_begin:milvus-mgvllt/meta/session/rootcoord; range_end:milvus-mgvllt/meta/session/rootcoore; }","duration":"7.000429166s","start":"2025-09-11T10:23:20.389020Z","end":"2025-09-11T10:23:27.389449Z","steps":["trace[198354813]
'agreement among raft nodes before linearized reading' (duration: 7.000309765s)"],"step_count":1} {"level":"warn","ts":"2025-09-11T10:23:27.389526Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2025-09-11T10:23:20.388985Z","time spent":"7.000527566s","remote":"10.244.2.115:58522","response type":"/etcdserverpb.KV/Range","request count":0,"request size":76,"response count":0,"response size":0,"request content":"key:\"milvus-mgvllt/meta/session/rootcoord\" range_end:\"milvus-mgvllt/meta/session/rootcoore\" "} {"level":"warn","ts":"2025-09-11T10:23:27.890044Z","caller":"etcdserver/v3_server.go:920","msg":"waiting for ReadIndex response took too long, retrying","sent-request-id":17092736427843027311,"retry-timeout":"500ms"} {"level":"warn","ts":"2025-09-11T10:23:28.390826Z","caller":"etcdserver/v3_server.go:920","msg":"waiting for ReadIndex response took too long, retrying","sent-request-id":17092736427843027311,"retry-timeout":"500ms"} {"level":"warn","ts":"2025-09-11T10:23:28.891917Z","caller":"etcdserver/v3_server.go:920","msg":"waiting for ReadIndex response took too long, retrying","sent-request-id":17092736427843027311,"retry-timeout":"500ms"} {"level":"warn","ts":"2025-09-11T10:23:29.392563Z","caller":"etcdserver/v3_server.go:920","msg":"waiting for ReadIndex response took too long, retrying","sent-request-id":17092736427843027311,"retry-timeout":"500ms"} {"level":"warn","ts":"2025-09-11T10:23:29.893707Z","caller":"etcdserver/v3_server.go:920","msg":"waiting for ReadIndex response took too long, retrying","sent-request-id":17092736427843027311,"retry-timeout":"500ms"} {"level":"warn","ts":"2025-09-11T10:23:33.573234Z","caller":"wal/wal.go:805","msg":"slow fdatasync","took":"3.658768565s","expected-duration":"1s"} {"level":"warn","ts":"2025-09-11T10:23:33.716633Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"204.002328ms","expected-duration":"100ms","prefix":"read-only
range ","request":"key:\"milvus-mgvllt/config\" range_end:\"milvus-mgvllt/confih\" serializable:true ","response":"range_response_count:0 size:5"} {"level":"info","ts":"2025-09-11T10:23:33.716708Z","caller":"traceutil/trace.go:171","msg":"trace[310868131] range","detail":"{range_begin:milvus-mgvllt/config; range_end:milvus-mgvllt/confih; response_count:0; response_revision:838; }","duration":"204.103129ms","start":"2025-09-11T10:23:33.512593Z","end":"2025-09-11T10:23:33.716696Z","steps":["trace[310868131] 'range keys from in-memory index tree' (duration: 203.983728ms)"],"step_count":1} {"level":"warn","ts":"2025-09-11T10:23:33.716858Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"143.365195ms","expected-duration":"100ms","prefix":"","request":"header: lease_grant:","response":"size:40"} {"level":"info","ts":"2025-09-11T10:23:33.716926Z","caller":"traceutil/trace.go:171","msg":"trace[33345604] linearizableReadLoop","detail":"{readStateIndex:1100; appliedIndex:1098; }","duration":"6.327617268s","start":"2025-09-11T10:23:27.389300Z","end":"2025-09-11T10:23:33.716918Z","steps":["trace[33345604] 'read index received' (duration: 2.525147904s)","trace[33345604] 'applied index is now lower than readState.Index' (duration: 3.802468664s)"],"step_count":2} {"level":"warn","ts":"2025-09-11T10:23:33.716976Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"12.937459835s","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"milvus-mgvllt/meta/session/rootcoord\" range_end:\"milvus-mgvllt/meta/session/rootcoore\" ","response":"range_response_count:1 size:282"} {"level":"warn","ts":"2025-09-11T10:23:33.716982Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2025-09-11T10:23:26.817814Z","time spent":"6.899161966s","remote":"10.244.2.115:58512","response type":"/etcdserverpb.Lease/LeaseGrant","request count":-1,"request size":-1,"response
count":-1,"response size":-1,"request content":""} {"level":"info","ts":"2025-09-11T10:23:33.717012Z","caller":"traceutil/trace.go:171","msg":"trace[33072395] range","detail":"{range_begin:milvus-mgvllt/meta/session/rootcoord; range_end:milvus-mgvllt/meta/session/rootcoore; response_count:1; response_revision:838; }","duration":"12.937501335s","start":"2025-09-11T10:23:20.779501Z","end":"2025-09-11T10:23:33.717003Z","steps":["trace[33072395] 'agreement among raft nodes before linearized reading' (duration: 12.937430234s)"],"step_count":1} {"level":"warn","ts":"2025-09-11T10:23:33.717020Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"13.069966225s","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"milvus-mgvllt/meta/session/rootcoord\" range_end:\"milvus-mgvllt/meta/session/rootcoore\" ","response":"range_response_count:1 size:282"} {"level":"warn","ts":"2025-09-11T10:23:33.717035Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2025-09-11T10:23:20.779455Z","time spent":"12.937572735s","remote":"10.244.2.115:58560","response type":"/etcdserverpb.KV/Range","request count":0,"request size":76,"response count":1,"response size":305,"request content":"key:\"milvus-mgvllt/meta/session/rootcoord\" range_end:\"milvus-mgvllt/meta/session/rootcoore\" "} {"level":"warn","ts":"2025-09-11T10:23:33.717029Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"13.047976494s","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"milvus-mgvllt/meta/session/rootcoord\" range_end:\"milvus-mgvllt/meta/session/rootcoore\" ","response":"range_response_count:1 size:282"} {"level":"warn","ts":"2025-09-11T10:23:33.717041Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"6.30120509s","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"milvus-mgvllt/meta/session/rootcoord\"
range_end:\"milvus-mgvllt/meta/session/rootcoore\" ","response":"range_response_count:1 size:282"} {"level":"info","ts":"2025-09-11T10:23:33.717078Z","caller":"traceutil/trace.go:171","msg":"trace[655978483] range","detail":"{range_begin:milvus-mgvllt/meta/session/rootcoord; range_end:milvus-mgvllt/meta/session/rootcoore; response_count:1; response_revision:838; }","duration":"6.301241691s","start":"2025-09-11T10:23:27.415825Z","end":"2025-09-11T10:23:33.717067Z","steps":["trace[655978483] 'agreement among raft nodes before linearized reading' (duration: 6.30118889s)"],"step_count":1} {"level":"info","ts":"2025-09-11T10:23:33.717093Z","caller":"traceutil/trace.go:171","msg":"trace[1305418444] range","detail":"{range_begin:milvus-mgvllt/meta/session/rootcoord; range_end:milvus-mgvllt/meta/session/rootcoore; response_count:1; response_revision:838; }","duration":"13.048021795s","start":"2025-09-11T10:23:20.669038Z","end":"2025-09-11T10:23:33.717060Z","steps":["trace[1305418444] 'agreement among raft nodes before linearized reading' (duration: 13.047936194s)"],"step_count":1} {"level":"warn","ts":"2025-09-11T10:23:33.717141Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2025-09-11T10:23:20.668999Z","time spent":"13.048133596s","remote":"10.244.2.115:58566","response type":"/etcdserverpb.KV/Range","request count":0,"request size":76,"response count":1,"response size":305,"request content":"key:\"milvus-mgvllt/meta/session/rootcoord\" range_end:\"milvus-mgvllt/meta/session/rootcoore\" "} {"level":"warn","ts":"2025-09-11T10:23:33.717106Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2025-09-11T10:23:27.415787Z","time spent":"6.301310992s","remote":"10.244.2.115:58522","response type":"/etcdserverpb.KV/Range","request count":0,"request size":76,"response count":1,"response size":305,"request content":"key:\"milvus-mgvllt/meta/session/rootcoord\" range_end:\"milvus-mgvllt/meta/session/rootcoore\"
"} {"level":"info","ts":"2025-09-11T10:23:33.717049Z","caller":"traceutil/trace.go:171","msg":"trace[843501848] range","detail":"{range_begin:milvus-mgvllt/meta/session/rootcoord; range_end:milvus-mgvllt/meta/session/rootcoore; response_count:1; response_revision:838; }","duration":"13.070000125s","start":"2025-09-11T10:23:20.647039Z","end":"2025-09-11T10:23:33.717039Z","steps":["trace[843501848] 'agreement among raft nodes before linearized reading' (duration: 13.069946125s)"],"step_count":1} {"level":"warn","ts":"2025-09-11T10:23:33.717237Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2025-09-11T10:23:20.646999Z","time spent":"13.070224327s","remote":"10.244.2.115:58538","response type":"/etcdserverpb.KV/Range","request count":0,"request size":76,"response count":1,"response size":305,"request content":"key:\"milvus-mgvllt/meta/session/rootcoord\" range_end:\"milvus-mgvllt/meta/session/rootcoore\" "} delete cluster milvus-mgvllt `kbcli cluster delete milvus-mgvllt --auto-approve --namespace ns-niaem ` Cluster milvus-mgvllt deleted pod_info:milvus-mgvllt-etcd-0 2/2 Running 0 2m18s milvus-mgvllt-milvus-0 1/1 Running 0 108s milvus-mgvllt-minio-0 1/1 Running 0 2m17s pod_info:milvus-mgvllt-etcd-0 2/2 Running 0 2m38s milvus-mgvllt-milvus-0 1/1 Terminating 0 2m8s milvus-mgvllt-minio-0 1/1 Running 0 2m37s No resources found in ns-niaem namespace. delete cluster pod done No resources found in ns-niaem namespace. check cluster resource non-exist OK: pvc No resources found in ns-niaem namespace. delete cluster done No resources found in ns-niaem namespace. No resources found in ns-niaem namespace. No resources found in ns-niaem namespace. set milvus component definition set milvus component definition milvus-indexnode-1.0.1 LIMIT_CPU:0.5 LIMIT_MEMORY:0.5 storage size: 5 CLUSTER_NAME:milvus-mgvllt No resources found in ns-niaem namespace.
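The repeated `cluster_status:` and `pod_info:`/`pod_status:` lines throughout this run come from a poll-until-ready loop. A hedged sketch of that pattern; in the real harness the check would shell out to something like `kubectl get cluster -o jsonpath='{.status.phase}'`, but a stub stands in here so the sketch runs anywhere:

```shell
# Poll a check command until it succeeds or a timeout elapses,
# mirroring the harness's "check cluster status" loops.
wait_for() {  # wait_for TIMEOUT_SECONDS INTERVAL_SECONDS CMD...
  local deadline=$(( $(date +%s) + $1 )); shift
  local interval=$1; shift
  until "$@"; do
    [ "$(date +%s)" -ge "$deadline" ] && return 1
    sleep "$interval"
  done
}

# Stub check: succeeds on the third call, like Creating -> Running.
calls=0
cluster_running() { calls=$((calls + 1)); [ "$calls" -ge 3 ]; }

wait_for 10 0 cluster_running && echo "cluster_status:Running"
```

Returning non-zero on timeout lets the caller distinguish "never became Running" from success, which is what produces the `check pod ... status timeout` line later in this log.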
pod_info: termination_policy:Delete create 1 replica Delete milvus cluster check component definition set component definition by component version check cmpd by labels check cmpd by compDefs set component definition: milvus-standalone-1.0.1 by component version:milvus apiVersion: apps.kubeblocks.io/v1 kind: Cluster metadata: name: milvus-mgvllt namespace: ns-niaem spec: clusterDef: milvus terminationPolicy: Delete topology: cluster componentSpecs: - serviceVersion: v2.3.2 name: proxy replicas: 1 disableExporter: true configs: - name: config variables: minio_bucket: test minio_root_path: kubeblocks-milvus-mgvllt minio_use_path_style: "true" mq_type: kafka resources: limits: cpu: 500m memory: 0.5Gi requests: cpu: 500m memory: 0.5Gi serviceRefs: - clusterServiceSelector: cluster: etcdm-mgvllt service: component: etcd port: client service: headless name: milvus-meta-storage namespace: ns-niaem - clusterServiceSelector: cluster: kafkam-mgvllt service: component: kafka-combine port: kafka-client service: headless name: milvus-log-storage namespace: ns-niaem - clusterServiceSelector: cluster: miniom-mgvllt credential: component: minio name: root service: component: minio port: api service: headless name: milvus-object-storage namespace: ns-niaem - serviceVersion: v2.3.2 name: mixcoord replicas: 1 disableExporter: true configs: - name: config variables: minio_bucket: test minio_root_path: kubeblocks-milvus-mgvllt minio_use_path_style: "true" mq_type: kafka resources: limits: cpu: 500m memory: 0.5Gi requests: cpu: 500m memory: 0.5Gi serviceRefs: - clusterServiceSelector: cluster: etcdm-mgvllt service: component: etcd port: client service: headless name: milvus-meta-storage namespace: ns-niaem - clusterServiceSelector: cluster: kafkam-mgvllt service: component: kafka-combine port: kafka-client service: headless name: milvus-log-storage namespace: ns-niaem - clusterServiceSelector: cluster: miniom-mgvllt credential: component: minio name: root service: component: minio port: 
api service: headless name: milvus-object-storage namespace: ns-niaem - serviceVersion: v2.3.2 name: datanode replicas: 1 disableExporter: true configs: - name: config variables: minio_bucket: test minio_root_path: kubeblocks-milvus-mgvllt minio_use_path_style: "true" mq_type: kafka resources: limits: cpu: 500m memory: 0.5Gi requests: cpu: 500m memory: 0.5Gi serviceRefs: - clusterServiceSelector: cluster: etcdm-mgvllt service: component: etcd port: client service: headless name: milvus-meta-storage namespace: ns-niaem - clusterServiceSelector: cluster: kafkam-mgvllt service: component: kafka-combine port: kafka-client service: headless name: milvus-log-storage namespace: ns-niaem - clusterServiceSelector: cluster: miniom-mgvllt credential: component: minio name: root service: component: minio port: api service: headless name: milvus-object-storage namespace: ns-niaem - serviceVersion: v2.3.2 name: indexnode replicas: 1 disableExporter: true configs: - name: config variables: minio_bucket: test minio_root_path: kubeblocks-milvus-mgvllt minio_use_path_style: "true" mq_type: kafka resources: limits: cpu: 500m memory: 0.5Gi requests: cpu: 500m memory: 0.5Gi serviceRefs: - clusterServiceSelector: cluster: etcdm-mgvllt service: component: etcd port: client service: headless name: milvus-meta-storage namespace: ns-niaem - clusterServiceSelector: cluster: kafkam-mgvllt service: component: kafka-combine port: kafka-client service: headless name: milvus-log-storage namespace: ns-niaem - clusterServiceSelector: cluster: miniom-mgvllt credential: component: minio name: root service: component: minio port: api service: headless name: milvus-object-storage namespace: ns-niaem - serviceVersion: v2.3.2 name: querynode replicas: 1 disableExporter: true configs: - name: config variables: minio_bucket: test minio_root_path: kubeblocks-milvus-mgvllt minio_use_path_style: "true" mq_type: kafka resources: limits: cpu: 500m memory: 0.5Gi requests: cpu: 500m memory: 0.5Gi serviceRefs: - 
clusterServiceSelector: cluster: etcdm-mgvllt service: component: etcd port: client service: headless name: milvus-meta-storage namespace: ns-niaem - clusterServiceSelector: cluster: kafkam-mgvllt service: component: kafka-combine port: kafka-client service: headless name: milvus-log-storage namespace: ns-niaem - clusterServiceSelector: cluster: miniom-mgvllt credential: component: minio name: root service: component: minio port: api service: headless name: milvus-object-storage namespace: ns-niaem `kubectl apply -f test_create_milvus-mgvllt.yaml` cluster.apps.kubeblocks.io/milvus-mgvllt created apply test_create_milvus-mgvllt.yaml Success `rm -rf test_create_milvus-mgvllt.yaml` check cluster status `kbcli cluster list milvus-mgvllt --show-labels --namespace ns-niaem ` NAME NAMESPACE CLUSTER-DEFINITION TERMINATION-POLICY STATUS CREATED-TIME LABELS milvus-mgvllt ns-niaem milvus Delete Creating Sep 11,2025 18:25 UTC+0800 clusterdefinition.kubeblocks.io/name=milvus cluster_status:Creating cluster_status:Creating check cluster status done cluster_status:Running check pod status `kbcli cluster list-instances milvus-mgvllt --namespace ns-niaem ` NAME NAMESPACE CLUSTER COMPONENT STATUS ROLE ACCESSMODE AZ CPU(REQUEST/LIMIT) MEMORY(REQUEST/LIMIT) STORAGE NODE CREATED-TIME milvus-mgvllt-datanode-0 ns-niaem milvus-mgvllt datanode Running 0 500m / 500m 512Mi / 512Mi aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:25 UTC+0800 milvus-mgvllt-indexnode-0 ns-niaem milvus-mgvllt indexnode Running 0 500m / 500m 512Mi / 512Mi aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 18:25 UTC+0800 milvus-mgvllt-mixcoord-0 ns-niaem milvus-mgvllt mixcoord Running 0 500m / 500m 512Mi / 512Mi aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 18:25 UTC+0800 milvus-mgvllt-proxy-0 ns-niaem milvus-mgvllt proxy Running 0 500m / 500m 512Mi / 512Mi aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:25 UTC+0800 milvus-mgvllt-querynode-0 ns-niaem milvus-mgvllt 
querynode Running 0 500m / 500m 512Mi / 512Mi aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:25 UTC+0800 check pod status done connect unsupported engine type: milvus milvus-standalone-1.0.1 `kubectl get secrets -l app.kubernetes.io/instance=milvus-mgvllt` No resources found in ns-niaem namespace. Not found cluster secret DB_USERNAME:admin;DB_PASSWORD:scl0RA4b8pXc1omg;DB_PORT:19530;DB_DATABASE: check pod milvus-mgvllt-proxy-0 container_name proxy exist password scl0RA4b8pXc1omg No container logs contain secret password. describe cluster `kbcli cluster describe milvus-mgvllt --namespace ns-niaem ` Name: milvus-mgvllt Created Time: Sep 11,2025 18:25 UTC+0800 NAMESPACE CLUSTER-DEFINITION TOPOLOGY STATUS TERMINATION-POLICY ns-niaem milvus cluster Running Delete Endpoints: COMPONENT INTERNAL EXTERNAL proxy milvus-mgvllt-proxy.ns-niaem.svc.cluster.local:19530 milvus-mgvllt-proxy.ns-niaem.svc.cluster.local:9091 Topology: COMPONENT SERVICE-VERSION INSTANCE ROLE STATUS AZ NODE CREATED-TIME datanode v2.3.2 milvus-mgvllt-datanode-0 Running 0 aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:25 UTC+0800 indexnode v2.3.2 milvus-mgvllt-indexnode-0 Running 0 aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 18:25 UTC+0800 mixcoord v2.3.2 milvus-mgvllt-mixcoord-0 Running 0 aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 18:25 UTC+0800 proxy v2.3.2 milvus-mgvllt-proxy-0 Running 0 aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:25 UTC+0800 querynode v2.3.2 milvus-mgvllt-querynode-0 Running 0 aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:25 UTC+0800 Resources Allocation: COMPONENT INSTANCE-TEMPLATE CPU(REQUEST/LIMIT) MEMORY(REQUEST/LIMIT) STORAGE-SIZE STORAGE-CLASS proxy 500m / 500m 512Mi / 512Mi mixcoord 500m / 500m 512Mi / 512Mi datanode 500m / 500m 512Mi / 512Mi indexnode 500m / 500m 512Mi / 512Mi querynode 500m / 500m 512Mi / 512Mi Images: COMPONENT COMPONENT-DEFINITION IMAGE proxy milvus-proxy-1.0.1 
docker.io/apecloud/milvus:v2.3.2 mixcoord milvus-mixcoord-1.0.1 docker.io/apecloud/milvus:v2.3.2 datanode milvus-datanode-1.0.1 docker.io/apecloud/milvus:v2.3.2 indexnode milvus-indexnode-1.0.1 docker.io/apecloud/milvus:v2.3.2 querynode milvus-querynode-1.0.1 docker.io/apecloud/milvus:v2.3.2 Show cluster events: kbcli cluster list-events -n ns-niaem milvus-mgvllt `kbcli cluster label milvus-mgvllt app.kubernetes.io/instance- --namespace ns-niaem ` label "app.kubernetes.io/instance" not found. `kbcli cluster label milvus-mgvllt app.kubernetes.io/instance=milvus-mgvllt --namespace ns-niaem ` `kbcli cluster label milvus-mgvllt --list --namespace ns-niaem ` NAME NAMESPACE LABELS milvus-mgvllt ns-niaem app.kubernetes.io/instance=milvus-mgvllt clusterdefinition.kubeblocks.io/name=milvus label cluster app.kubernetes.io/instance=milvus-mgvllt Success `kbcli cluster label case.name=kbcli.test1 -l app.kubernetes.io/instance=milvus-mgvllt --namespace ns-niaem ` `kbcli cluster label milvus-mgvllt --list --namespace ns-niaem ` NAME NAMESPACE LABELS milvus-mgvllt ns-niaem app.kubernetes.io/instance=milvus-mgvllt case.name=kbcli.test1 clusterdefinition.kubeblocks.io/name=milvus label cluster case.name=kbcli.test1 Success `kbcli cluster label milvus-mgvllt case.name=kbcli.test2 --overwrite --namespace ns-niaem ` `kbcli cluster label milvus-mgvllt --list --namespace ns-niaem ` NAME NAMESPACE LABELS milvus-mgvllt ns-niaem app.kubernetes.io/instance=milvus-mgvllt case.name=kbcli.test2 clusterdefinition.kubeblocks.io/name=milvus label cluster case.name=kbcli.test2 Success `kbcli cluster label milvus-mgvllt case.name- --namespace ns-niaem ` `kbcli cluster label milvus-mgvllt --list --namespace ns-niaem ` NAME NAMESPACE LABELS milvus-mgvllt ns-niaem app.kubernetes.io/instance=milvus-mgvllt clusterdefinition.kubeblocks.io/name=milvus delete cluster label case.name Success cluster connect insert batch data by db client Error from server (NotFound): pods 
"test-db-client-executionloop-milvus-mgvllt" not found `kubectl patch -p '{"metadata":{"finalizers":[]}}' --type=merge pods test-db-client-executionloop-milvus-mgvllt --namespace ns-niaem ` Error from server (NotFound): pods "test-db-client-executionloop-milvus-mgvllt" not found Warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely. Error from server (NotFound): pods "test-db-client-executionloop-milvus-mgvllt" not found `kubectl get secrets -l app.kubernetes.io/instance=milvus-mgvllt` No resources found in ns-niaem namespace. Not found cluster secret DB_USERNAME:admin;DB_PASSWORD:scl0RA4b8pXc1omg;DB_PORT:19530;DB_DATABASE: apiVersion: v1 kind: Pod metadata: name: test-db-client-executionloop-milvus-mgvllt namespace: ns-niaem spec: containers: - name: test-dbclient imagePullPolicy: IfNotPresent image: docker.io/apecloud/dbclient:test args: - "--host" - "milvus-mgvllt-proxy.ns-niaem.svc.cluster.local" - "--user" - "admin" - "--password" - "scl0RA4b8pXc1omg" - "--port" - "19530" - "--dbtype" - "milvus" - "--test" - "executionloop" - "--duration" - "60" - "--interval" - "1" restartPolicy: Never `kubectl apply -f test-db-client-executionloop-milvus-mgvllt.yaml` pod/test-db-client-executionloop-milvus-mgvllt created apply test-db-client-executionloop-milvus-mgvllt.yaml Success `rm -rf test-db-client-executionloop-milvus-mgvllt.yaml` check pod status pod_status:NAME READY STATUS RESTARTS AGE test-db-client-executionloop-milvus-mgvllt 1/1 Running 0 6s pod_status:NAME READY STATUS RESTARTS AGE test-db-client-executionloop-milvus-mgvllt 1/1 Running 0 10s pod_status:NAME READY STATUS RESTARTS AGE test-db-client-executionloop-milvus-mgvllt 1/1 Running 0 15s pod_status:NAME READY STATUS RESTARTS AGE test-db-client-executionloop-milvus-mgvllt 1/1 Running 0 20s pod_status:NAME READY STATUS RESTARTS AGE test-db-client-executionloop-milvus-mgvllt 1/1 Running 0 25s
pod_status:NAME READY STATUS RESTARTS AGE test-db-client-executionloop-milvus-mgvllt 1/1 Running 0 30s (pod_status polled every ~5s; test-db-client-executionloop-milvus-mgvllt stayed 1/1 Running from 30s through 5m13s) pod_status:NAME READY STATUS RESTARTS AGE test-db-client-executionloop-milvus-mgvllt 1/1 Running 0 5m13s check pod test-db-client-executionloop-milvus-mgvllt status timeout --------------------------------------get pod
test-db-client-executionloop-milvus-mgvllt yaml-------------------------------------- `kubectl get pod test-db-client-executionloop-milvus-mgvllt -o yaml --namespace ns-niaem ` apiVersion: v1 kind: Pod metadata: annotations: kubectl.kubernetes.io/last-applied-configuration: | {"apiVersion":"v1","kind":"Pod","metadata":{"annotations":{},"name":"test-db-client-executionloop-milvus-mgvllt","namespace":"ns-niaem"},"spec":{"containers":[{"args":["--host","milvus-mgvllt-proxy.ns-niaem.svc.cluster.local","--user","admin","--password","scl0RA4b8pXc1omg","--port","19530","--dbtype","milvus","--test","executionloop","--duration","60","--interval","1"],"image":"docker.io/apecloud/dbclient:test","imagePullPolicy":"IfNotPresent","name":"test-dbclient"}],"restartPolicy":"Never"}} creationTimestamp: "2025-09-11T10:26:33Z" name: test-db-client-executionloop-milvus-mgvllt namespace: ns-niaem resourceVersion: "58671" uid: 8fcfd535-9898-4dda-8cc4-71aea00e59f6 spec: containers: - args: - --host - milvus-mgvllt-proxy.ns-niaem.svc.cluster.local - --user - admin - --password - scl0RA4b8pXc1omg - --port - "19530" - --dbtype - milvus - --test - executionloop - --duration - "60" - --interval - "1" image: docker.io/apecloud/dbclient:test imagePullPolicy: IfNotPresent name: test-dbclient resources: {} terminationMessagePath: /dev/termination-log terminationMessagePolicy: File volumeMounts: - mountPath: /var/run/secrets/kubernetes.io/serviceaccount name: kube-api-access-g797f readOnly: true dnsPolicy: ClusterFirst enableServiceLinks: true nodeName: aks-cicdamdpool-12089392-vmss000000 preemptionPolicy: PreemptLowerPriority priority: 0 restartPolicy: Never schedulerName: default-scheduler securityContext: {} serviceAccount: default serviceAccountName: default terminationGracePeriodSeconds: 30 tolerations: - effect: NoExecute key: node.kubernetes.io/not-ready operator: Exists tolerationSeconds: 300 - effect: NoExecute key: node.kubernetes.io/unreachable operator: Exists
tolerationSeconds: 300 volumes: - name: kube-api-access-g797f projected: defaultMode: 420 sources: - serviceAccountToken: expirationSeconds: 3607 path: token - configMap: items: - key: ca.crt path: ca.crt name: kube-root-ca.crt - downwardAPI: items: - fieldRef: apiVersion: v1 fieldPath: metadata.namespace path: namespace status: conditions: - lastProbeTime: null lastTransitionTime: "2025-09-11T10:26:35Z" status: "True" type: PodReadyToStartContainers - lastProbeTime: null lastTransitionTime: "2025-09-11T10:26:33Z" status: "True" type: Initialized - lastProbeTime: null lastTransitionTime: "2025-09-11T10:26:35Z" status: "True" type: Ready - lastProbeTime: null lastTransitionTime: "2025-09-11T10:26:35Z" status: "True" type: ContainersReady - lastProbeTime: null lastTransitionTime: "2025-09-11T10:26:33Z" status: "True" type: PodScheduled containerStatuses: - containerID: containerd://bd40095456742b4f5f2555d0d0151bdc1f07a7aff86d800bedabef2b1386120c image: docker.io/apecloud/dbclient:test imageID: docker.io/apecloud/dbclient@sha256:abec5208c25fc18470ec383a06c35e29186314821be5151efcb82612aee1dcc1 lastState: {} name: test-dbclient ready: true restartCount: 0 started: true state: running: startedAt: "2025-09-11T10:26:34Z" volumeMounts: - mountPath: /var/run/secrets/kubernetes.io/serviceaccount name: kube-api-access-g797f readOnly: true recursiveReadOnly: Disabled hostIP: 10.224.0.5 hostIPs: - ip: 10.224.0.5 phase: Running podIP: 10.244.4.126 podIPs: - ip: 10.244.4.126 qosClass: BestEffort startTime: "2025-09-11T10:26:33Z" ------------------------------------------------------------------------------------------------------------------ --------------------------------------describe pod test-db-client-executionloop-milvus-mgvllt-------------------------------------- `kubectl describe pod test-db-client-executionloop-milvus-mgvllt --namespace ns-niaem ` Name: test-db-client-executionloop-milvus-mgvllt Namespace: ns-niaem Priority: 0 Service Account: default Node:
aks-cicdamdpool-12089392-vmss000000/10.224.0.5 Start Time: Thu, 11 Sep 2025 18:26:33 +0800 Labels: Annotations: Status: Running IP: 10.244.4.126 IPs: IP: 10.244.4.126 Containers: test-dbclient: Container ID: containerd://bd40095456742b4f5f2555d0d0151bdc1f07a7aff86d800bedabef2b1386120c Image: docker.io/apecloud/dbclient:test Image ID: docker.io/apecloud/dbclient@sha256:abec5208c25fc18470ec383a06c35e29186314821be5151efcb82612aee1dcc1 Port: Host Port: Args: --host milvus-mgvllt-proxy.ns-niaem.svc.cluster.local --user admin --password scl0RA4b8pXc1omg --port 19530 --dbtype milvus --test executionloop --duration 60 --interval 1 State: Running Started: Thu, 11 Sep 2025 18:26:34 +0800 Ready: True Restart Count: 0 Environment: Mounts: /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-g797f (ro) Conditions: Type Status PodReadyToStartContainers True Initialized True Ready True ContainersReady True PodScheduled True Volumes: kube-api-access-g797f: Type: Projected (a volume that contains injected data from multiple sources) TokenExpirationSeconds: 3607 ConfigMapName: kube-root-ca.crt ConfigMapOptional: DownwardAPI: true QoS Class: BestEffort Node-Selectors: Tolerations: node.kubernetes.io/not-ready:NoExecute op=Exists for 300s node.kubernetes.io/unreachable:NoExecute op=Exists for 300s Events: Type Reason Age From Message ---- ------ ---- ---- ------- Normal Scheduled 5m14s default-scheduler Successfully assigned ns-niaem/test-db-client-executionloop-milvus-mgvllt to aks-cicdamdpool-12089392-vmss000000 Normal Pulled 5m14s kubelet Container image "docker.io/apecloud/dbclient:test" already present on machine Normal Created 5m14s kubelet Created container: test-dbclient Normal Started 5m14s kubelet Started container test-dbclient ------------------------------------------------------------------------------------------------------------------ --------------------------------------pod 
test-db-client-executionloop-milvus-mgvllt-------------------------------------- `kubectl logs test-db-client-executionloop-milvus-mgvllt --namespace ns-niaem --tail 500` --host milvus-mgvllt-proxy.ns-niaem.svc.cluster.local --user admin --password scl0RA4b8pXc1omg --port 19530 --dbtype milvus --test executionloop --duration 60 --interval 1 SLF4J(I): Connected with provider of type [ch.qos.logback.classic.spi.LogbackServiceProvider] Execution loop start: Collection executions_loop_collection does not exist. Creating collection... ------------------------------------------------------------------------------------------------------------------ check cluster status `kbcli cluster list milvus-mgvllt --show-labels --namespace ns-niaem ` NAME NAMESPACE CLUSTER-DEFINITION TERMINATION-POLICY STATUS CREATED-TIME LABELS milvus-mgvllt ns-niaem milvus Delete Running Sep 11,2025 18:25 UTC+0800 app.kubernetes.io/instance=milvus-mgvllt,clusterdefinition.kubeblocks.io/name=milvus check cluster status done cluster_status:Running check pod status `kbcli cluster list-instances milvus-mgvllt --namespace ns-niaem ` NAME NAMESPACE CLUSTER COMPONENT STATUS ROLE ACCESSMODE AZ CPU(REQUEST/LIMIT) MEMORY(REQUEST/LIMIT) STORAGE NODE CREATED-TIME milvus-mgvllt-datanode-0 ns-niaem milvus-mgvllt datanode Running 0 500m / 500m 512Mi / 512Mi aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:25 UTC+0800 milvus-mgvllt-indexnode-0 ns-niaem milvus-mgvllt indexnode Running 0 500m / 500m 512Mi / 512Mi aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 18:25 UTC+0800 milvus-mgvllt-mixcoord-0 ns-niaem milvus-mgvllt mixcoord Running 0 500m / 500m 512Mi / 512Mi aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 18:25 UTC+0800 milvus-mgvllt-proxy-0 ns-niaem milvus-mgvllt proxy Running 0 500m / 500m 512Mi / 512Mi aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:25 UTC+0800 milvus-mgvllt-querynode-0 ns-niaem milvus-mgvllt querynode Running 0 500m / 500m 512Mi / 512Mi 
aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:25 UTC+0800 check pod status done connect unsupported engine type: milvus milvus-standalone-1.0.1 --host milvus-mgvllt-proxy.ns-niaem.svc.cluster.local --user admin --password scl0RA4b8pXc1omg --port 19530 --dbtype milvus --test executionloop --duration 60 --interval 1 SLF4J(I): Connected with provider of type [ch.qos.logback.classic.spi.LogbackServiceProvider] Execution loop start: Collection executions_loop_collection does not exist. Creating collection... DB_CLIENT_BATCH_DATA_COUNT: `kubectl patch -p '***"metadata":***"finalizers":[]***' --type=merge pods test-db-client-executionloop-milvus-mgvllt --namespace ns-niaem ` pod/test-db-client-executionloop-milvus-mgvllt patched (no change) Warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely. pod "test-db-client-executionloop-milvus-mgvllt" force deleted check component querynode exists `kubectl get components -l app.kubernetes.io/instance=milvus-mgvllt,apps.kubeblocks.io/component-name=querynode --namespace ns-niaem | (grep "querynode" || true )` check cluster status before ops check cluster status done cluster_status:Running `kbcli cluster vscale milvus-mgvllt --auto-approve --force=true --components querynode --cpu 600m --memory 0.6Gi --namespace ns-niaem ` OpsRequest milvus-mgvllt-verticalscaling-sptdn created successfully, you can view the progress: kbcli cluster describe-ops milvus-mgvllt-verticalscaling-sptdn -n ns-niaem check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-verticalscaling-sptdn ns-niaem VerticalScaling milvus-mgvllt querynode Running -/- Sep 11,2025 18:31 UTC+0800 check cluster status `kbcli cluster list milvus-mgvllt --show-labels --namespace ns-niaem ` NAME NAMESPACE CLUSTER-DEFINITION 
TERMINATION-POLICY STATUS CREATED-TIME LABELS milvus-mgvllt ns-niaem milvus Delete Updating Sep 11,2025 18:25 UTC+0800 app.kubernetes.io/instance=milvus-mgvllt,clusterdefinition.kubeblocks.io/name=milvus cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating check cluster status done cluster_status:Running check pod status `kbcli cluster list-instances milvus-mgvllt --namespace ns-niaem ` NAME NAMESPACE CLUSTER COMPONENT STATUS ROLE ACCESSMODE AZ CPU(REQUEST/LIMIT) MEMORY(REQUEST/LIMIT) STORAGE NODE CREATED-TIME milvus-mgvllt-datanode-0 ns-niaem milvus-mgvllt datanode Running 0 500m / 500m 512Mi / 512Mi aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:25 UTC+0800 milvus-mgvllt-indexnode-0 ns-niaem milvus-mgvllt indexnode Running 0 500m / 500m 512Mi / 512Mi aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 18:25 UTC+0800 milvus-mgvllt-mixcoord-0 ns-niaem milvus-mgvllt mixcoord Running 0 500m / 500m 512Mi / 512Mi aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 18:25 UTC+0800 milvus-mgvllt-proxy-0 ns-niaem milvus-mgvllt proxy Running 0 500m / 500m 512Mi / 512Mi aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:25 UTC+0800 milvus-mgvllt-querynode-0 ns-niaem milvus-mgvllt querynode Running 0 600m / 600m 644245094400m / 644245094400m aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:31 UTC+0800 check pod status done connect unsupported engine type: milvus milvus-standalone-1.0.1 check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-verticalscaling-sptdn ns-niaem VerticalScaling milvus-mgvllt querynode Succeed 1/1 Sep 11,2025 18:31 UTC+0800 check ops status done ops_status:milvus-mgvllt-verticalscaling-sptdn ns-niaem VerticalScaling milvus-mgvllt querynode Succeed 1/1 Sep 11,2025 18:31 
UTC+0800 `kubectl patch -p '{"metadata":{"finalizers":[]}}' --type=merge opsrequests.operations milvus-mgvllt-verticalscaling-sptdn --namespace ns-niaem ` opsrequest.operations.kubeblocks.io/milvus-mgvllt-verticalscaling-sptdn patched `kbcli cluster delete-ops --name milvus-mgvllt-verticalscaling-sptdn --force --auto-approve --namespace ns-niaem ` OpsRequest milvus-mgvllt-verticalscaling-sptdn deleted cluster stop check cluster status before ops check cluster status done cluster_status:Running `kbcli cluster stop milvus-mgvllt --auto-approve --force=true --namespace ns-niaem ` OpsRequest milvus-mgvllt-stop-wf7nx created successfully, you can view the progress: kbcli cluster describe-ops milvus-mgvllt-stop-wf7nx -n ns-niaem check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-stop-wf7nx ns-niaem Stop milvus-mgvllt Creating -/- Sep 11,2025 18:32 UTC+0800 check cluster status `kbcli cluster list milvus-mgvllt --show-labels --namespace ns-niaem ` NAME NAMESPACE CLUSTER-DEFINITION TERMINATION-POLICY STATUS CREATED-TIME LABELS milvus-mgvllt ns-niaem milvus Delete Stopping Sep 11,2025 18:25 UTC+0800 app.kubernetes.io/instance=milvus-mgvllt,clusterdefinition.kubeblocks.io/name=milvus cluster_status:Stopping cluster_status:Stopping cluster_status:Stopping cluster_status:Stopping cluster_status:Stopping check cluster status done cluster_status:Stopped check pod status `kbcli cluster list-instances milvus-mgvllt --namespace ns-niaem ` NAME NAMESPACE CLUSTER COMPONENT STATUS ROLE ACCESSMODE AZ CPU(REQUEST/LIMIT) MEMORY(REQUEST/LIMIT) STORAGE NODE CREATED-TIME check pod status done check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-stop-wf7nx ns-niaem Stop milvus-mgvllt datanode,indexnode,mixcoord,proxy,querynode Succeed 5/5 Sep 11,2025
18:32 UTC+0800 check ops status done ops_status:milvus-mgvllt-stop-wf7nx ns-niaem Stop milvus-mgvllt datanode,indexnode,mixcoord,proxy,querynode Succeed 5/5 Sep 11,2025 18:32 UTC+0800 `kubectl patch -p '{"metadata":{"finalizers":[]}}' --type=merge opsrequests.operations milvus-mgvllt-stop-wf7nx --namespace ns-niaem ` opsrequest.operations.kubeblocks.io/milvus-mgvllt-stop-wf7nx patched `kbcli cluster delete-ops --name milvus-mgvllt-stop-wf7nx --force --auto-approve --namespace ns-niaem ` OpsRequest milvus-mgvllt-stop-wf7nx deleted cluster start check cluster status before ops check cluster status done cluster_status:Stopped `kbcli cluster start milvus-mgvllt --force=true --namespace ns-niaem ` OpsRequest milvus-mgvllt-start-nlbnz created successfully, you can view the progress: kbcli cluster describe-ops milvus-mgvllt-start-nlbnz -n ns-niaem check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-start-nlbnz ns-niaem Start milvus-mgvllt Running -/- Sep 11,2025 18:33 UTC+0800 check cluster status `kbcli cluster list milvus-mgvllt --show-labels --namespace ns-niaem ` NAME NAMESPACE CLUSTER-DEFINITION TERMINATION-POLICY STATUS CREATED-TIME LABELS milvus-mgvllt ns-niaem milvus Delete Updating Sep 11,2025 18:25 UTC+0800 app.kubernetes.io/instance=milvus-mgvllt,clusterdefinition.kubeblocks.io/name=milvus cluster_status:Updating cluster_status:Updating cluster_status:Updating check cluster status done cluster_status:Running check pod status `kbcli cluster list-instances milvus-mgvllt --namespace ns-niaem ` NAME NAMESPACE CLUSTER COMPONENT STATUS ROLE ACCESSMODE AZ CPU(REQUEST/LIMIT) MEMORY(REQUEST/LIMIT) STORAGE NODE CREATED-TIME milvus-mgvllt-datanode-0 ns-niaem milvus-mgvllt datanode Running 0 500m / 500m 512Mi / 512Mi aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 18:33 UTC+0800 milvus-mgvllt-indexnode-0 ns-niaem milvus-mgvllt indexnode
Running 0 500m / 500m 512Mi / 512Mi aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:33 UTC+0800 milvus-mgvllt-mixcoord-0 ns-niaem milvus-mgvllt mixcoord Running 0 500m / 500m 512Mi / 512Mi aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:33 UTC+0800 milvus-mgvllt-proxy-0 ns-niaem milvus-mgvllt proxy Running 0 500m / 500m 512Mi / 512Mi aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:33 UTC+0800 milvus-mgvllt-querynode-0 ns-niaem milvus-mgvllt querynode Running 0 600m / 600m 644245094400m / 644245094400m aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 18:33 UTC+0800 check pod status done connect unsupported engine type: milvus milvus-standalone-1.0.1 check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-start-nlbnz ns-niaem Start milvus-mgvllt datanode,indexnode,mixcoord,proxy,querynode Succeed 5/5 Sep 11,2025 18:33 UTC+0800 check ops status done ops_status:milvus-mgvllt-start-nlbnz ns-niaem Start milvus-mgvllt datanode,indexnode,mixcoord,proxy,querynode Succeed 5/5 Sep 11,2025 18:33 UTC+0800 `kubectl patch -p '{"metadata":{"finalizers":[]}}' --type=merge opsrequests.operations milvus-mgvllt-start-nlbnz --namespace ns-niaem ` opsrequest.operations.kubeblocks.io/milvus-mgvllt-start-nlbnz patched `kbcli cluster delete-ops --name milvus-mgvllt-start-nlbnz --force --auto-approve --namespace ns-niaem ` OpsRequest milvus-mgvllt-start-nlbnz deleted test failover connectionstress check cluster status before cluster-failover-connectionstress check cluster status done cluster_status:Running Error from server (NotFound): pods "test-db-client-connectionstress-milvus-mgvllt" not found `kubectl patch -p '{"metadata":{"finalizers":[]}}' --type=merge pods test-db-client-connectionstress-milvus-mgvllt --namespace ns-niaem ` Error from server (NotFound): pods "test-db-client-connectionstress-milvus-mgvllt"
not found Warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely. Error from server (NotFound): pods "test-db-client-connectionstress-milvus-mgvllt" not found `kubectl get secrets -l app.kubernetes.io/instance=milvus-mgvllt` No resources found in ns-niaem namespace. Not found cluster secret DB_USERNAME:admin;DB_PASSWORD:scl0RA4b8pXc1omg;DB_PORT:19530;DB_DATABASE: apiVersion: v1 kind: Pod metadata: name: test-db-client-connectionstress-milvus-mgvllt namespace: ns-niaem spec: containers: - name: test-dbclient imagePullPolicy: IfNotPresent image: docker.io/apecloud/dbclient:test args: - "--host" - "milvus-mgvllt-proxy.ns-niaem.svc.cluster.local" - "--user" - "admin" - "--password" - "scl0RA4b8pXc1omg" - "--port" - "19530" - "--database" - "" - "--dbtype" - "milvus" - "--test" - "connectionstress" - "--connections" - "4096" - "--duration" - "60" restartPolicy: Never `kubectl apply -f test-db-client-connectionstress-milvus-mgvllt.yaml` pod/test-db-client-connectionstress-milvus-mgvllt created apply test-db-client-connectionstress-milvus-mgvllt.yaml Success `rm -rf test-db-client-connectionstress-milvus-mgvllt.yaml` check pod status pod_status:NAME READY STATUS RESTARTS AGE test-db-client-connectionstress-milvus-mgvllt 1/1 Running 0 5s (pod_status polled every ~5s; test-db-client-connectionstress-milvus-mgvllt stayed 1/1 Running from 5s through 2m3s) check pod test-db-client-connectionstress-milvus-mgvllt status done pod_status:NAME READY STATUS RESTARTS AGE test-db-client-connectionstress-milvus-mgvllt 0/1 Completed 0 2m8s check cluster
status `kbcli cluster list milvus-mgvllt --show-labels --namespace ns-niaem ` NAME NAMESPACE CLUSTER-DEFINITION TERMINATION-POLICY STATUS CREATED-TIME LABELS milvus-mgvllt ns-niaem milvus Delete Running Sep 11,2025 18:25 UTC+0800 app.kubernetes.io/instance=milvus-mgvllt,clusterdefinition.kubeblocks.io/name=milvus check cluster status done cluster_status:Running check pod status `kbcli cluster list-instances milvus-mgvllt --namespace ns-niaem ` NAME NAMESPACE CLUSTER COMPONENT STATUS ROLE ACCESSMODE AZ CPU(REQUEST/LIMIT) MEMORY(REQUEST/LIMIT) STORAGE NODE CREATED-TIME milvus-mgvllt-datanode-0 ns-niaem milvus-mgvllt datanode Running 0 500m / 500m 512Mi / 512Mi aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 18:33 UTC+0800 milvus-mgvllt-indexnode-0 ns-niaem milvus-mgvllt indexnode Running 0 500m / 500m 512Mi / 512Mi aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:33 UTC+0800 milvus-mgvllt-mixcoord-0 ns-niaem milvus-mgvllt mixcoord Running 0 500m / 500m 512Mi / 512Mi aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:33 UTC+0800 milvus-mgvllt-proxy-0 ns-niaem milvus-mgvllt proxy Running 0 500m / 500m 512Mi / 512Mi aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:33 UTC+0800 milvus-mgvllt-querynode-0 ns-niaem milvus-mgvllt querynode Running 0 600m / 600m 644245094400m / 644245094400m aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 18:33 UTC+0800 check pod status done connect unsupported engine type: milvus milvus-standalone-1.0.1 --host milvus-mgvllt-proxy.ns-niaem.svc.cluster.local --user admin --password scl0RA4b8pXc1omg --port 19530 --database --dbtype milvus --test connectionstress --connections 4096 --duration 60 SLF4J(I): Connected with provider of type [ch.qos.logback.classic.spi.LogbackServiceProvider] Test Result: Connection stress test results: Attempted connections: 4096 Successful connections: 4096 Test duration: 60 seconds Connection Information: Database Type: milvus Host: 
milvus-mgvllt-proxy.ns-niaem.svc.cluster.local Port: 19530 Database: Table: User: admin Org: Access Mode: mysql Test Type: connectionstress Connection Count: 4096 Duration: 60 seconds `kubectl patch -p '{"metadata":{"finalizers":[]}}' --type=merge pods test-db-client-connectionstress-milvus-mgvllt --namespace ns-niaem ` pod/test-db-client-connectionstress-milvus-mgvllt patched (no change) Warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely. pod "test-db-client-connectionstress-milvus-mgvllt" force deleted check failover pod name failover pod name:milvus-mgvllt-proxy-0 failover connectionstress Success check component datanode exists `kubectl get components -l app.kubernetes.io/instance=milvus-mgvllt,apps.kubeblocks.io/component-name=datanode --namespace ns-niaem | (grep "datanode" || true )` check component mixcoord exists `kubectl get components -l app.kubernetes.io/instance=milvus-mgvllt,apps.kubeblocks.io/component-name=mixcoord --namespace ns-niaem | (grep "mixcoord" || true )` check component proxy exists `kubectl get components -l app.kubernetes.io/instance=milvus-mgvllt,apps.kubeblocks.io/component-name=proxy --namespace ns-niaem | (grep "proxy" || true )` check cluster status before ops check cluster status done cluster_status:Running `kbcli cluster vscale milvus-mgvllt --auto-approve --force=true --components datanode,mixcoord,proxy --cpu 600m --memory 0.6Gi --namespace ns-niaem ` OpsRequest milvus-mgvllt-verticalscaling-97mw5 created successfully, you can view the progress: kbcli cluster describe-ops milvus-mgvllt-verticalscaling-97mw5 -n ns-niaem check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-verticalscaling-97mw5 ns-niaem VerticalScaling milvus-mgvllt datanode,mixcoord,proxy Running -/- Sep 11,2025 18:36 UTC+0800
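Note on the memory figures in the list-instances output: after the 0.6Gi vertical scale, querynode memory is reported as `644245094400m` rather than `0.6Gi`. This is Kubernetes quantity canonicalization, not a bug in the run: 0.6Gi is not a whole number of bytes, so the API server stores it in milli-bytes (the `m` suffix). A quick sanity check in plain shell arithmetic (not part of the test harness):

```shell
# 0.6Gi = 0.6 * 1024^3 = 644245094.4 bytes; expressed in milli-bytes
# (x1000) this is the integer 644245094400, which kubectl prints as "644245094400m".
echo $(( 6 * 1024 * 1024 * 1024 * 1000 / 10 ))   # prints 644245094400
```

Requesting a whole-byte quantity such as `614Mi` avoids the milli-byte form in status output.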
check cluster status `kbcli cluster list milvus-mgvllt --show-labels --namespace ns-niaem ` NAME NAMESPACE CLUSTER-DEFINITION TERMINATION-POLICY STATUS CREATED-TIME LABELS milvus-mgvllt ns-niaem milvus Delete Updating Sep 11,2025 18:25 UTC+0800 app.kubernetes.io/instance=milvus-mgvllt,clusterdefinition.kubeblocks.io/name=milvus cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating check cluster status done cluster_status:Running check pod status `kbcli cluster list-instances milvus-mgvllt --namespace ns-niaem ` NAME NAMESPACE CLUSTER COMPONENT STATUS ROLE ACCESSMODE AZ CPU(REQUEST/LIMIT) MEMORY(REQUEST/LIMIT) STORAGE NODE CREATED-TIME milvus-mgvllt-datanode-0 ns-niaem milvus-mgvllt datanode Running 0 600m / 600m 644245094400m / 644245094400m aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 18:36 UTC+0800 milvus-mgvllt-indexnode-0 ns-niaem milvus-mgvllt indexnode Running 0 500m / 500m 512Mi / 512Mi aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:33 UTC+0800 milvus-mgvllt-mixcoord-0 ns-niaem milvus-mgvllt mixcoord Running 0 600m / 600m 644245094400m / 644245094400m aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:37 UTC+0800 milvus-mgvllt-proxy-0 ns-niaem milvus-mgvllt proxy Running 0 600m / 600m 644245094400m / 644245094400m aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:37 UTC+0800 milvus-mgvllt-querynode-0 ns-niaem milvus-mgvllt querynode Running 0 600m / 600m 644245094400m / 644245094400m aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 18:33 UTC+0800 check pod status done connect unsupported engine type: milvus milvus-standalone-1.0.1 check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE 
CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-verticalscaling-97mw5 ns-niaem VerticalScaling milvus-mgvllt datanode,mixcoord,proxy Succeed 3/3 Sep 11,2025 18:36 UTC+0800 check ops status done ops_status:milvus-mgvllt-verticalscaling-97mw5 ns-niaem VerticalScaling milvus-mgvllt datanode,mixcoord,proxy Succeed 3/3 Sep 11,2025 18:36 UTC+0800 `kubectl patch -p '{"metadata":{"finalizers":[]}}' --type=merge opsrequests.operations milvus-mgvllt-verticalscaling-97mw5 --namespace ns-niaem ` opsrequest.operations.kubeblocks.io/milvus-mgvllt-verticalscaling-97mw5 patched `kbcli cluster delete-ops --name milvus-mgvllt-verticalscaling-97mw5 --force --auto-approve --namespace ns-niaem ` OpsRequest milvus-mgvllt-verticalscaling-97mw5 deleted cluster restart check cluster status before ops check cluster status done cluster_status:Running `kbcli cluster restart milvus-mgvllt --auto-approve --force=true --namespace ns-niaem ` OpsRequest milvus-mgvllt-restart-bgk4p created successfully, you can view the progress: kbcli cluster describe-ops milvus-mgvllt-restart-bgk4p -n ns-niaem check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-restart-bgk4p ns-niaem Restart milvus-mgvllt proxy,mixcoord,datanode,indexnode,querynode Running 0/5 Sep 11,2025 18:38 UTC+0800 check cluster status `kbcli cluster list milvus-mgvllt --show-labels --namespace ns-niaem ` NAME NAMESPACE CLUSTER-DEFINITION TERMINATION-POLICY STATUS CREATED-TIME LABELS milvus-mgvllt ns-niaem milvus Delete Updating Sep 11,2025 18:25 UTC+0800 app.kubernetes.io/instance=milvus-mgvllt,clusterdefinition.kubeblocks.io/name=milvus cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating
cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating check cluster status done cluster_status:Running check pod status `kbcli cluster list-instances milvus-mgvllt --namespace ns-niaem ` NAME NAMESPACE CLUSTER COMPONENT STATUS ROLE ACCESSMODE AZ CPU(REQUEST/LIMIT) MEMORY(REQUEST/LIMIT) STORAGE NODE CREATED-TIME milvus-mgvllt-datanode-0 ns-niaem milvus-mgvllt datanode Running 0 600m / 600m 644245094400m / 644245094400m aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 18:38 UTC+0800 milvus-mgvllt-indexnode-0 ns-niaem milvus-mgvllt indexnode Running 0 500m / 500m 512Mi / 512Mi aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:38 UTC+0800 milvus-mgvllt-mixcoord-0 ns-niaem milvus-mgvllt mixcoord Running 0 600m / 600m 644245094400m / 644245094400m aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:38 UTC+0800 milvus-mgvllt-proxy-0 ns-niaem milvus-mgvllt proxy Running 0 600m / 600m 644245094400m / 644245094400m aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:38 UTC+0800 milvus-mgvllt-querynode-0 ns-niaem milvus-mgvllt querynode Running 0 600m / 600m 644245094400m / 644245094400m aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 18:38 UTC+0800 check pod status done connect unsupported engine type: milvus milvus-standalone-1.0.1 check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-restart-bgk4p ns-niaem Restart milvus-mgvllt proxy,mixcoord,datanode,indexnode,querynode Succeed 5/5 Sep 11,2025 18:38 UTC+0800 check ops status done ops_status:milvus-mgvllt-restart-bgk4p ns-niaem Restart milvus-mgvllt proxy,mixcoord,datanode,indexnode,querynode Succeed 5/5 Sep 11,2025 18:38 UTC+0800 `kubectl patch -p '{"metadata":{"finalizers":[]}}' --type=merge opsrequests.operations milvus-mgvllt-restart-bgk4p --namespace ns-niaem `
opsrequest.operations.kubeblocks.io/milvus-mgvllt-restart-bgk4p patched `kbcli cluster delete-ops --name milvus-mgvllt-restart-bgk4p --force --auto-approve --namespace ns-niaem ` OpsRequest milvus-mgvllt-restart-bgk4p deleted skip cluster HorizontalScaling Out 1 check component indexnode exists `kubectl get components -l app.kubernetes.io/instance=milvus-mgvllt,apps.kubeblocks.io/component-name=indexnode --namespace ns-niaem | (grep "indexnode" || true )` check cluster status before ops check cluster status done cluster_status:Running `kbcli cluster vscale milvus-mgvllt --auto-approve --force=true --components indexnode --cpu 600m --memory 0.6Gi --namespace ns-niaem ` OpsRequest milvus-mgvllt-verticalscaling-88h2p created successfully, you can view the progress: kbcli cluster describe-ops milvus-mgvllt-verticalscaling-88h2p -n ns-niaem check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-verticalscaling-88h2p ns-niaem VerticalScaling milvus-mgvllt indexnode Running -/- Sep 11,2025 18:39 UTC+0800 check cluster status `kbcli cluster list milvus-mgvllt --show-labels --namespace ns-niaem ` NAME NAMESPACE CLUSTER-DEFINITION TERMINATION-POLICY STATUS CREATED-TIME LABELS milvus-mgvllt ns-niaem milvus Delete Updating Sep 11,2025 18:25 UTC+0800 app.kubernetes.io/instance=milvus-mgvllt,clusterdefinition.kubeblocks.io/name=milvus cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating cluster_status:Updating check cluster status done cluster_status:Running check pod status `kbcli cluster list-instances milvus-mgvllt --namespace ns-niaem ` NAME NAMESPACE CLUSTER COMPONENT STATUS ROLE ACCESSMODE AZ CPU(REQUEST/LIMIT) MEMORY(REQUEST/LIMIT) STORAGE NODE CREATED-TIME milvus-mgvllt-datanode-0 ns-niaem milvus-mgvllt datanode Running 0 600m / 600m 644245094400m / 644245094400m 
aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 18:38 UTC+0800 milvus-mgvllt-indexnode-0 ns-niaem milvus-mgvllt indexnode Running 0 600m / 600m 644245094400m / 644245094400m aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:39 UTC+0800 milvus-mgvllt-mixcoord-0 ns-niaem milvus-mgvllt mixcoord Running 0 600m / 600m 644245094400m / 644245094400m aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:38 UTC+0800 milvus-mgvllt-proxy-0 ns-niaem milvus-mgvllt proxy Running 0 600m / 600m 644245094400m / 644245094400m aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:38 UTC+0800 milvus-mgvllt-querynode-0 ns-niaem milvus-mgvllt querynode Running 0 600m / 600m 644245094400m / 644245094400m aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 18:38 UTC+0800 check pod status done connect unsupported engine type: milvus milvus-standalone-1.0.1 check ops status `kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem ` NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME milvus-mgvllt-verticalscaling-88h2p ns-niaem VerticalScaling milvus-mgvllt indexnode Succeed 1/1 Sep 11,2025 18:39 UTC+0800 check ops status done ops_status:milvus-mgvllt-verticalscaling-88h2p ns-niaem VerticalScaling milvus-mgvllt indexnode Succeed 1/1 Sep 11,2025 18:39 UTC+0800 `kubectl patch -p '***"metadata":***"finalizers":[]***' --type=merge opsrequests.operations milvus-mgvllt-verticalscaling-88h2p --namespace ns-niaem ` opsrequest.operations.kubeblocks.io/milvus-mgvllt-verticalscaling-88h2p patched `kbcli cluster delete-ops --name milvus-mgvllt-verticalscaling-88h2p --force --auto-approve --namespace ns-niaem ` OpsRequest milvus-mgvllt-verticalscaling-88h2p deleted check component querynode exists `kubectl get components -l app.kubernetes.io/instance=milvus-mgvllt,apps.kubeblocks.io/component-name=querynode --namespace ns-niaem | (grep "querynode" || true )` cluster restart check cluster status before ops check cluster status 
done
cluster_status:Running
`kbcli cluster restart milvus-mgvllt --auto-approve --force=true --components querynode --namespace ns-niaem`
OpsRequest milvus-mgvllt-restart-b4d9k created successfully, you can view the progress:
	kbcli cluster describe-ops milvus-mgvllt-restart-b4d9k -n ns-niaem
check ops status
`kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem`
NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME
milvus-mgvllt-restart-b4d9k ns-niaem Restart milvus-mgvllt querynode Running -/- Sep 11,2025 18:40 UTC+0800
check cluster status
`kbcli cluster list milvus-mgvllt --show-labels --namespace ns-niaem`
NAME NAMESPACE CLUSTER-DEFINITION TERMINATION-POLICY STATUS CREATED-TIME LABELS
milvus-mgvllt ns-niaem milvus Delete Updating Sep 11,2025 18:25 UTC+0800 app.kubernetes.io/instance=milvus-mgvllt,clusterdefinition.kubeblocks.io/name=milvus
cluster_status:Updating (repeated 7 times while polling)
check cluster status done
cluster_status:Running
check pod status
`kbcli cluster list-instances milvus-mgvllt --namespace ns-niaem`
NAME NAMESPACE CLUSTER COMPONENT STATUS ROLE ACCESSMODE AZ CPU(REQUEST/LIMIT) MEMORY(REQUEST/LIMIT) STORAGE NODE CREATED-TIME
milvus-mgvllt-datanode-0 ns-niaem milvus-mgvllt datanode Running 0 600m / 600m 644245094400m / 644245094400m aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 18:38 UTC+0800
milvus-mgvllt-indexnode-0 ns-niaem milvus-mgvllt indexnode Running 0 600m / 600m 644245094400m / 644245094400m aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:39 UTC+0800
milvus-mgvllt-mixcoord-0 ns-niaem milvus-mgvllt mixcoord Running 0 600m / 600m 644245094400m / 644245094400m aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:38 UTC+0800
milvus-mgvllt-proxy-0 ns-niaem milvus-mgvllt proxy Running 0 600m / 600m 644245094400m / 644245094400m aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:38 UTC+0800
milvus-mgvllt-querynode-0 ns-niaem milvus-mgvllt querynode Running 0 600m / 600m 644245094400m / 644245094400m aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 18:40 UTC+0800
check pod status done
connect
unsupported engine type: milvus milvus-standalone-1.0.1
check ops status
`kbcli cluster list-ops milvus-mgvllt --status all --namespace ns-niaem`
NAME NAMESPACE TYPE CLUSTER COMPONENT STATUS PROGRESS CREATED-TIME
milvus-mgvllt-restart-b4d9k ns-niaem Restart milvus-mgvllt querynode Succeed 1/1 Sep 11,2025 18:40 UTC+0800
check ops status done
ops_status:milvus-mgvllt-restart-b4d9k ns-niaem Restart milvus-mgvllt querynode Succeed 1/1 Sep 11,2025 18:40 UTC+0800
`kubectl patch -p '{"metadata":{"finalizers":[]}}' --type=merge opsrequests.operations milvus-mgvllt-restart-b4d9k --namespace ns-niaem`
opsrequest.operations.kubeblocks.io/milvus-mgvllt-restart-b4d9k patched
`kbcli cluster delete-ops --name milvus-mgvllt-restart-b4d9k --force --auto-approve --namespace ns-niaem`
OpsRequest milvus-mgvllt-restart-b4d9k deleted
cluster update terminationPolicy WipeOut
`kbcli cluster update milvus-mgvllt --termination-policy=WipeOut --namespace ns-niaem`
cluster.apps.kubeblocks.io/milvus-mgvllt updated
check cluster status
`kbcli cluster list milvus-mgvllt --show-labels --namespace ns-niaem`
NAME NAMESPACE CLUSTER-DEFINITION TERMINATION-POLICY STATUS CREATED-TIME LABELS
milvus-mgvllt ns-niaem milvus WipeOut Running Sep 11,2025 18:25 UTC+0800 app.kubernetes.io/instance=milvus-mgvllt,clusterdefinition.kubeblocks.io/name=milvus
check cluster status done
cluster_status:Running
check pod status
`kbcli cluster list-instances milvus-mgvllt --namespace ns-niaem`
NAME NAMESPACE CLUSTER COMPONENT STATUS ROLE ACCESSMODE AZ CPU(REQUEST/LIMIT) MEMORY(REQUEST/LIMIT) STORAGE NODE CREATED-TIME
milvus-mgvllt-datanode-0 ns-niaem milvus-mgvllt datanode Running 0 600m / 600m 644245094400m / 644245094400m aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 18:38 UTC+0800
milvus-mgvllt-indexnode-0 ns-niaem milvus-mgvllt indexnode Running 0 600m / 600m 644245094400m / 644245094400m aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:39 UTC+0800
milvus-mgvllt-mixcoord-0 ns-niaem milvus-mgvllt mixcoord Running 0 600m / 600m 644245094400m / 644245094400m aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:38 UTC+0800
milvus-mgvllt-proxy-0 ns-niaem milvus-mgvllt proxy Running 0 600m / 600m 644245094400m / 644245094400m aks-cicdamdpool-12089392-vmss000003/10.224.0.8 Sep 11,2025 18:38 UTC+0800
milvus-mgvllt-querynode-0 ns-niaem milvus-mgvllt querynode Running 0 600m / 600m 644245094400m / 644245094400m aks-cicdamdpool-12089392-vmss000001/10.224.0.7 Sep 11,2025 18:40 UTC+0800
check pod status done
connect
unsupported engine type: milvus milvus-standalone-1.0.1
cluster list-logs
`kbcli cluster list-logs milvus-mgvllt --namespace ns-niaem`
No log files found.
Error from server (NotFound): pods "milvus-mgvllt-proxy-0" not found
cluster logs
`kbcli cluster logs milvus-mgvllt --tail 30 --namespace ns-niaem`
Defaulted container "datanode" out of: datanode, setup (init)
[2025/09/11 10:39:39.821 +00:00] [DEBUG] [sessionutil/session_util.go:249] ["Session connect to etcd success"]
[2025/09/11 10:39:39.821 +00:00] [DEBUG] [sessionutil/session_util.go:566] ["SessionUtil GetSessions"] [prefix=datacoord] [key=datacoord] [address=10.244.2.26:13333]
[2025/09/11 10:39:39.823 +00:00] [INFO] [datanode/service.go:285] ["DataCoord client is ready for DataNode"]
[2025/09/11 10:39:39.823 +00:00] [INFO] [sync/once.go:74] ["DataNode server initializing"] [TimeTickChannelName=milvus-mgvllt-datacoord-timetick-channel]
[2025/09/11 10:39:39.823 +00:00] [DEBUG] [sessionutil/session_util.go:234] ["Session try to connect to etcd"]
[2025/09/11 10:39:39.824 +00:00] [DEBUG] [sessionutil/session_util.go:249] ["Session connect to etcd success"]
[2025/09/11 10:39:39.828 +00:00] [DEBUG] [sessionutil/session_util.go:292] [getServerID] [reuse=true]
[2025/09/11 10:39:39.832 +00:00] [DEBUG] [sessionutil/session_util.go:350] ["Session get serverID success"] [key=id] [ServerId=19]
[2025/09/11 10:39:39.832 +00:00] [INFO] [sessionutil/session_util.go:266] ["start server"] [name=datanode] [address=10.244.3.216:21124] [id=19]
[2025/09/11 10:39:39.832 +00:00] [INFO] [sessionutil/session_util.go:1167] ["save server info into file"] [content="datanode-19\n"] [filePath=/tmp/milvus/server_id_8]
[2025/09/11 10:39:39.832 +00:00] [INFO] [datanode/data_node.go:244] ["DataNode server init rateCollector done"] ["node ID"=19]
[2025/09/11 10:39:39.832 +00:00] [INFO] [datanode/data_node.go:247] ["DataNode server init dispatcher client done"] ["node ID"=19]
[2025/09/11 10:39:39.832 +00:00] [INFO] [dependency/factory.go:84] ["try to init mq"] [standalone=false] [mqType=kafka]
[2025/09/11 10:39:39.833 +00:00] [INFO] [datanode/data_node.go:260] ["DataNode server init succeeded"] [MsgChannelSubName=milvus-mgvllt-dataNode]
[2025/09/11 10:39:39.833 +00:00] [INFO] [datanode/service.go:297] ["current DataNode state"] [state=Initializing]
[2025/09/11 10:39:39.833 +00:00] [INFO] [datanode/service.go:192] ["DataNode gRPC services successfully initialized"]
[2025/09/11 10:39:39.833 +00:00] [INFO] [datanode/data_node.go:316] ["start id allocator done"] [role=datanode]
[2025/09/11 10:39:39.838 +00:00] [INFO] [storage/minio_chunk_manager.go:86] ["minio chunk manager init success."] [bucketname=test] [root=kubeblocks-milvus-mgvllt]
[2025/09/11 10:39:39.838 +00:00] [DEBUG] [sessionutil/session_util.go:413] ["service begin to register to etcd"] [serverName=datanode] [ServerID=19]
[2025/09/11 10:39:39.838 +00:00] [INFO] [datanode/data_node.go:295] ["DataNode Background GC Start"]
[2025/09/11 10:39:39.838 +00:00] [INFO] [datanode/event_manager.go:48] ["Start watch channel"] [prefix=channelwatch/19]
[2025/09/11 10:39:39.848 +00:00] [INFO] [sessionutil/session_util.go:442] ["put session key into etcd"] [key=milvus-mgvllt/meta/session/datanode-19] [value="{\"ServerID\":19,\"ServerName\":\"datanode\",\"Address\":\"10.244.3.216:21124\",\"TriggerKill\":true,\"Version\":\"2.3.2\",\"IndexEngineVersion\":***,\"LeaseID\":765217452936436792,\"HostName\":\"milvus-mgvllt-datanode-0\"}"]
[2025/09/11 10:39:39.848 +00:00] [INFO] [sessionutil/session_util.go:452] ["Service registered successfully"] [ServerName=datanode] [serverID=19]
[2025/09/11 10:39:39.848 +00:00] [INFO] [datanode/data_node.go:184] ["DataNode Register Finished"]
[2025/09/11 10:39:39.849 +00:00] [INFO] [datanode/service.go:197] ["DataNode gRPC services successfully started"]
[2025/09/11 10:39:39.849 +00:00] [DEBUG] [components/data_node.go:57] ["Datanode successfully started"]
[2025/09/11 10:39:39.849 +00:00] [INFO] [logutil/logutil.go:163] ["Log directory"] [configDir=]
[2025/09/11 10:39:39.849 +00:00] [INFO] [logutil/logutil.go:164] ["Set log file to "] [path=]
[2025/09/11 10:39:39.849 +00:00] [INFO] [tracer/tracer.go:71] ["Init tracer finished"] [Exporter=stdout]
[2025/09/11 10:39:39.915 +00:00] [INFO] [sessionutil/session_util.go:859] ["register session success"] [role=datanode] [key=milvus-mgvllt/meta/session/datanode-19]
delete cluster milvus-mgvllt
`kbcli cluster delete milvus-mgvllt --auto-approve --namespace ns-niaem`
Cluster milvus-mgvllt deleted
pod_info:
milvus-mgvllt-datanode-0 1/1 Running 0 3m19s
milvus-mgvllt-indexnode-0 1/1 Running 0 103s
milvus-mgvllt-mixcoord-0 1/1 Running 0 2m49s
milvus-mgvllt-proxy-0 1/1 Running 0 2m48s
milvus-mgvllt-querynode-0 1/1 Running 0 55s
pod_info:
milvus-mgvllt-mixcoord-0 1/1 Terminating 0 3m9s
milvus-mgvllt-proxy-0 1/1 Terminating 0 3m8s
No resources found in ns-niaem namespace.
delete cluster pod done
No resources found in ns-niaem namespace.
check cluster resource non-exist OK: pvc
No resources found in ns-niaem namespace.
delete cluster done
No resources found in ns-niaem namespace.
No resources found in ns-niaem namespace.
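The `kubectl patch --type=merge` calls in this transcript clear an OpsRequest's finalizers so the subsequent force delete cannot hang on a stuck controller. A minimal sketch of that pattern, with the resource name and namespace as placeholders; the JSON payload is validated locally (using `python3`, an assumption about the environment) so the sketch runs without a cluster:

```shell
#!/bin/sh
# Merge patch that empties metadata.finalizers; once finalizers are gone,
# the API server can remove the object immediately on the next delete.
PATCH='{"metadata":{"finalizers":[]}}'

# Validate the payload locally; no cluster is needed for this sketch.
echo "$PATCH" | python3 -c 'import json, sys; json.load(sys.stdin)' && echo "patch JSON ok"

# Against a live cluster, the transcript's equivalent would be (placeholders):
#   kubectl patch opsrequests.operations <ops-name> -n <namespace> \
#     --type=merge -p "$PATCH"
#   kbcli cluster delete-ops --name <ops-name> --force --auto-approve -n <namespace>
```

Clearing finalizers bypasses the controller's cleanup hooks, which is why the suite only does it after the OpsRequest has already reached `Succeed`.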
No resources found in ns-niaem namespace.
Milvus Test Suite All Done!
delete cluster etcdm-mgvllt
`kbcli cluster delete etcdm-mgvllt --auto-approve --namespace ns-niaem`
Cluster etcdm-mgvllt deleted
pod_info:
etcdm-mgvllt-etcd-0 2/2 Running 0 74m
No resources found in ns-niaem namespace.
delete cluster pod done
No resources found in ns-niaem namespace.
check cluster resource non-exist OK: pvc
No resources found in ns-niaem namespace.
delete cluster done
No resources found in ns-niaem namespace.
No resources found in ns-niaem namespace.
No resources found in ns-niaem namespace.
delete cluster kafkam-mgvllt
`kbcli cluster delete kafkam-mgvllt --auto-approve --namespace ns-niaem`
Cluster kafkam-mgvllt deleted
pod_info:
kafkam-mgvllt-kafka-combine-0 2/2 Running 0 73m
No resources found in ns-niaem namespace.
delete cluster pod done
No resources found in ns-niaem namespace.
check cluster resource non-exist OK: pvc
No resources found in ns-niaem namespace.
delete cluster done
No resources found in ns-niaem namespace.
No resources found in ns-niaem namespace.
No resources found in ns-niaem namespace.
delete cluster miniom-mgvllt
`kbcli cluster delete miniom-mgvllt --auto-approve --namespace ns-niaem`
Cluster miniom-mgvllt deleted
pod_info:
miniom-mgvllt-minio-0 2/2 Running 0 72m
miniom-mgvllt-minio-1 2/2 Running 0 66m
pod_info:
miniom-mgvllt-minio-0 2/2 Terminating 0 73m
miniom-mgvllt-minio-1 2/2 Terminating 0 66m
No resources found in ns-niaem namespace.
delete cluster pod done
No resources found in ns-niaem namespace.
check cluster resource non-exist OK: pvc
No resources found in ns-niaem namespace.
delete cluster done
No resources found in ns-niaem namespace.
No resources found in ns-niaem namespace.
No resources found in ns-niaem namespace.
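The repeated `check cluster status` / `cluster_status:Updating` lines throughout this transcript come from a poll-until-Running loop. A sketch of that loop, assuming a local stub in place of the real `kbcli cluster list` call so it runs anywhere; the stub and its behavior are illustrative only:

```shell
#!/bin/sh
# Poll until the cluster reports Running. get_status is a stand-in for
# parsing the STATUS column of `kbcli cluster list <cluster> -n <ns>`;
# here it returns Updating twice, then Running, to imitate the transcript.
i=0
get_status() {
  i=$((i + 1))
  if [ "$i" -ge 3 ]; then STATUS=Running; else STATUS=Updating; fi
}

while :; do
  get_status
  echo "cluster_status:$STATUS"           # same marker the transcript prints
  [ "$STATUS" = Running ] && break
  # sleep 5   # a real poller would wait between checks
done
echo "check cluster status done"
```

Run as written, this prints two `cluster_status:Updating` lines, then `cluster_status:Running`, then the done marker.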
Test Engine: milvus
Test Type: 28
[PASSED]|[Create]|[ComponentDefinition=etcd-3-1.0.1;ServiceVersion=3.6.1;]|[Description=Create a cluster with the specified component definition etcd-3-1.0.1 and service version 3.6.1]
[PASSED]|[Create]|[ComponentDefinition=kafka;ServiceVersion=3.3.2;]|[Description=Create a cluster with the specified component definition kafka and service version 3.3.2]
[SKIPPED]|[Create]|[Topology=combined;ServiceVersion=3.3.2]|[Description=Create a cluster with the specified topology combined and service version 3.3.2]
[PASSED]|[Create]|[ComponentDefinition=minio-1.0.1;]|[Description=Create a cluster with the specified component definition minio-1.0.1]
--------------------------------------Milvus (Topology = standalone Replicas 1) Test Result--------------------------------------
[PASSED]|[Create]|[ComponentDefinition=milvus-standalone-1.0.1;ComponentVersion=milvus;ServiceVersion=v2.3.2;]|[Description=Create a cluster with the specified component definition milvus-standalone-1.0.1, component version milvus, and service version v2.3.2]
[PASSED]|[No-Failover]|[HA=Connection Stress;ComponentName=milvus]|[Description=Simulates pods under connection stress from expected or undesired processes, testing the application's resilience to slowness or unavailability of some replicas under high connection load]
[PASSED]|[VolumeExpansion]|[ComponentName=etcd]|[Description=VolumeExpansion for the specified component etcd]
[PASSED]|[VerticalScaling]|[ComponentName=etcd]|[Description=VerticalScaling for the specified component etcd]
[PASSED]|[VolumeExpansion]|[ComponentName=milvus]|[Description=VolumeExpansion for the specified component milvus]
[PASSED]|[Restart]|[ComponentName=milvus]|[Description=Restart the specified component milvus]
[PASSED]|[VerticalScaling]|[ComponentName=milvus]|[Description=VerticalScaling for the specified component milvus]
[PASSED]|[VolumeExpansion]|[ComponentName=minio]|[Description=VolumeExpansion for the specified component minio]
[PASSED]|[Stop]|[-]|[Description=Stop the cluster]
[PASSED]|[Start]|[-]|[Description=Start the cluster]
[PASSED]|[VerticalScaling]|[ComponentName=minio]|[Description=VerticalScaling for the specified component minio]
[PASSED]|[Restart]|[-]|[Description=Restart the cluster]
[PASSED]|[Update]|[TerminationPolicy=WipeOut]|[Description=Update the cluster TerminationPolicy to WipeOut]
[PASSED]|[Delete]|[-]|[Description=Delete the cluster]
--------------------------------------Milvus (Topology = cluster Replicas 1) Test Result--------------------------------------
[PASSED]|[Create]|[ComponentDefinition=milvus-standalone-1.0.1;ComponentVersion=milvus;ServiceVersion=v2.3.2;]|[Description=Create a cluster with the specified component definition milvus-standalone-1.0.1, component version milvus, and service version v2.3.2]
[PASSED]|[VerticalScaling]|[ComponentName=querynode]|[Description=VerticalScaling for the specified component querynode]
[PASSED]|[Stop]|[-]|[Description=Stop the cluster]
[PASSED]|[Start]|[-]|[Description=Start the cluster]
[PASSED]|[No-Failover]|[HA=Connection Stress;ComponentName=proxy]|[Description=Simulates pods under connection stress from expected or undesired processes, testing the application's resilience to slowness or unavailability of some replicas under high connection load]
[PASSED]|[VerticalScaling]|[ComponentName=datanode,mixcoord,proxy]|[Description=VerticalScaling for the specified components datanode,mixcoord,proxy]
[PASSED]|[Restart]|[-]|[Description=Restart the cluster]
[PASSED]|[VerticalScaling]|[ComponentName=indexnode]|[Description=VerticalScaling for the specified component indexnode]
[PASSED]|[Restart]|[ComponentName=querynode]|[Description=Restart the specified component querynode]
[PASSED]|[Update]|[TerminationPolicy=WipeOut]|[Description=Update the cluster TerminationPolicy to WipeOut]
[PASSED]|[Delete]|[-]|[Description=Delete the cluster]
[END]
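The summary lines above are pipe-delimited, so per-run totals fall out of a one-line filter on the first field. A sketch over a few sample lines copied in the same format (the sample set and variable names are illustrative, not part of the suite):

```shell
#!/bin/sh
# Tally suite results by the leading [PASSED]/[SKIPPED]/[FAILED] tag.
results=$(cat <<'EOF'
[PASSED]|[VerticalScaling]|[ComponentName=indexnode]
[PASSED]|[Restart]|[ComponentName=querynode]
[SKIPPED]|[Create]|[Topology=combined;ServiceVersion=3.3.2]
EOF
)
# grep -c prints the match count; `|| true` keeps a zero count from
# aborting the script if it runs under `set -e`.
passed=$(printf '%s\n' "$results" | grep -c '^\[PASSED\]' || true)
skipped=$(printf '%s\n' "$results" | grep -c '^\[SKIPPED\]' || true)
failed=$(printf '%s\n' "$results" | grep -c '^\[FAILED\]' || true)
echo "passed=$passed skipped=$skipped failed=$failed"
```

For the three sample lines this prints `passed=2 skipped=1 failed=0`.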