@@ -3,6 +3,7 @@
 package dk.cachet.carp.data.application
 
 import dk.cachet.carp.common.application.data.Data
+import dk.cachet.carp.common.application.data.DataType
 import kotlinx.serialization.*
 import kotlinx.serialization.builtins.*
 import kotlinx.serialization.descriptors.*
@@ -90,7 +91,7 @@ class MutableDataStreamBatch : DataStreamBatch
     /**
      * Append all data stream sequences contained in [batch] to this batch.
      *
-     * @throws IllegalArgumentException when the start of any of the sequences contained in [batch]
+     * @throws IllegalArgumentException when the start of the sequences contained in [batch]
      * precede the end of a previously appended sequence to the same data stream.
      */
     fun appendBatch( batch: DataStreamBatch )
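As a sketch of the `appendBatch` contract documented above, the precondition can be modeled by tracking the last sequence id appended per stream and rejecting a sequence that starts at or before it. All names below (`StreamId`, `ToySequence`, `ToyBatch`) are illustrative stand-ins, not CARP's actual classes:

```kotlin
// Toy sketch of the appendBatch precondition: a new sequence for a stream must
// start after the end of the last sequence previously appended to that stream.
data class StreamId( val studyDeploymentId: String, val dataType: String )
data class ToySequence( val stream: StreamId, val firstSequenceId: Long, val lastSequenceId: Long )

class ToyBatch
{
    private val lastSequenceIdPerStream = mutableMapOf<StreamId, Long>()

    fun append( sequence: ToySequence )
    {
        val lastId = lastSequenceIdPerStream[ sequence.stream ]
        // Mirrors the documented IllegalArgumentException for overlapping appends.
        require( lastId == null || sequence.firstSequenceId > lastId ) { "Sequence start precedes the end of a previously appended sequence." }
        lastSequenceIdPerStream[ sequence.stream ] = sequence.lastSequenceId
    }
}
```

Appending `[0, 9]` then `[10, 19]` to the same stream succeeds; appending `[15, 30]` afterwards throws, since 15 does not come after 19.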
@@ -2,10 +2,13 @@ package dk.cachet.carp.data.application


 import dk.cachet.carp.common.application.UUID
+import dk.cachet.carp.common.application.data.DataType
 import dk.cachet.carp.common.application.services.ApiVersion
 import dk.cachet.carp.common.application.services.ApplicationService
 import dk.cachet.carp.common.application.services.IntegrationEvent
-import kotlinx.serialization.*
+import kotlinx.datetime.Instant
+import kotlinx.serialization.Required
+import kotlinx.serialization.Serializable
[Review comment on lines +10 to +11]
I don't think it's configured, but wildcard is used for serialization imports across the codebase.


 /**
@@ -60,6 +63,33 @@ interface DataStreamService : ApplicationService<DataStreamService, DataStreamSe
         toSequenceIdInclusive: Long? = null
     ): DataStreamBatch
 
+    /**
+     * Retrieve data across multiple study deployments, optionally filtered by device role names,
+     * data types, and time range.
+     *
+     * The response is a canonical [DataStreamBatch]: for each [DataStreamId], sequences are
+     * ordered by start time and non-overlapping (contract preserved). No derived/secondary
+     * indexing is applied in this API; analytics-specific projections are out of scope here.
[Review comment on lines +70 to +72]
Drop this; all of these are implementation/design details, not API documentation. The contract (API) of DataStreamBatch is already documented on DataStreamBatch.

+     *
+     * Time range semantics: if [from] or [to] are specified, sequences are clipped to the
+     * half-open interval [from, to) (inclusive start, exclusive end).
[Review comment on lines +74 to +75]
That only works if from and to are specified. But it looks like you can omit this and instead just document the inclusive/exclusive nature in the corresponding from/to parameters. As is, this causes more confusion than it answers edge cases.

Instead, I'm more surprised about how Instant comes into the picture here. The data subsystem only has Longs for sensorStartTime and sensorEndTime. So ... what is happening here? How do I know what to pass?
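To make the boundary behavior under discussion concrete, here is a small dependency-free sketch of half-open [from, to) filtering. It deliberately uses epoch-millisecond `Long`s (in line with the reviewer's point about `sensorStartTime`/`sensorEndTime` being Longs); the helper name is hypothetical, not CARP API:

```kotlin
// Half-open interval check: inclusive lower bound, exclusive upper bound.
// A null bound means "no bound on that side".
fun inRange( timestampMs: Long, fromMs: Long?, toMs: Long? ): Boolean =
    ( fromMs == null || timestampMs >= fromMs ) && // [from ... : inclusive start
    ( toMs == null || timestampMs < toMs )         // ... to)  : exclusive end
```

With `fromMs = 0` and `toMs = 10`, timestamp 0 matches but 10 does not; a sequence spanning the boundary would additionally need to be clipped, which this sketch does not attempt.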

+     *
+     * @param studyDeploymentIds Study deployments to query. Must not be empty.
[Review comment]
> Must not be empty.

Why? It seems like an overly strict contract. You can easily return nothing if you pass nothing, which would cause less additional handling for this edge case by the caller if they don't care about optimization/saving a roundtrip.

+     * @param deviceRoleNames Optional device role name filter (e.g., "phone"). If null or empty, all are included.
+     * @param dataTypes Optional data type filter. If null or empty, all are included.
[Review comment on lines +78 to +79]
> If null or empty, all are included.

That's counterintuitive. If empty, just filter out everything. A good API doesn't give two ways to do the same thing.

On why it matters: suppose a caller sets up a dynamic filter determining the set of device role names they are interested in, which ends up being empty. Now the caller will get all data, instead of no data, as expected.
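The distinction the reviewer argues for (null = unfiltered, empty = nothing matches) can be expressed with a generic helper; the name is hypothetical, not part of the PR:

```kotlin
// null filter: no filtering, every value matches.
// empty filter: nothing matches — no second way to express "all".
fun <T> matchesFilter( value: T, filter: Set<T>? ): Boolean =
    filter == null || value in filter
```

Under these semantics, a dynamically built filter that ends up empty yields no data, matching the caller's intent in the scenario described above.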

+     * @param from Optional absolute start time (inclusive). If null, no lower bound.
+     * @param to Optional absolute end time (exclusive). If null, no upper bound.
+     * @return A [DataStreamBatch] containing matching data sequences, preserving per-stream invariants.
+     */
+    suspend fun getBatchForStudyDeployments(
[Review comment]
I'm not certain about the naming of this. Maybe simply getData? But it will depend a bit on what actually comes out. It still seems like some synchronization is bound to happen (which would need documentation!), given the from and to Instant parameters, in which case getSynchronizedData or similar could be more appropriate.

+        studyDeploymentIds: Set<UUID>,
+        deviceRoleNames: Set<String>? = null,
+        dataTypes: Set<DataType>? = null,
+        from: Instant? = null,
+        to: Instant? = null
+    ): DataStreamBatch
+
 
     /**
      * Stop accepting incoming data for all data streams for each of the [studyDeploymentIds].
      *
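Putting the KDoc's pieces together, the query semantics can be sketched end to end with toy types (`Long` milliseconds instead of `Instant`, plain strings instead of CARP's `UUID`/`DataType`). This follows the KDoc as written — "if null or empty, all are included" — which the review comments dispute; nothing here is CARP's actual implementation:

```kotlin
// Toy model of the documented query: select sequences by deployment, apply
// optional role/type filters, and a half-open [from, to) window on start time.
data class SeqInfo(
    val studyDeploymentId: String,
    val deviceRoleName: String,
    val dataType: String,
    val startMs: Long
)

fun getBatchForStudyDeployments(
    all: List<SeqInfo>,
    studyDeploymentIds: Set<String>,
    deviceRoleNames: Set<String>? = null,
    dataTypes: Set<String>? = null,
    fromMs: Long? = null,
    toMs: Long? = null
): List<SeqInfo> = all
    .filter { it.studyDeploymentId in studyDeploymentIds }
    .filter { deviceRoleNames.isNullOrEmpty() || it.deviceRoleName in deviceRoleNames }
    .filter { dataTypes.isNullOrEmpty() || it.dataType in dataTypes }
    .filter { fromMs == null || it.startMs >= fromMs }
    .filter { toMs == null || it.startMs < toMs }
    // Canonical ordering per stream: grouped by deployment and type, by start time.
    .sortedWith( compareBy<SeqInfo>( { it.studyDeploymentId }, { it.dataType }, { it.startMs } ) )
```

Swapping `isNullOrEmpty()` for a plain null check would yield the reviewer-preferred semantics where an empty filter returns nothing.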
@@ -1,13 +1,15 @@
 package dk.cachet.carp.data.infrastructure
 
 import dk.cachet.carp.common.application.UUID
+import dk.cachet.carp.common.application.data.DataType
 import dk.cachet.carp.common.infrastructure.services.ApplicationServiceDecorator
 import dk.cachet.carp.common.infrastructure.services.ApplicationServiceInvoker
 import dk.cachet.carp.common.infrastructure.services.Command
 import dk.cachet.carp.data.application.DataStreamBatch
 import dk.cachet.carp.data.application.DataStreamId
 import dk.cachet.carp.data.application.DataStreamService
 import dk.cachet.carp.data.application.DataStreamsConfiguration
+import kotlinx.datetime.Instant
 
 
 class DataStreamServiceDecorator(
@@ -32,6 +34,22 @@ class DataStreamServiceDecorator(
         toSequenceIdInclusive: Long?
     ) = invoke( DataStreamServiceRequest.GetDataStream( dataStream, fromSequenceId, toSequenceIdInclusive ) )
 
+    override suspend fun getBatchForStudyDeployments(
+        studyDeploymentIds: Set<UUID>,
+        deviceRoleNames: Set<String>?,
+        dataTypes: Set<DataType>?,
+        from: Instant?,
+        to: Instant?
+    ) = invoke(
+        DataStreamServiceRequest.GetBatchForStudyDeployments(
+            studyDeploymentIds,
+            deviceRoleNames,
+            dataTypes,
+            from,
+            to
+        )
+    )
+
     override suspend fun closeDataStreams( studyDeploymentIds: Set<UUID> ) =
         invoke( DataStreamServiceRequest.CloseDataStreams( studyDeploymentIds ) )
 
@@ -51,5 +69,12 @@ object DataStreamServiceInvoker : ApplicationServiceInvoker<DataStreamService, D
             service.getDataStream( dataStream, fromSequenceId, toSequenceIdInclusive )
         is DataStreamServiceRequest.CloseDataStreams -> service.closeDataStreams( studyDeploymentIds )
         is DataStreamServiceRequest.RemoveDataStreams -> service.removeDataStreams( studyDeploymentIds )
+        is DataStreamServiceRequest.GetBatchForStudyDeployments -> service.getBatchForStudyDeployments(
+            studyDeploymentIds,
+            deviceRoleNames,
+            dataTypes,
+            from,
+            to
+        )
     }
 }
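The decorator/invoker wiring above follows a command pattern: each service call is reified as a request object, and the invoker dispatches it back onto the interface with a `when` over a sealed hierarchy, which the compiler checks for exhaustiveness. A dependency-free sketch of that pattern, with toy stand-ins for the `DataStreamServiceRequest`/`DataStreamServiceInvoker` pair:

```kotlin
// Command pattern in miniature: requests are data classes in a sealed hierarchy,
// so adding a request without a dispatch branch is a compile error.
interface ToyService
{
    fun getData( ids: Set<String> ): List<String>
    fun close( ids: Set<String> )
}

sealed class ToyRequest
{
    data class GetData( val ids: Set<String> ) : ToyRequest()
    data class Close( val ids: Set<String> ) : ToyRequest()
}

fun invokeOnService( service: ToyService, request: ToyRequest ): Any = when ( request )
{
    is ToyRequest.GetData -> service.getData( request.ids )
    is ToyRequest.Close -> service.close( request.ids )
}
```

In the PR, the same shape appears twice: the decorator turns calls into request objects, and the invoker turns request objects back into calls, so requests can be serialized and transported in between.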
@@ -1,6 +1,7 @@
 package dk.cachet.carp.data.infrastructure
 
 import dk.cachet.carp.common.application.UUID
+import dk.cachet.carp.common.application.data.DataType
 import dk.cachet.carp.common.application.services.ApiVersion
 import dk.cachet.carp.common.infrastructure.serialization.ignoreTypeParameters
 import dk.cachet.carp.common.infrastructure.services.ApplicationServiceRequest
@@ -9,6 +10,7 @@ import dk.cachet.carp.data.application.DataStreamBatchSerializer
 import dk.cachet.carp.data.application.DataStreamId
 import dk.cachet.carp.data.application.DataStreamService
 import dk.cachet.carp.data.application.DataStreamsConfiguration
+import kotlinx.datetime.Instant
 import kotlinx.serialization.*
 import kotlin.js.JsExport
 
@@ -54,15 +56,25 @@ sealed class DataStreamServiceRequest<out TReturn> : ApplicationServiceRequest<D
     }
 
     @Serializable
-    data class CloseDataStreams( val studyDeploymentIds: Set<UUID> ) :
-        DataStreamServiceRequest<Unit>()
+    data class GetBatchForStudyDeployments(
+        val studyDeploymentIds: Set<UUID>,
+        val deviceRoleNames: Set<String>? = null,
+        val dataTypes: Set<DataType>? = null,
+        val from: Instant? = null,
+        val to: Instant? = null
+    ) : DataStreamServiceRequest<DataStreamBatch>()
+    {
+        override fun getResponseSerializer() = DataStreamBatchSerializer
+    }
+
+    @Serializable
+    data class CloseDataStreams( val studyDeploymentIds: Set<UUID> ) : DataStreamServiceRequest<Unit>()
     {
         override fun getResponseSerializer() = serializer<Unit>()
     }
 
     @Serializable
-    data class RemoveDataStreams( val studyDeploymentIds: Set<UUID> ) :
-        DataStreamServiceRequest<Set<UUID>>()
+    data class RemoveDataStreams( val studyDeploymentIds: Set<UUID> ) : DataStreamServiceRequest<Set<UUID>>()
     {
         override fun getResponseSerializer() = serializer<Set<UUID>>()
     }