Parsing uploaded file using DTO as type for msgspec #4071
Unanswered · alaws-USGS asked this question in Q&A
Replies: 2 comments · 1 reply
-
I've been doing some more digging. However, I feel there should be a way to use a created DTO and perform the parsing with something like this:

```python
# controller path definition
content = await data.read()
decoded_content = msgspec.json.decode(content, type=InputSiteInfoDTO)
```

Am I getting closer to a solution? Thanks!
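For comparison, here is a rough stdlib analogue of the `msgspec.json.decode(content, type=...)` pattern sketched above, using a hypothetical `SiteInfo` dataclass in place of `InputSiteInfoDTO` (note that `msgspec` also validates field types while decoding, which plain `json.loads` does not):

```python
import json
from dataclasses import dataclass

# Hypothetical stand-in for the fields InputSiteInfoDTO covers
@dataclass
class SiteInfo:
    station_id: int
    name: str

raw = b'{"station_id": 7, "name": "Gauge A"}'
# msgspec.json.decode(raw, type=SiteInfo) parses and validates in one step;
# with the stdlib we parse to a dict, then build the typed object from it
site = SiteInfo(**json.loads(raw))
```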
-
You should be able to use:

```python
@post(path="/input_site_infos", response_model=List[InputSiteInfoDTO], tags=["Input Site Info"])
async def insert_input_site_infos(
    self,
    input_site_info_service: InputSiteInfoService,
    data: Annotated[UploadFile, Body(media_type=RequestEncodingType.MULTI_PART)],
    request: Request,
) -> List[InputSiteInfo]:
    """Insert multiple site info records from an uploaded file into the database."""
    content = await data.read()
    # read_file is a custom function that calls csv.DictReader and returns all values as strings
    file_rows = read_file(content, delimiter="\t")  # TSV delimiter; change to ',' for CSV
    inserted_sites = []
    # Bind the DTO to the current request; this is necessary so the DTO backend
    # has access to the correct decoders defined for this handler
    bound_dto = InputSiteInfoDTO(request)
    # Process each row from the file and insert it into the DB via the service
    for row in file_rows:
        # Create a model instance from the row, coercing the string values to the field types
        station_dto = bound_dto.decode_builtins(row)
        inserted_site = await input_site_info_service.create(station_dto)  # insert into DB
        inserted_sites.append(input_site_info_service.to_schema(inserted_site))  # convert to schema
    # Return the result as a list of inserted sites
    return inserted_sites
```
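The `read_file` helper is not shown anywhere in the thread; based on the comment that it calls `csv.DictReader` and returns all values as strings, one possible sketch (the name, signature, and encoding are assumptions):

```python
import csv
import io

def read_file(content: bytes, delimiter: str = "\t") -> list[dict[str, str]]:
    """Assumed shape of the custom read_file helper: decode the uploaded
    bytes and parse them with csv.DictReader, leaving every value a string."""
    text = content.decode("utf-8")  # assumed encoding
    return list(csv.DictReader(io.StringIO(text), delimiter=delimiter))
```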
-
Hello, I am working on an endpoint that parses a custom-formatted config file and commits its contents to a database; admittedly, I am still learning Litestar. All of the data is read in as strings by a custom parse function, `read_file`, which returns a dictionary of all the data. I then want to commit each row of that parsed data to our database. I am using the SQLAlchemy plugin and SQLAlchemy models for validation.

We currently have a DTO called `InputSiteInfoDTO` that has all the fields we expect from the uploaded file. However, we want the file contents to be parsed from strings to the correct types, something the `pydantic.parse_obj()` method does. Here is my current implementation, which uses a Pydantic model, `InputSiteInfoPydantic`.

Would it be possible to get this coercion using the `decode_builtins` method of DTOs? Am I trying to do too much, and will `litestar` or the `input_site_info_service` handle this (below is my service definition)?

Thanks and I appreciate your eyes on this!