Desired features/changes for Spark 3.0 #1105
Comments
Please consider reopening the issues labeled
@fwgreen I'll go through them and check if there are any that shouldn't have been closed.
- Native support for uploaded files instead of getting the raw request (the current raw-request approach is sketched below).
- Multiple static file locations.
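For context on the uploads point, this is roughly what handling an upload looks like today through the raw servlet request. A minimal sketch, assuming the embedded Jetty server and a client form field named "file"; a native Spark API would hide the MultipartConfigElement/Part plumbing.

```java
import javax.servlet.MultipartConfigElement;
import javax.servlet.http.Part;

import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

import static spark.Spark.post;

public class UploadExample {
    public static void main(String[] args) {
        post("/upload", (request, response) -> {
            // Tell the embedded Jetty where to buffer multipart data.
            request.attribute("org.eclipse.jetty.multipartConfig",
                    new MultipartConfigElement("/tmp"));

            // "file" is the form field name used by the client (an assumption here).
            Part part = request.raw().getPart("file");
            try (InputStream in = part.getInputStream()) {
                Files.copy(in, Paths.get("/tmp", part.getSubmittedFileName()),
                        StandardCopyOption.REPLACE_EXISTING);
            }
            return "Uploaded";
        });
    }
}
```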
Would it be possible to add internal metrics (usage, performance, custom) to answer my personal need for control? :) More seriously, in a world of containers, metrics are mandatory for monitoring services. Maybe a look at microprofile-metrics and their annotations could inspire developers? A /metrics output in Prometheus format (or some other standard) would be a must ;)
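To make the idea concrete, here is a minimal sketch of a /metrics endpoint in Prometheus text format using the Prometheus simpleclient libraries. The dependency choice, counter name, and label are assumptions for the example, not anything Spark ships.

```java
import java.io.StringWriter;

import io.prometheus.client.CollectorRegistry;
import io.prometheus.client.Counter;
import io.prometheus.client.exporter.common.TextFormat;
import io.prometheus.client.hotspot.DefaultExports;

import static spark.Spark.afterAfter;
import static spark.Spark.get;

public class MetricsExample {
    // Hypothetical counter tracking handled requests, labelled by path.
    private static final Counter REQUESTS = Counter.build()
            .name("spark_requests_total")
            .help("Total HTTP requests handled.")
            .labelNames("path")
            .register();

    public static void main(String[] args) {
        DefaultExports.initialize(); // JVM metrics (memory, GC, threads)

        // Count every handled request, including static files and errors.
        afterAfter((req, res) -> REQUESTS.labels(req.pathInfo()).inc());

        get("/metrics", (req, res) -> {
            res.type(TextFormat.CONTENT_TYPE_004);
            StringWriter body = new StringWriter();
            TextFormat.write004(body,
                    CollectorRegistry.defaultRegistry.metricFamilySamples());
            return body.toString();
        });
    }
}
```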
@RyanSusana I was wondering the same for static files (#568). Could you explain more about your use case?
@johnnybigoode One for the JS/CSS and one for /uploads or something. This would allow me to split my application up better. For my specific use case, the way I solve it right now is that I traverse the classpath/jar and add a route for every file I have.
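For illustration, this is the kind of workaround a single static-file location currently pushes you toward: one classpath location via staticFiles plus a hand-written wildcard route for a second directory on disk. A rough sketch; the /public, /uploads, and /var/app/uploads paths are made up for the example.

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

import static spark.Spark.get;
import static spark.Spark.halt;
import static spark.Spark.staticFiles;

public class TwoStaticLocations {
    public static void main(String[] args) {
        // First location: JS/CSS bundled on the classpath under /public.
        staticFiles.location("/public");

        // Second "location": files on disk under /var/app/uploads, served manually.
        Path uploadDir = Paths.get("/var/app/uploads");
        get("/uploads/*", (req, res) -> {
            Path file = uploadDir.resolve(req.splat()[0]).normalize();
            // Reject path traversal and anything that is not a plain file.
            if (!file.startsWith(uploadDir) || !Files.isRegularFile(file)) {
                halt(404);
            }
            String contentType = Files.probeContentType(file);
            res.type(contentType != null ? contentType : "application/octet-stream");
            return Files.readAllBytes(file);
        });
    }
}
```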
I have two ideas, and if there is interest, I could try to provide pull requests.
Once somewhere
In your routes
@RyanSusana @mcgivrer @laliluna
CSRF tokens would be a nice simple feature. I use them for single-page web apps, storing them in sessions. Normally, in other languages, there are standalone libraries or packages that provide this functionality for use with any framework. In the Java world, CSRF tokens are either already integrated into other frameworks (Spring Security, for example) or are part of old packages that are no longer maintained, or that have more complex XML configurations that, frankly, I don't understand how to set up. Do you think this is something that could be added? Or do you happen to know of a standalone library I could pick up that has little to no configuration? I tried searching Maven Central but had no luck.
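In the meantime, a session-backed version is only a few lines on top of Spark as it is today. A minimal sketch; the csrfToken session attribute and X-CSRF-Token header names are just conventions chosen for this example.

```java
import java.security.SecureRandom;
import java.util.Base64;

import static spark.Spark.before;
import static spark.Spark.halt;

public class CsrfFilter {
    private static final SecureRandom RANDOM = new SecureRandom();

    public static void main(String[] args) {
        before((req, res) -> {
            // Issue a token once per session; the page/JS echoes it back in a header.
            String token = req.session().attribute("csrfToken");
            if (token == null) {
                byte[] bytes = new byte[32];
                RANDOM.nextBytes(bytes);
                token = Base64.getUrlEncoder().withoutPadding().encodeToString(bytes);
                req.session().attribute("csrfToken", token);
            }

            // Treat everything except GET as state-changing (HEAD/OPTIONS could
            // be exempted too) and reject requests that do not present the token.
            boolean mutating = !req.requestMethod().equals("GET");
            if (mutating && !token.equals(req.headers("X-CSRF-Token"))) {
                halt(403, "Invalid CSRF token");
            }
        });
    }
}
```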
Request: Method to respond with a
One thing that might be useful is the option to use JAX-RS style annotations on routes. This way, instead of reaching into the request object and grabbing seemingly random fields, you can define the expected inputs via annotations. If there's any interest in this, we've already developed something we use internally. I could spin it out into a PR easily!
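Sketching the idea only: the Get and QueryParam annotations below are invented for this example and are not a Spark or JAX-RS API, and a real implementation would presumably use annotation processing rather than runtime reflection.

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;

import static spark.Spark.get;

public class AnnotatedRoutes {

    // Hypothetical annotations, just to illustrate the shape of the API.
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.METHOD)
    @interface Get { String value(); }

    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.PARAMETER)
    @interface QueryParam { String value(); }

    // A controller whose inputs are declared instead of pulled from the request.
    public static class HelloController {
        @Get("/hello")
        public String hello(@QueryParam("name") String name) {
            return "Hello, " + name;
        }
    }

    // Naive reflective registration: one Spark route per annotated method.
    public static void register(Object controller) {
        for (Method m : controller.getClass().getMethods()) {
            Get mapping = m.getAnnotation(Get.class);
            if (mapping == null) continue;
            get(mapping.value(), (req, res) -> {
                Object[] args = new Object[m.getParameterCount()];
                for (int i = 0; i < args.length; i++) {
                    QueryParam qp = m.getParameters()[i].getAnnotation(QueryParam.class);
                    args[i] = qp != null ? req.queryParams(qp.value()) : null;
                }
                return m.invoke(controller, args);
            });
        }
    }

    public static void main(String[] args) {
        register(new HelloController());
    }
}
```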
A big thing that would be nice to have is OpenAPI/Swagger support, or a plugin/Maven package to add it. Most frameworks out there have this to auto-generate OpenAPI specs and integrate Swagger UI; it makes testing and auto-generating client interfaces from the spec for your APIs really awesome!
Note that I have done an APT-based code generation project for Javalin and would look to do the same for Spark. The Javalin one is documented at https://dinject.io/docs/javalin/ ... I just need to adapt the code generation for Spark request/response.
As part of the APT code generation for controllers, it also generates OpenAPI/Swagger docs. The nice thing here is that APT has access to javadoc/kotlindoc, so we just javadoc our controller methods and that goes into the generated Swagger. This approach is closer to the JAX-RS style with dependency injection and controllers. Note that the DI also uses APT code generation, so it is fast and light (but people could swap it out for slower, heavier DI like Guice or Spring if they wanted to).
A way to bind a route to a specific host/subdomain, something like:
```java
Spark.get("/", "test.example.com", (request, response) -> {
    return "Hello!";
});
```
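Until something like that exists, the closest workaround seems to be branching on the Host header inside a normal handler. A small sketch; the hostnames are just examples.

```java
import static spark.Spark.get;

public class HostRoutingWorkaround {
    public static void main(String[] args) {
        get("/", (request, response) -> {
            // request.host() returns the Host header (may include ":port").
            String host = request.host();
            if (host != null && host.startsWith("test.example.com")) {
                return "Hello from the test subdomain!";
            }
            return "Hello!";
        });
    }
}
```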
Thanks everyone for your suggestions. It's been a long summer vacation with a resulting dip in project activity. Ramping up will begin within a month!
I am just now working on my first project with Spark and I like its minimalism. As time goes on I will probably find more things, but these are some features I found missing early in development:
These are not deal-breakers, so I'm continuing development and it's really good so far.
Please add an option to disable gzip in staticFiles responses.
Allowing other embeddable servers would be great!
A little late to the game, but here are some improvements I'd like to suggest. I ran into these hurdles when I used sparkjava to implement a basic REST service that only had a few endpoints. The overall experience was great and I loved the simplicity of sparkjava.
That's about it. I appreciate all the hard work, and if these suggestions sound interesting I think I'd be able to submit some patches given some direction.
An option to disable the automatic gzip compression that is triggered by the presence of an Accept-Encoding: gzip request header.
I ran into that same issue TODAY. How did you solve it?
@RyanSusana I posted my (grotesque) workaround on Stack Overflow.
A plugin system like in Javalin would make Spark extensible. Then plugins could be created for common tasks like
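Purely to illustrate the shape such an extension point might take; the SparkPlugin interface below is invented for this sketch and is not an existing Spark or Javalin API.

```java
import static spark.Spark.afterAfter;
import static spark.Spark.get;

public class PluginIdea {

    // Hypothetical extension point: a plugin gets one chance at startup to
    // register routes, filters, exception handlers, and so on.
    interface SparkPlugin {
        void apply();
    }

    // Example plugin adding a health-check endpoint and a response header.
    static class HealthCheckPlugin implements SparkPlugin {
        @Override
        public void apply() {
            get("/health", (req, res) -> "OK");
            afterAfter((req, res) -> res.header("X-Powered-By", "Spark"));
        }
    }

    public static void main(String[] args) {
        // In a real plugin system the framework would discover and apply these.
        new HealthCheckPlugin().apply();
    }
}
```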
GraphQL would be very nice.
A response type transformer, as I've written in detail in #1181.
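For reference, Spark already supports a ResponseTransformer wired per route; a minimal sketch of that current form follows. Gson is just an assumed JSON library here, and the User class is made up for the example.

```java
import com.google.gson.Gson;

import spark.ResponseTransformer;

import static spark.Spark.get;

public class TransformerExample {

    static class User {
        final String id;
        final String name;
        User(String id, String name) { this.id = id; this.name = name; }
    }

    public static void main(String[] args) {
        // Gson::toJson matches ResponseTransformer's single render(Object) method.
        ResponseTransformer toJson = new Gson()::toJson;

        get("/users/:id", (req, res) -> {
            res.type("application/json");
            return new User(req.params(":id"), "Alice");
        }, toJson);
    }
}
```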
Will 3.0 be released?
What about HTTP/2 support, as in PR #1183? Also, is there a release plan for 3.0?
Already implemented in the Unofficial Build, along with other features. As far as I know, @perwendel is planning to come back and keep going with this project, but meanwhile I'm merging and fixing what I can in that repository.
Hi,
A 2.9.0 release will be done shortly and after that my work will be fully focused on 3.0.
Any input on what would be fitting for Spark 3.0 is much appreciated. Please post in this thread.
Thanks!