Re: vote on sparklyr 1.2 release branch

Samuel Victor Medeiros de Macedo - IFPE - Campus Recife <samuelmacedo@...>

I believe this release is robust. I vote +1

On Thu, Apr 16, 2020 at 2:25 AM, Javier Luraschi <javier@...> wrote:

Spent some time this week testing and reviewing the bulk of changes for this release, mostly in Amazon EMR, Databricks, and Azure with Livy, which we currently don't have test coverage for in the project itself.

+1 to release this version from me and +1 from Yitao since he is proposing this release.

I believe we would want more than half of the committers to agree to release a version, so at least 4 votes out of our current 6 committers. I would also encourage us to discuss any -1 votes and do our best to convert them to +1. Ideally, we should never release a version with outstanding -1 votes.
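For reference, the simple-majority threshold described above can be sketched as a small helper (hypothetical, not part of sparklyr):

```python
def majority_threshold(num_committers: int) -> int:
    """Smallest number of +1 votes that is more than half of the committers."""
    return num_committers // 2 + 1

# With 6 committers, "more than half" means at least 4 votes.
print(majority_threshold(6))  # 4
```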

Any other committers want to add their +1 or -1 to this release?

Yitao is planning to submit to CRAN in the next day or so.

On Tue, Apr 14, 2020 at 8:12 AM Yitao Li <yitao@...> wrote:
Hello, sparklyr contributors:
    Hope you are as excited about releasing sparklyr 1.2 as I am. Please take a look and vote on the following release candidate:
    Assuming there are no last-minute changes required and no objections, we'll go ahead with releasing sparklyr 1.2 on April 20th, 2020 as planned.

On Wed, Apr 1, 2020 at 5:59 PM Javier Luraschi <javier@...> wrote:
sparklyr committers and public technical mailing list,

It seems like a good time to start preparing the sparklyr 1.2 release. Spark 3.0 support is feature complete, and the two other major features, integration with the parallel package and Databricks Connect, are also feature complete.

Notice that there is a release candidate for Spark 3.0 being proposed, but it might take a couple of RC iterations to ship.
Are there any objections to wrapping up this release and releasing in 1-3 weeks?

If not, Yitao Li has volunteered to start preparing this new release and will soon send a release-to-CRAN vote email to the committers and to this sparklyr-technical-discuss@... distribution list.
